As mentioned in my previous post, I have been using Songbird/Nightingale for quite some time, in spite of the drawback mentioned there.
No matter whether using Songbird or Nightingale, one of my main problems remained the same: the playlists are trapped somewhere inside the library, with no way to export them as playlist files. Absolutely no way? That’s not the whole truth, however, as there are (or were) addons like Playlist Export Tool, Export My Playlists or FolderSync. Thanks to the developers, by the way – those addons were really useful to me!
Unfortunately, with every new Songbird release, all addons stopped working. In other words: whenever I made the mistake of updating, I wasn’t able to export playlists anymore. I actually don’t even know if there are any addons left that are compatible with the most recent version of Songbird.
The playlist solution
One more good thing about Songbird (and Nightingale as well) is that it uses an SQLite database. This allows accessing the Songbird database from a variety of programming languages without getting your hands dirty, and makes way for a “third-party” tool that is capable of exporting playlists from the Songbird database and doesn’t depend on the Songbird version. I developed such an exporter in Java and have been using it for some time to make my Songbird playlists available on my NAS.
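To illustrate the idea (the actual tool is written in Java), here is a minimal Python sketch. Note that the table and column names (media_items, simple_media_lists, member_media_item_id, ordinal) are assumptions from my memory of the Songbird schema – check them against your own library database (e.g. with sqlite3’s .schema command) before relying on this.

```python
import sqlite3

def export_playlist(db_path, playlist_id, m3u_path):
    """Dump the member file URLs of one playlist into an .m3u file.

    Assumes a Songbird-style schema: media_items holds the tracks
    (media_item_id, content_url), simple_media_lists maps a playlist
    (media_item_id) to its members (member_media_item_id, ordinal).
    """
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT mi.content_url "
        "FROM simple_media_lists sml "
        "JOIN media_items mi ON mi.media_item_id = sml.member_media_item_id "
        "WHERE sml.media_item_id = ? "
        "ORDER BY sml.ordinal",
        (playlist_id,),
    ).fetchall()
    con.close()
    with open(m3u_path, "w") as f:
        f.write("#EXTM3U\n")
        for (url,) in rows:
            f.write(url + "\n")
    return len(rows)
```

Since SQLite needs no server and the file format is stable, such a tool keeps working no matter which Songbird/Nightingale version wrote the database.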
As I thought this exporter might be useful to others, I refactored the quick-and-dirty source code and published it on GitHub. So now, I’m proud to present songbirdDbTools, a Java-based playlist exporter for Songbird/Nightingale, just released in its very first version. Hopefully, it will be of use to somebody else who was missing this functionality as much as I did 🙂
The name is a bit of an exaggeration at this point, as the tool only provides the export functionality so far. However, I put some effort into designing songbirdDbTools to be as extensible as possible, and I have a couple of features in mind that would be useful.
For example, synchronizing playlists: exporting not only the playlist but copying the member files as well. This might come in handy for effectively synchronizing files to mobile devices.
Or finding zombies and ghosts (like the Exorcist used to do, three years ago). Another neat feature might be finding all the playlists a given file belongs to.
I’ve been using Songbird ever since it was a promising, upcoming, cross-platform open source media player. Back then, I even had it running on a parallel installation of Windows and Fedora on (physically) the same library 🙂
Since then, they seem to have cut support for Linux 😦, and POTI Inc. (the company behind Songbird) seems to be focusing on mobile/web, losing more and more interest in the good old desktop version. At least, that’s what springs to mind when searching the Songbird web page for the desktop version.
In addition, there’s this everlasting performance problem, which seems to be inevitable as soon as your library reaches the magic 10k song limit.
Still, I like Songbird’s functionality, its open-source nature and the addon system. Therefore, I never got comfortable with iTunes, Amarok or the like.
Only just recently, I came across a Songbird fork that looks pretty promising: Nightingale. It supports Linux and there still seems to be some development going on.
Trying Nightingale with your existing Songbird database, or even migrating to Nightingale, is fairly easy, as both the database and the addons seem to be compatible.
That’s what worked for me (on Windows):
Back up songbird folders (just in case):
Create symlinks from Songbird to Nightingale folders:
I have been using Microsoft Robocopy for years, as it is an easy (by now even a built-in) way to do incremental backups under Windows.
Until recently, I used to back up my data from an internal NTFS hard drive to an external NTFS hard drive. As of late, I’m the proud owner of a DS213+ NAS, which runs a Linux-based OS and Ext4 hard drives.
As I’m still looking for the perfect backup/versioning client (rsync on Windows?!), I thought I’d stick to Robocopy in the meantime. Unfortunately, my backup scripts, which had done a splendid job of incrementally backing up my data to an external hard drive for years, now do a full backup to the NAS every time.
As it turns out, there is not only one reason for this behavior, but two:
Here are the solutions to these issues (at least the ones that worked for me), as well as some additional hints on using Robocopy.
1. File times
At first, Robocopy kept telling me NEWER or OLDER for each file (even though the file hadn’t changed), resulting in copying the file instead of skipping it.
First, make sure that both the NAS and the client PC have the same system time (use an NTP server, for example).
If the problem still persists, a good solution is to make Robocopy use FAT file times (/FFT).
This results in a two-second granularity; that is, a file is only declared NEWER or OLDER when there is a difference of more than two seconds between the source and the destination file. If this option is not set, timestamps are compared at NTFS’s full precision (100-nanosecond intervals). Obviously, Samba’s time granularity is not that precise, and therefore the time stamps hardly ever match.
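The effect of /FFT boils down to a simple comparison rule, sketched here in Python:

```python
def times_match(src_mtime, dst_mtime, fft=False):
    """Mimic Robocopy's timestamp comparison: with /FFT (FAT file times),
    a difference of up to two seconds still counts as 'same time'."""
    if fft:
        return abs(src_mtime - dst_mtime) <= 2
    return src_mtime == dst_mtime

# NTFS keeps sub-second precision, the Samba share does not:
src, dst = 1357914296.1234567, 1357914296.0
print(times_match(src, dst))            # False -> file gets copied again
print(times_match(src, dst, fft=True))  # True  -> file is skipped
```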
2. File size
If your incremental backup works by now, skip the rest of the article.
As for me, after solving the above problem, the incremental backups still didn’t work.
Robocopy kept telling me CHANGED for most files. Out of the frying pan into the fire!
What does CHANGED mean? The answer can be found here:
The source and destination files have identical time stamps but
different file sizes. The file is copied; to skip this file, use /XC.
Skipping all files with different sizes? No, that’s a dangerous idea when backing up data. So what now?
But why do they have different sizes at all? Here’s some file on the client PC:
And that’s the same file after transferring to the NAS:
The attentive observer might have recognized that the size on disk is different.
The reason for this can be found in the different block sizes used by the NAS and the client. I was wondering at first, because I had set up both NTFS and Ext4 with a block size of 4K.
However, the Samba server has a default block size of 1k! So setting up Samba with an explicit block size that matches the one of your client PC solves this issue.
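What “size on disk” means can be sketched in a few lines of Python: a file always occupies whole allocation blocks, so the same logical file size rounds up differently for different block sizes.

```python
def size_on_disk(file_size, block_size):
    """A file occupies whole blocks, so its size on disk is the file
    size rounded up to the next multiple of the block size."""
    blocks = -(-file_size // block_size)  # ceiling division
    return blocks * block_size

# The same 5000-byte file, reported with the client's 4K blocks
# and with Samba's default 1k block size:
print(size_on_disk(5000, 4096))  # 8192
print(size_on_disk(5000, 1024))  # 5120
```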
SSH to your (Synology) NAS and open the Samba configuration file (smb.conf) in vi.
Enter this line below the [global] tag (use the block size of your host file system, e.g. 4K = 4×1024 = 4096):
block size = 4096
Press ESC, then enter :wq and press ENTER.
Finally, restart the Samba server.
That solved my problems, and I can do incremental backups again – at least until I finally have set up the perfect rsync-for-Windows solution 🙂
Alternative solution for 1. and 2.
There is, however, an alternative to the solutions for 1. and 2.:
Use the archive bit. Each file has an archive bit, which is set every time the file is changed. Robocopy can take advantage of this behavior: the /m switch makes Robocopy copy only files whose archive bit is set, resetting the bit on each source file it copies, and skip all files whose archive bit is not set. That is, it copies only files that have changed since the last backup. No need to care about nasty time stamps or stupid file sizes.
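The /m logic boils down to the following (a Python sketch that models the archive bit as a plain flag):

```python
def incremental_backup(files):
    """Mimic Robocopy's /m switch: copy only files whose archive bit is
    set, and clear the bit on each file that was copied.
    `files` maps file name -> archive bit."""
    copied = [name for name, archive in files.items() if archive]
    for name in copied:
        files[name] = False  # bit cleared; skipped on the next run
    return copied

files = {"a.txt": True, "b.txt": False, "c.txt": True}
print(incremental_backup(files))  # ['a.txt', 'c.txt']
print(incremental_backup(files))  # [] - nothing changed since last run
```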
There is one drawback, however: when you want to make a full backup, or when you back up your data to several devices, you must not use the /m switch, or your backups will be incomplete.
Every now and then, even a Windows user might feel the urge to run some batch processes.
In eclipse, batch files can be run by right-clicking the file | Open With | System Editor. This runs the batch file in a new console window.
However, the batch file starts executing in the eclipse installation directory. That is, paths that are relative to the folder of the batch file are now invalid.
The (seemingly) simplest solution might be to just use absolute file paths. However, this might turn out to be a bad idea, for example when sharing the project, as others might store their workspace at a different location within their file systems.
So, basically, there are three ways to avoid this problem.
Add the following line to the beginning of your batch file:
@cd /D %~dp0
This will change the working directory to the path of the batch file. So every subsequent command in the batch file is relative to the path of the batch file itself.
Note: the /D switch makes sure the working directory is changed even if the batch file resides on a different drive.
You can verify this behavior by executing the current batch file in eclipse:
:: Right now we're in the eclipse working directory
@echo %CD%
@cd /D %~dp0
:: We changed the directory to the script location
@echo %CD%
:: We now can use workspace relative paths!
This will print the path to the eclipse installation first, and then the path to the batch file.
Using eclipse’s external tools
This approach offers the advantage that the batch file’s output is shown on the eclipse console view. However, a launch configuration is needed.
The following steps are necessary:
In the menu, click Run | External Tools, or use the drop-down menu next to the External Tools toolbar button.
Click External Tools Configurations….
Select Program, then click New launch configuration.
Choose Location and Working Directory within your workspace.
(If you want to persist the launch configuration (e.g. for sharing it among your team), go to the Common tab, select the Shared file radio button and enter a path such as /<project>/launch, or whatever suits you.)
Finally, hit Run and check eclipse’s console view.
Using Ant’s exec task
First of all we need an Ant build file:
Right click on the project | New | File.
In the upcoming dialog enter a file name ending in “.xml” and press Finish. Note that if you name the file “build.xml”, you will have less trouble running the file later.
Within the editor, paste an Ant script such as this:
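Such a build file might look like this minimal sketch (the target names and myScript.bat are, of course, placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="runBatch" default="runBatchDirectly" basedir=".">

    <!-- Variant 1: run the batch file directly -->
    <target name="runBatchDirectly">
        <exec executable="myScript.bat">
            <arg value="someParameter"/>
        </exec>
    </target>

    <!-- Variant 2: run cmd.exe and pass the batch file via /c -->
    <target name="runBatchViaCmd">
        <exec executable="cmd">
            <arg value="/c"/>
            <arg value="myScript.bat"/>
            <arg value="someParameter"/>
        </exec>
    </target>

</project>
```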
As you can see, there are two ways of running a batch file from Ant: using the exec task directly, or using the exec task to run cmd.exe and passing the batch file to this process via the /c switch (as described in the exec task’s manual). Which one you use is up to you. Either way, you can pass parameters to the batch file using the <arg> tag.
Now that we have a build file, what will we do with it? If you called it “build.xml” earlier, just right-click the file | Run As (or hit Ctrl + F11 from the editor). Then click Ant Build and that’s it!
(If you ignored my warning and chose a different file name, you’ll need a run configuration to execute the Ant build. This is pretty much the same as described above for using eclipse’s external tools to execute the batch file directly, except that you have to click on Ant Build instead of Program.)
Using Wicked Shell Plugin
If you’re using the command line a lot, the Wicked Shell plugin might be interesting to you. It allows for using the command line from within eclipse and also provides a comfortable way of running batch files. See this blog post for more info.
If you’re running eclipse on an OS other than Windows, Wicked Shell works as well. It always runs the system’s default shell, e.g. bash on Linux and OS X.
So, if you want to run batch files from eclipse, just choose one of the approaches described above. Personally, I prefer the first approach, because it comes with the least overhead. No additional launch configuration needed, just double click the batch file and that’s it.
For this post I used the eclipse icons published here. As mentioned there, they are part of the Eclipse Project and licensed under the EPL.
Every time I use (or have to use) the command line in Windows, it takes a while until my eyes adjust to the darkness. There’s one thing, however, I’ll never get accustomed to: working without the Unix/Linux/GNU (whatever you want to call it) command-line tools. Fortunately, I don’t have to: there are plenty of solutions to this problem out there. In this article, I’m going to elaborate on these three:
The “classic” solutions – Cygwin and virtual machine
The lightweight alternative – UnxUtils
A surprising alternative – Git
When referring to the Unix/Linux/GNU command-line tools, I’ll stick to the term Unix tools, as the heading above suggests.
The “classic” solutions
The most popular ways of “getting that Linux feeling on Windows” are most likely Cygwin (a Linux-like console that even provides an X server) or a virtual machine like VirtualBox or VMware running a (small) Linux distribution such as Damn Small Linux.
Of course, not using Windows at all would be a decent solution as well 😉
The lightweight alternative
There are scenarios where you might not want to, or even can’t, install Cygwin or a virtual machine. Maybe you’re just looking for a quick way to access the Unix tools from the Windows console and have no use for an X server. For this purpose, I use a collection of tools called UnxUtils. Actually, these tools have been around for a long time. The latest version is five and a half years old! Still, it’s downloaded several hundred times a day – impressive!
Now, the nice thing about these tools is that they are ports to Windows. That is, they are native Windows applications that run directly from the Windows command prompt (cmd.exe).
Even better, you don’t need to install anything. Just download, extract to some location on your hard drive (in fact, I even carry the utilities around on my flash drive) and you’re almost there. In order not to type the whole path to the UnxUtils binaries every time you intend to use one of them, this path should be added to the beginning of the PATH environment variable. You can either add it permanently (I did this on my Windows computer) or add it temporarily for a specific instance of cmd.exe. To make the UnxUtils portable, I put a small batch script like this in the UnxUtils folder on my flash drive:
@set PATH=%~dp0usr\local\wbin;%PATH%
@cmd
This script opens a console window where you can execute statements like this:
Finally, ending the awful task of analyzing logs with Windows “on-board equipment” 🙂
Note that after adding the path to UnxUtils’ binaries to the beginning of PATH, it’s no longer possible to use Windows tools that have the same name as one of the UnxUtils, such as find and sort. So, if you prefer the Windows-style search tools, you’d better check the contents of usr\local\wbin in the UnxUtils path first and delete the tools you don’t need.
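This shadowing is simply the PATH lookup order at work: the first directory containing a matching executable wins. A small Python sketch (the throwaway directory and the fake find binary are just for illustration):

```python
import os
import shutil
import stat
import tempfile

# Create a throwaway directory containing a fake "find" executable ...
unx_dir = tempfile.mkdtemp()
fake_find = os.path.join(unx_dir, "find")
with open(fake_find, "w") as f:
    f.write("#!/bin/sh\necho fake find\n")
os.chmod(fake_find, os.stat(fake_find).st_mode | stat.S_IEXEC)

# ... and prepend it to PATH, just like prepending the UnxUtils folder
# to PATH as described above.
path = unx_dir + os.pathsep + os.environ.get("PATH", "")

# The first match on PATH wins, so "find" now resolves to our fake one:
print(shutil.which("find", path=path))
```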
Unfortunately, I ran into a disadvantage of UnxUtils. A rather memory-intensive operation like this:
find d:\ | xargs grep "someExpression"
yields an error “xargs cannot fork”. The description for this error at GNU says that the system has run out of process slots, “because the system is very busy and the system has reached its maximum process limit, or because you have a resource limit in place and you’ve reached it”. Not enough memory for cmd.exe? Any ideas?
Someone even filed a bug for that problem, but it obviously has never been fixed (as said, the latest version is more than five years old). So, no solution here, unfortunately 😦
A surprising alternative
More recently, I stumbled upon an alternative to UnxUtils – Git for Windows. Isn’t that a source code management system? It is, but on Windows it ships with a console application. When installing Git, you can choose to either use Git’s own console application (Git Bash) or to add the Git console’s binaries to Windows’ PATH variable, just as described in the solution above.
Note that neither Git nor UnxUtils contain a native version of vi(m). One way to use it would be the Git Bash. The Git Bash feels very much like Cygwin – so not as lightweight. You can read more about Git Bash and its differences to Cygwin here: A Windows console that does not suck.
To use Git’s Windows ports of the Unix command-line tools from your flash drive – as described for UnxUtils above, without permanently changing the PATH variable – just copy Git’s bin directory and a batch file like the following to the drive:
Executing this file will start a console window with the proper PATH set, so you can start finding and grepping right away.
Fortunately, the resource error described above doesn’t occur when using the tools provided by Git.
Case closed! 😀