Android apps – nusic: Find New Music Albums

Have you ever stumbled upon a new album by one of your favorite artists and realized they had released several albums since you last checked?

nusic – your new music

Problems like this can now be solved conveniently with nusic for Android.

All you need is an Android device that contains music by all the artists you like.

After installing, nusic regularly checks if there are upcoming releases for the artists on your device and informs you about any news.

Just install the app and start it once; it will keep you up to date about new releases via Android notifications. Never again miss a release by your favorite artists!
 
You can install nusic from Google Play or get the APK from GitHub.

If you encounter any errors, please report them here instead of just giving the app a poor rating.

By the way, nusic is open source. Please contribute by forking nusic on GitHub.

 

How does it work?

nusic regularly checks MusicBrainz – the open music encyclopedia – for new releases of the artists on your device.

That’s all there is. You don’t need an account and the app is not pulling any other data from your phone!
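For the curious, such a check can be sketched against the MusicBrainz web service (version 2). The endpoint below is the public release-group search; the artist handling is a simplified illustration, not nusic’s actual code, and a real client must also send a proper User-Agent and respect MusicBrainz’s rate limits:

```python
from urllib.parse import urlencode

MB_ROOT = "https://musicbrainz.org/ws/2"

def release_group_query_url(artist: str, limit: int = 25) -> str:
    """Build a MusicBrainz search URL for an artist's release groups."""
    params = urlencode({
        "query": f'artist:"{artist}"',  # Lucene-style search query
        "fmt": "json",                  # ask for a JSON response
        "limit": limit,
    })
    return f"{MB_ROOT}/release-group/?{params}"

# A nusic-like client would fetch this URL periodically and compare
# the returned release dates against what it has already seen.
print(release_group_query_url("Daft Punk"))
```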


Automatically downloading/backing up/dumping/exporting databases from remote hosts via the web

The problem

You operate a database-backed website (e.g. Drupal) where you have no access to cron jobs, CGI, Perl or outgoing connections. So one idea for backing up your database on a regular basis (which is always a good idea) is to download SQL dumps via a web-based administration tool (such as the Backup and Migrate plugin for Drupal). Unfortunately, these kinds of downloads cannot simply be automated on the shell using curl or wget, because they require a bit of JavaScript, for example to outsmart the PHP timeout.

The solution

Use a headless browser (that is, a browser without graphical user interface) to automate the task. It fetches the desired page, logs in, (virtually) clicks the download button and downloads the dump file.

It should be a command line tool, in order to run it as a cron job from some server (e.g. a NAS).

Personally, I liked the idea of PhantomJS, but it was not available for my Synology DS213+ (PowerPC) and I didn’t like the idea of building it from source.

So my plan B was to write a small Java program (remoteDbDumper) that uses the HtmlUnit framework (a headless browser).

How to use

  1. Install the Drupal plugin Backup and Migrate.
  2. Download and extract remoteDbDumper.
  3. Start it from the shell.
    remoteDbDumper -u <username> -p <password> -o <output dir> <url to backup and migrate>

    Note that the output dir must be an existing directory.

    1. Linux example:
      ./remoteDbDumper.sh -u user -p topsecret -o ~/backup https://ho.st/drupal/?q=admin/config/system/backup_migrate
      
    2. Windows example
      remoteDbDumper.bat -u user -p topsecret -o "%HOMEPATH%\backup" https://ho.st/drupal/?q=admin/config/system/backup_migrate
      
  4. Use the scheduling mechanism of your choice to call remoteDbDumper regularly, creating backups.

Example (Synology)

Just a short exemplary scenario of how to use remoteDbDumper on a Synology DiskStation (running DSM 4.2) to regularly back up a Drupal database.

  1. (if Java is not installed) install Java:
    If available for your DiskStation, use the Java Manager package. Otherwise, you could use a third-party Java package (that’s what I had to do).
  2. Download, extract and copy remoteDbDumper to the NAS (e.g. to \\diskstation\public\, which corresponds to /volume1/public/)
  3. SSH to the NAS and check if it works
    /volume1/public/remoteDbDumper-1.0/remoteDbDumper.sh -u user -p topsecret -o /volume1/someUser/ https://ho.st/drupal/?q=admin/config/system/backup_migrate
    
  4. (optional) Wrap the command line call in a shell script, e.g.
    BASEDIR=$(dirname "$0")
    "$BASEDIR"/remoteDbDumper-1.0/remoteDbDumper.sh -u user -p topsecret -o "$1" https://ho.st/drupal/?q=admin/config/system/backup_migrate
    
  5. Either use the web frontend or the crontab to schedule the backup.
    1. Web frontend:
      Go to http://diskstation:5000, (or whatever combination of host name and port you’re using)
      login as admin,
      click on Control Panel | Task Scheduler.
      Then click on Create | User-defined Script.
      Enter a task name, choose a user (preferably not root) and set up a schedule (e.g. every Sunday at 8 p.m.).
      Finally, enter the path to remoteDbDumper or to the wrapper script (4.), respectively. For the example above, the path would look like this:

      /volume1/public/dumpDb.sh /volume1/public/
      
    2. If you insist on doing it by hand, here’s what to enter in the crontab:
      vi /etc/crontab
      
      #minute hour    mday    month   wday    who              command
      0       20      *       *       0       enterUserHere    /volume1/public/dumpDb.sh /volume1/public/
      
    3. Set a marker in your calendar for the next scheduled run, to check if it worked.

Future tasks

In its current state, remoteDbDumper can only back up Drupal databases. Fair enough.

However, with just a little more effort it would be possible to extend remoteDbDumper to support additional web-based database administration tools, such as MySQLDumper, phpMyBackupPro, phpMyAdmin or phpPgAdmin.

In order to do so, just fork the repo on GitHub and implement the interface DbDump.

Android: Custom Rom – Installing Google Play only (Customize GApps)

Due to license restrictions, Google’s proprietary applications (Play, Talk, YouTube, etc.) don’t come pre-installed with Android custom ROMs.

That leaves you with two options:

  1. Being happy you got rid of all the bloatware, effectively achieving a “Google-free” Android.
  2. Google-ifying your custom ROM by installing the complete “Google stack” (GApps) separately.

But what if you prefer the Google-free alternative, yet you purchased apps on Google Play before and want to keep using them?

Or maybe you just realized after hours of searching that a lot of apps cannot be found on other app stores?

You have no choice but to reinstall Google Play. However, using the GApps package as is results in a bunch of apps and frameworks you don’t need if all you want is Play.

So the least you can do is install only the apps/services you really need, avoiding the bloatware mentioned before.

To achieve this, you have to “customize” your GApps package.

This post shows how.

[EDIT 2015-02-08: Paranoid Android offers GApps in different package sizes (Stock, Full, Mini, Micro, Nano, Pico). Using those might be easier than stripping them yourself. See [GAPPS][4.4.x] OFFICIAL Up-to-Date PA-GOOGLE APPS (All ROM’s) and [GAPPS][5.0.x][BETA] OFFICIAL Up-to-Date PA-GOOGLE APPS (All ROM’s). If those work for you, please leave a comment – I’m very interested in any experience.]

[EDIT 2016-05-23: The Open GApps Project now conveniently offers different variants (“aroma” to “pico”) for different Android versions (4.4 to 6.0) and different platforms (ARM, x86 and 64 bits each). You might want to give “pico” a try and skip the steps stated below 🙂 ]

Stripping Gapps

  1. Download the GApps bundle for your Android version
  2. Open the zip file (e.g. with 7-zip)
  3. Go to the system\app\ folder
  4. Delete all APKs you don’t want.
    If you want Google Play only, you must keep the following ones:
    • GmsCore.apk (Google Play Services)
    • GoogleLoginService.apk
    • GoogleServicesFramework.apk
    • Phonesky.apk (Google Play Store)
    If you don’t keep all four of them, Play will not work properly.
  5. Flash the zip file to your device (as described here, for example).
  6. Reboot
  7. Use Google Play
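If you’d rather script steps 2–4 than click through 7-Zip, the following sketch does the same thing (a hypothetical helper, not an official tool): it copies the GApps zip, keeping only the whitelisted APKs under system/app/ and passing every entry outside that folder through unchanged.

```python
import zipfile

# The four APKs Google Play needs (see the list above).
KEEP = {
    "GmsCore.apk",
    "GoogleLoginService.apk",
    "GoogleServicesFramework.apk",
    "Phonesky.apk",
}

def strip_gapps(src: str, dst: str) -> None:
    """Copy the zip at src to dst, dropping every APK under system/app/
    that is not in KEEP; all other entries pass through unchanged."""
    with zipfile.ZipFile(src) as zin, zipfile.ZipFile(dst, "w") as zout:
        for info in zin.infolist():
            name = info.filename
            if name.startswith("system/app/") and name.endswith(".apk"):
                if name.rsplit("/", 1)[-1] not in KEEP:
                    continue  # strip unwanted app
            zout.writestr(info, zin.read(name))
```

The resulting zip still has to be flashed as in step 5; whether your recovery accepts a modified zip (signature verification) depends on your setup.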

Further actions

If you’re a bit paranoid, I suggest using LBE Security Master to revoke permissions from Google Services Framework.

If you’re even more paranoid, don’t forget to delete your Google account from your device each time you’re done with Play 🙂

Additional packages worth mentioning

Before stripping GApps, you might consider using another nice feature (introduced with Android 4.1, I think) – offline voice typing. It provides robust voice recognition that doesn’t phone home and works without a network connection.

To use it, just leave the VoiceSearchStub.apk within the system\app\ folder of GApps before you flash it.

Update (2013/04/29): This seems to work at first (you can download the offline dictionaries and tap the microphone button), but then the actual voice recognition doesn’t work. Epic fail!

It’s much easier to just install the Google Search app via Google Play (once you have it installed as described above). It includes the option to download offline dictionaries and use voice recognition.

And again, if you’re a bit paranoid, better stop the app from phoning home – using, for example, LBE Security Master. 😉

Songbird/Nightingale: Improving search performance

Only just recently, I complained about an everlasting performance problem.

Shortly after writing this, I stumbled upon this nice tweak (thanks michaelvandeborne!):

  1. Click on File | New Tab
  2. Enter about:config, then promise that you’ll be careful.
  3. Enter songbird.dbengine.cacheSize
  4. Increase the value. Start with 5000.
    You might also try to increase or lower it a little and see if the performance increases any further.

 

Songbird/Nightingale: Exporting playlists

The playlist problem

As mentioned in my previous post I have been using Songbird/Nightingale for quite some time, in spite of the drawback mentioned in the post.

No matter whether using Songbird or Nightingale, one of my main problems remained the same: the playlists are trapped somewhere inside the library, with no way to export them as playlist files. Absolutely no way? That’s not the whole truth, however, as there are (or were) addons like Playlist Export Tool, Export My Playlists or FolderSync. Thanks to the developers, by the way – those addons were really useful to me!

Unfortunately, with every new Songbird release, all addons stopped working. In other words: whenever I made the mistake of updating, I wasn’t able to export playlists anymore. I actually don’t even know if there are any addons left that are compatible with the most recent version of Songbird.

The playlist solution

One more good thing about Songbird (and Nightingale as well) is that it uses an SQLite database. This allows accessing the Songbird database from a variety of programming languages without getting your hands dirty, and makes way for a “third-party” tool that can export playlists from the Songbird database without depending on the Songbird version. I developed such an exporter in Java and have been using it for some time to make my Songbird playlists available on my NAS.
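The core of such an exporter is a single SQL join. The sketch below is not songbirdDbTools itself, and the table and column names (media_items, simple_media_lists) are my reconstruction of Songbird’s library schema – verify them against your own .db file before relying on this:

```python
import sqlite3

def export_playlist_m3u(db_path: str, list_id: int) -> str:
    """Render one playlist (identified by its media_item_id) as M3U text.
    Schema names are assumptions about Songbird's library database."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT m.content_url "
            "FROM simple_media_lists AS sml "
            "JOIN media_items AS m "
            "  ON m.media_item_id = sml.member_media_item_id "
            "WHERE sml.media_item_id = ? "
            "ORDER BY sml.ordinal",  # preserve the playlist order
            (list_id,),
        ).fetchall()
    finally:
        con.close()
    return "#EXTM3U\n" + "\n".join(url for (url,) in rows)
```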

As I thought this exporter might be useful to others, I refactored the quick-and-dirty source code and published it on GitHub. So now, I’m proud to present songbirdDbTools, a Java-based playlist exporter for Songbird/Nightingale that was just released in its very first version. Hopefully, it will be of use to somebody else who was missing this functionality as much as I did 🙂

 

The name is a bit of an exaggeration at this point, as the tool provides only the export functionality. However, I put some effort into designing songbirdDbTools to be as extensible as possible. I have a couple of things in mind that would be useful.
For example, synchronizing playlists – that is, exporting not only the playlist but copying the member files as well. This might come in handy for synchronizing files to mobile devices.
Or finding zombies and ghosts (like the Exorcist used to do, three years ago). Another neat feature might be to find all the playlists a file belongs to.

If only I had more time!

So, just in case you’re interested in contributing: Fork songbirdDbTools on GitHub!

Songbird/Nightingale: Using Songbird database in Nightingale

Songbird vs Nightingale

I’ve been using Songbird ever since it was a promising, upcoming, cross-platform open source media player. Back then, I even had it running on a parallel installation of Windows and Fedora on (physically) the same library 🙂

Since then, it seems they cut the support for Linux 😦, and POTI Inc. (the company behind Songbird) seems to focus on mobile/web, losing more and more interest in the good old desktop version. At least, that’s what springs to mind when searching the Songbird web page for the desktop version.

getsongbird.com – where is the link to the desktop version?

In addition, there’s this everlasting performance problem, which seems to be inevitable as soon as your library reaches the magic 10k song limit.

Still, I like Songbird’s functionality, its open source nature and the addon system. Therefore I never got comfortable with iTunes, Amarok or whatever.
Only just recently, I came across a Songbird fork that looks pretty promising: Nightingale. It supports Linux and there still seems to be some development going on.

getnightingale.com – no need to search for the link to download

Trying Nightingale with your existing Songbird database, or even migrating to Nightingale, is fairly easy, as both the database and the addons seem to be compatible between the two.

That’s what worked for me (on Windows):

  • Back up songbird folders (just in case):
    • %HOMEDRIVE%\%HOMEPATH%\AppData\Local\Songbird2 and
    • %HOMEDRIVE%\%HOMEPATH%\AppData\Roaming\Songbird2.
  • Create symlinks from Songbird to Nightingale folders:
    • mklink /D %HOMEDRIVE%\%HOMEPATH%\AppData\Local\Nightingale %HOMEDRIVE%\%HOMEPATH%\AppData\Local\Songbird2
    • mklink /D %HOMEDRIVE%\%HOMEPATH%\AppData\Roaming\Nightingale %HOMEDRIVE%\%HOMEPATH%\AppData\Roaming\Songbird2

This should make your Songbird database available to both Nightingale and Songbird. I’d recommend not running both in parallel, though.
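The same linking can be scripted. Here is a hedged, cross-platform sketch (directory names as above are only examples; note that os.symlink requires elevated rights or developer mode on Windows):

```python
import os

def link_profiles(songbird_dir: str, nightingale_dir: str) -> None:
    """Point a Nightingale profile directory at an existing Songbird2 one."""
    if os.path.lexists(nightingale_dir):
        raise FileExistsError(
            f"{nightingale_dir} already exists; back it up and remove it first"
        )
    # target_is_directory matters on Windows only; it is ignored elsewhere
    os.symlink(songbird_dir, nightingale_dir, target_is_directory=True)
```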

Microsoft Robocopy vs Linux NAS: Robocopy Pitfalls

Intro

I have been using Microsoft Robocopy for years, as it is an easy (by now even a built-in) way to do incremental backups under Windows.

Until recently, I used to back up my data from an internal NTFS hard drive to an external NTFS hard drive. As of late, I’m the proud owner of a DS213+ NAS, which runs a Linux-based OS and Ext4 hard drives.

As I’m still looking for the perfect backup/versioning client (rsync on Windows?!), I decided to stick with Robocopy in the meantime. Unfortunately, my backup scripts, which have done a splendid job of incrementally backing up my data to an external hard drive for years, now do a full backup to the NAS every time.

As it turns out, there is not only one reason for this behavior, but two:

  1. Timestamp
  2. File size

Here are the solutions that fixed these issues (at least for me), as well as some additional hints on using Robocopy.

1. Timestamp

At first, Robocopy kept telling me NEWER or OLDER for each file (even though the files hadn’t changed), resulting in copying the files instead of skipping them.

Solution:

First, make sure that both the NAS and the client PC have the same system time (use an NTP server, for example).

If the problem still persists, a good solution is to make Robocopy use FAT file times (/FFT).

This results in a 2-second granularity, that is, a file is only declared NEWER or OLDER when there is a difference of more than two seconds between the source and the destination file. If this option is not set, a much finer (nanosecond-range) granularity is used. Obviously, Samba’s time granularity is not as precise, and therefore the time stamps hardly ever match.
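In other words, /FFT changes the equality test for time stamps to “within two seconds”. A minimal sketch of that rule (my formulation, not Robocopy’s actual source):

```python
def same_fat_time(src_mtime: float, dst_mtime: float,
                  granularity: float = 2.0) -> bool:
    """FAT-style comparison: time stamps within `granularity` seconds
    count as identical, so the file is skipped instead of recopied."""
    return abs(src_mtime - dst_mtime) <= granularity

# A 1.9 s Samba rounding error no longer triggers a copy:
assert same_fat_time(1000.0, 1001.9)
# A real change (more than 2 s apart) still does:
assert not same_fat_time(1000.0, 1003.0)
```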

2. File size

If your incremental backup works by now, skip the rest of the article.

As for me, after solving the above problem, the incremental backups still didn’t work.

Robocopy kept telling me CHANGED for most files. Out of the frying pan into the fire!

What does CHANGED mean? The answer can be found here:

The source and destination files have identical time stamps but
different file sizes. The file is copied; to skip this file, use /XC.

Skipping all files with different sizes? No, that’s a dangerous idea when backing up data. So what now?

But why do they have a different size at all? Here’s some file on the client PC:

SomeFile on the Client PC

And that’s the same file after transferring to the NAS:

SomeFile on the NAS

The attentive observer might have recognized that the size on disk is different.

The reason for this can be found in the different block sizes used by the NAS and the client. I was wondering at first, because I had set up both NTFS and Ext4 with a block size of 4k.

However, Samba has a default block size of 1k! So setting up Samba with an explicit block size that matches the one of your client PC solves this issue.
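The “size on disk” each side reports is simply the logical file size rounded up to a whole number of allocation blocks, which is why the same file looks bigger through a 4k file system than through Samba’s 1k default (the 5000-byte figure below is just an illustration):

```python
def size_on_disk(size: int, block_size: int) -> int:
    """Round a file size up to a whole number of allocation blocks."""
    blocks = -(-size // block_size)  # ceiling division
    return blocks * block_size

# The same 5000-byte file, as each side reports it:
print(size_on_disk(5000, 4096))  # NTFS client, 4k blocks -> 8192
print(size_on_disk(5000, 1024))  # Samba default, 1k blocks -> 5120
```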

How?

SSH to your (Synology) NAS.

 vi /usr/syno/etc/smb.conf

Press i.

Enter this line below the [global] tag (use the block size of your host file system, e.g. 4k = 4×1024 = 4096):

         block size = 4096

Press ESC, then enter :wq and press ENTER.

Restart the Samba server:

/usr/syno/etc/rc.d/S80samba.sh restart

That solved my problems and I can now do incremental backups again –
until I finally set up the perfect rsync-for-Windows solution 🙂

Alternative solution for 1. and 2.

There is, however, an alternative to the solutions for 1. and 2.:

Use the archive bit. Each file has an archive bit; every time you change the file, the bit is set. Robocopy can utilize this behavior: the /m switch makes it copy only files whose archive bit is set, resetting the bit on each source file it copies, and skip all files whose archive bit is not set. That is, it copies only the files that changed since the last backup. No need to care about nasty time stamps or stupid file sizes.

There is one drawback, however: when you want to make a full backup, or you back up your data to several devices, you must not use the /m switch, or your backups will be incomplete.
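A toy model of the archive bit (a simulation, not the Windows API) makes the drawback obvious: the first /m run clears the bits, so a second destination sees nothing left to copy.

```python
def backup_with_m(files: dict, destination: set) -> None:
    """Simulate robocopy /m: copy files whose archive bit is set,
    then clear the bit on the source."""
    for name, attrs in files.items():
        if attrs["archive"]:
            destination.add(name)
            attrs["archive"] = False  # /m resets the bit after copying

files = {"a.txt": {"archive": True}, "b.txt": {"archive": True}}
first, second = set(), set()
backup_with_m(files, first)   # both files copied, bits cleared
backup_with_m(files, second)  # nothing copied: bits already cleared
```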

Additional Hints

While I’m at it, here are the Robocopy options I use. See Robocopy | SS64.com for the complete reference.

robocopy "<source>" "<dest>" /MIR /V /NP /TEE /LOG:"%~f0.log" /Z /R:10 /W:10 /FFT /DCOPY:T
  • /MIR – MIRror a directory tree, that is, copy all subfolders and purge extra files on the destination
  • /V – Verbose output log, showing skipped files
  • /NP – No Progress. Don’t flood log file with % copied for each file.
  • /TEE – Output to console window, as well as the log file.
  • /LOG:”%~f0.log” – Output status to file <name-of-batch-file>.log, overwrite existing one.
  • /Z – Copy files in restartable mode (survive network glitch)
  • /R:10 – Number of Retries on failed copies – default is 1 million. Reducing the number is helpful when trying to copy locked files.
  • /W:10 – Wait time between retries – default is 30 seconds. Reducing the number is helpful when trying to copy locked files.
  • /FFT – Assume FAT File Times, 2-second date/time granularity. See above.
  • /DCOPY:T – Copy Directory Timestamps. Why is this not default? You definitely want to keep your directory timestamps!