Integrating Blipstar with a backend database to dynamically update your store locator

Sync with your database and avoid manual uploads


Overview: auto-syncing your internal location data with Blipstar using a CSV file

For most businesses, store locations change infrequently, so uploading a revised spreadsheet every few months is not an issue. But if you're managing a large number of stores, or your information changes regularly, it can become cumbersome - especially if your store location data is held in an internal database of some description. Exporting it as a CSV file and then importing that into your Blipstar database is rather inefficient. If only there were a simple way to sync your store locator with your internal database...

Put your data online and Blipstar can do the rest

Well, there is a way - it's not particularly elegant but it works!

So what's involved? Basically (see screenshot)...

  1. You (or your web-team) write a script that dynamically and periodically queries your internal database
  2. The script generates a CSV file of store locations (in the format understood by Blipstar)
  3. That CSV file is then made available at a URL on your web server
  4. You tell Blipstar the URL of the file (so it knows where to look) on the "Locations" page
  5. Every hour or so Blipstar fetches the CSV file and updates your store locator accordingly
  6. So now, if you add a store to your internal database it will automatically appear in your store locator
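Steps 1 and 2 above can be sketched in a few lines of Python. This is a minimal illustration, not a definitive implementation: it uses the built-in sqlite3 module as a stand-in for your internal database, and the `stores` table and its column names are hypothetical - substitute your own schema and database driver. The Name and Address headings follow the same format as a manually imported CSV file.

```python
import csv
import sqlite3

def export_locations(db_path, csv_path):
    """Query the (hypothetical) stores table and write a CSV file
    in the same column format used for manual imports."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute("SELECT name, address FROM stores ORDER BY name")
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Name", "Address"])  # headings as per manual import
        writer.writerows(rows)
    conn.close()
```

Run a script like this on a schedule (step 1's "periodically"), writing the CSV to a location your web server publishes, and steps 3-6 follow from there.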

Implementation details

One nice thing about this approach is it doesn't matter what database you use (MySQL, SQL Server, Oracle...) or your language of choice (Java, PHP, Perl, Python, .NET, C#...) - as long as you can dynamically generate a CSV file at timed intervals (e.g. once a day), you're in business. CSV is one of the simplest file formats going, and most programming languages provide a way to query a database and write text to a file.


  • The accepted column headings (e.g. Name, Address) are exactly the same as when manually importing a CSV file.
  • To avoid unwanted users accessing your data you should publish the CSV file at a hard-to-guess URL (so add lots of random digits, letters and symbols).
  • If your web server supports compression (most do) make sure it is enabled.
  • Make sure your web server serves the file with the text/csv MIME type (again, most do).
  • If you operate in a Linux server environment, a cron job is ideal for running scripts at regular intervals.
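For the hard-to-guess URL mentioned above, most languages can generate a suitable random token. A small Python sketch (the filename scheme is just an illustration, not anything Blipstar requires):

```python
import secrets

def secret_csv_name(prefix="stores"):
    """Return a hard-to-guess filename for the published CSV.

    token_urlsafe(24) yields a 32-character URL-safe random string,
    which is more than enough to defeat casual guessing.
    """
    return f"{prefix}-{secrets.token_urlsafe(24)}.csv"
```

Generate the name once, publish the CSV under it, and give that URL to Blipstar; don't regenerate it on every export, or the link you registered would stop working.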

As an example we've put a test CSV file at the following URL: ... To link to a live file of your own, follow the steps below.

Live sync entry screenshot
Setting up a link to your online CSV data.
  1. Go to Locations
  2. Enter the URL in the appropriate box (see screenshot)
  3. Select the auto-update option
  4. Click "Import"

That's it! As we said, the process won't win any awards for innovation, but as a platform-neutral way of syncing data it does the job. If you have any questions about setting up your live data link, just let us know via the contact page. We also provide an option where you can FTP the CSV file if that works better for you - contact us for details.

Providing store locator solutions since 2006.