Running tmp and swap from your fast SSD in AWS

Ensure your SSDs are mounted on /mnt

# mount -l

Add an 8GB swapfile to /mnt

# cd /mnt
# /bin/dd if=/dev/zero of=swapfile bs=1M count=8192
# chown root:root swapfile
# chmod 0600 swapfile

Activate your new swapfile

# mkswap /mnt/swapfile
# swapon /mnt/swapfile
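
You can check the new swap is live straight away (a quick sanity check; the output format varies by distro):

# swapon -s
# free -m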

I symlink /tmp to the mnt directory as well. Note the existing /tmp needs moving out of the way first, otherwise ln will just create the link inside it (as /tmp/tmp):

# mkdir /mnt/tmp
# chown -R root:root /mnt/tmp
# chmod 1777 /mnt/tmp
# mv /tmp /tmp.old
# ln -s /mnt/tmp /tmp

Ensure both these are happy at reboot by adding/modifying these lines in /etc/fstab

/dev/xvd[X] /mnt ext3 defaults,usrquota 0 0
/mnt/swapfile swap swap defaults 0 0
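
To test the swap entry without a full reboot, you can cycle it from fstab (assuming there's enough free memory to swap everything back in):

# swapoff -a
# swapon -a
# swapon -s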

PS: If you haven’t got any ephemeral storage attached to your EC2 instance, you’ll need to create an AMI from that instance, modify the attached volumes and create a new box attaching the EBS drives to it. A little bit of downtime, but hopefully worth it for the snappy swap space.

Ref:
http://www.centos.org/docs/5/html/5.1/Deployment_Guide/s2-swap-creating-file.html
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html#Using_AddingDefaultLocalInstanceStorageToAMI

Fedora preupgrade notes

I’ve just upgraded my Fedora 16 to 17 using the lovely preupgrade tool.

I’ve never had major problems with this before, but there were a few complications thrown in this time that I thought worth documenting.

I’m running a fully encrypted system (LUKS) on a Dell M4600 laptop. This has worked really well with a single extra password prompt on bootup, but when it came to using preupgrade, there were a few issues to cleanup before I could upgrade smoothly, silkily and successfully.

After running preupgrade the system wouldn’t reboot automatically

This was solved by adding reboot=pci as a kernel boot option to the 'GRUB_CMDLINE_LINUX=' line in /etc/default/grub (this is where grub2 stores its default options), then regenerating the grub config with the grub2-mkconfig command below.
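
For reference, the line in /etc/default/grub ends up looking something like this (the '...' stands in for whatever options you already have, so yours will differ):

GRUB_CMDLINE_LINUX="... reboot=pci"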

Ref: http://fedoraproject.org/wiki/GRUB_2

# grub2-mkconfig -o /boot/grub2/grub.cfg

After rebooting into upgrade, no upgradeable install found

This was most likely down to a firmware issue in the Dell’s SATA controller: my partition tables were slightly corrupted, causing preupgrade to bail with a ‘GPT corrupt’ error. Not enough to cause any issues with day to day running, but enough to be a problem when using preupgrade. I switched the SATA options in the BIOS from ‘Raid On’ to ‘AHCI’. This prevents the corruption from reoccurring, and fortunately was safe for me to do with my setup. I then followed the detailed instructions in this FedoraForum post to restore my partitions. The first post details the repair and the tools needed (gdisk, parted).

Ref: http://forums.fedoraforum.org/archive/index.php/t-272868.html

After reaching the upgrade steps, ‘Unknown release on <drive label>… Product mismatch. Version mismatch’

This was solved by re-installing the fedora-release package on the F16 install.

$ sudo yum reinstall fedora-release
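
A quick way to confirm the release file is back in place before rerunning preupgrade:

$ cat /etc/fedora-release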

However, as is always the case, each time I read around whilst trying to solve issues on Linux, I find a drastically better way of doing things. This time round I discovered rpmconf – a tool for helping you track down and merge all those .rpmnew and .rpmsave config files left behind after package updates.

With the following command line options you can find all the files needing attention and open them in merge tools like meld or vimdiff. Much better than my previously cobbled-together scripts!

$ sudo yum install rpmconf
$ sudo rpmconf -a -fmeld

So all in all a bit of a mission, but what can I say, I’m addicted to free upgrades!

Make way, Git coming through

iptables -A OUTPUT -o eth0 -p tcp --dport 9418 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A INPUT -i eth0 -p tcp --sport 9418 -m state --state ESTABLISHED -j ACCEPT
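
Port 9418 is the native git:// protocol. If you clone over SSH or HTTPS instead, the same pattern applies with ports 22 or 443, for example:

iptables -A OUTPUT -o eth0 -p tcp --dport 443 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A INPUT -i eth0 -p tcp --sport 443 -m state --state ESTABLISHED -j ACCEPT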

GitHub… it’s pretty cool

That is all

Plesk passwords and all that Jazz

I’ve been using the MySQL console tip from this blog post by Brian Resig for years (maintaining a few legacy sites), and struggled to find it this time. For fear of losing it entirely, I’m copying it here. I’ll thank myself one day, I’m sure!

mysql -u admin -p`cat /etc/psa/.psa.shadow` psa -e "SELECT accounts.id, mail.mail_name, accounts.password, domains.name FROM domains LEFT JOIN mail ON domains.id = mail.dom_id LEFT JOIN accounts ON mail.account_id = accounts.id WHERE mail_name='ENTER MAIL NAME HERE'"
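
A variant I find handy is pulling every mailbox for a given domain rather than a single mail name. Assuming the same psa schema (and with 'example.com' as a placeholder), something like:

mysql -u admin -p`cat /etc/psa/.psa.shadow` psa -e "SELECT mail.mail_name, accounts.password FROM mail LEFT JOIN accounts ON mail.account_id = accounts.id LEFT JOIN domains ON mail.dom_id = domains.id WHERE domains.name='example.com'"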

Developing a custom content system

We’ve worked with Practical Action for many years developing their site and online presence, but back in 2010 they took the bold step to completely overhaul their site. This naturally involved considering both changes to the design and the system on which the site was built. After considering a few different systems the team decided on a bespoke path that led to the creation of a custom Content Management System that really establishes new foundations for the site to develop over the coming years.

Starting with the redesign

The design process was exciting, and working on such a large site was always going to be a challenge. The core challenge, as ever, is to find a way to effectively deliver all the content through an interface that pleases the visitor, helping them to clearly find and read content.

The designs needed to streamline the flow of content, simplify the navigation and still present a lot of material. Lots of time had to be given to planning content layout, and we managed to nestle plenty of featured content into carousels on the home page and on all the category home pages, as well as embracing a tab system for breaking up in-page content into usable groups. It helped that Practical Action had commissioned a report on their old site which provided some great recommendations, including how menus should be constructed (one tier only for main menus). This helped direct early decisions, and after a number of design concepts the final design was born.

CMS Highlights

We can’t go into too much detail about all the tech behind the CMS (that will follow in future posts), but here’s a snapshot of some of the features that have been created as part of this solution:

XML at the core

The key objective was to build a system that was not restricted to the web – a flexible system for the future. To achieve this we needed to move away from the standard web editors and so devised a way to create content in XML. In essence we’ve developed a publishing system, rather than a web content management system. Editors write content which is stored as XML, then our CMS, depending on the chosen template, translates the XML into whatever media the end user requests. Currently the site is only delivering the content as web pages (HTML), however given the XML system underneath, this content could just as easily be delivered to mobile browsers, used to create PDFs ready for print or screen, or even syndicated to third-party services.

Reusable Content

Content is an asset and shouldn’t be locked to a page, so given our ‘publishing’ mindset we needed to disconnect ‘pages’ from ‘content’. The system allows content creators to work their magic using the editor interface as with any CMS, but to publish it they must assign the content to the site map, which pulls it through to the site. This ‘node’ architecture enables content to be repurposed in different parts of the site, allowing different ‘leaves’ or pages to use the same content. In future this same content could be used by a mobile sitemap, or even by a PDF system – allowing content to be written once and automatically distributed to multiple platforms.

Cross Linking Content

There’s no point having loads of content unless visitors can easily get to it, so another key goal of the new system was to provide a flexible way for content creators to build lists of relevant content from their articles.

We developed a pretty powerful query builder that allows editors to create a list of content based on all sorts of criteria. They can list content from a particular branch of the sitemap or list content that matches certain keywords.

Practical Action’s content is also tagged with information such as language, content type (case studies, technical briefs, annual reports etc) and audience (Development practitioners, teachers, MEP/MPs etc), all of which are available to the query builder, allowing the editor to build content relevant to the reader.

This feature is also being developed to automatically suggest content based on what the visitor is reading, much like how a shop suggests other products that you might be interested in.

Previewing Media

It was important that fixed media, i.e. PDFs, Word documents etc, could be previewed from within the site, rather than requiring the visitor to download the file to view it.

We found a document viewer and built it into the CMS in a way that automatically converts media files into a format suitable for the viewer, allowing visitors to scroll through the media content from within the website, with the option to download it if that’s what they are after.

Currently only PDFs are previewed through the viewer, but plans are already underway to extend this to support other media files.

Delivering a scalable solution

The previous site was getting around 80,000 visitors a month and had thousands of pages, so this was a large site to begin with. Our solution needed to account for the growth plans of the client (i.e. delivering a system that could be easily managed and facilitate the new content) whilst also supporting the increased traffic and keeping the site quick to load. To do this we needed to embrace the latest tech and tools available to us to deliver a responsive site that would provide us with the foundations for future development. We knew from the outset that we’d be building the system on CakePHP, a leading rapid development framework. Well-formed code wasn’t enough though, so we embraced the cloud technologies at Amazon to deliver the site on a load-balanced hosting platform. A complex caching system was devised to keep request times down, with the cache synchronised across all servers in the cluster. All amazing tech we will write more about, but that’s just the start…

Delivering content around the world

The site carries a great deal of media that is accessed by users across the world, so it seemed only sensible to build the CMS on a Content Delivery Network (CDN). When a user publishes media from within the CMS, the system pushes a copy of that file to the CDN. The website, when listing media, uses the CDN version of the file, allowing the visitor to download it from a web server in their region (rather than having to transfer that data all the way from the UK). Essentially the CDN means that there are copies of the media on servers across the world, and when a user wants a file, Amazon will let them download it from the location closest to them so the file downloads quickly. We’ve also embraced the CDN’s access control via the CMS to allow certain documents to only be available for download when a visitor is logged in – very slick. Needless to say any images and files used by the site’s core code are all delivered via the same CDN to ensure load times are kept snappy.

Creating little workers to bear the load

There’s so much going on behind the scenes of the CMS that we wanted to create a window into the world of “little helpers” so users can see everything that gets carried out to ensure the site is blazing. The principle behind developing a process queue was that we didn’t want administrators having to wait for the CMS to finish doing all it needs to do when a page is published. So from the user’s perspective they click a button and see a response – job done. But behind the scenes the CMS’s little workers busy away to:

  • Create the necessary thumbnail images (used on the listings) for the content
  • Add the content information to the search index
  • Push any associated media to the Content Delivery Network (CDN)
  • Convert any associated media to the format required for the site’s document viewer

Migrating from old to new

What’s a shiny new site with all these great features if there’s no content? Probably one of the largest and most difficult tasks of the project was to import the existing content from the old site to the new site, as well as import all the documents and carry across existing user accounts. To give you some idea of the scale, here are some approximate figures:

  • 4,500+ web pages
  • 3,500+ documents (PDFs, Word documents etc)
  • 64,000+ user accounts

The process wasn’t without its glitches; cleaning the old data and preparing it in a format for the new system was quite time consuming. Remember we were moving from a traditional CMS that had allowed users to create content in HTML, to a new system that wanted to remove that dependency from content and reformat it as XML. We also needed to ensure that old content would be presented in the new layouts, so we devised an import system that allowed the administrators to pick an old page and convert it into a new page. This would involve selecting one of the new templates available and assigning old content to the various parts of the new page. Let’s not forget the sheer time needed to process all the physical documents and ensure they were all uploaded and synchronised with the CDN correctly. We were even able to recreate the user accounts on the new CMS without having to reset any of their credentials, so they were all able to log in with their details without any potential blockers to them accepting the new system.

Check out the site for yourself: http://practicalaction.org

Building Magento around Booksonix

Practical Action Publishing approached FLUID7 to redesign their Development Bookshop. It was an exciting opportunity to move them off an ageing OSCommerce site and into a shiny new Magento install. To really spice things up we wanted to connect their shop with their book management software, Booksonix, creating a slick solution that pulls new and updated titles from Booksonix into Magento automatically and cutting out the need for the team to manage titles in both applications independently.

You can check out the new bookshop here: http://developmentbookshop.com/

Redesigning the Bookshop

The new design pulled in core styles that had been created for the recent group site redesign for Practical Action, presenting enough similarities in the layout and navigation to reinforce the group connection, whilst also creating a unique colour scheme and content elements that let it stand out from the group and stand alongside its bookshop peers.

The Magento store offers Practical Action Publishing a firm ecommerce platform to promote and distribute their titles from. It includes specific listings for each of the publishers that they sell titles on behalf of, such as Oxfam. Not only can titles be viewed exclusively by publisher, but standard category listings can also be filtered by publisher, allowing the visitor to navigate the site in a number of different ways to find the titles of interest.

With the designs in place we turned our attention to the build. Anyone working with Magento knows it’s a real beast to wrestle, but once tamed it yields exceptional results. After developing a number of Magento sites for our clients we’ve already overcome the hosting issues. Magento is very large and requires a hefty server to allow it to run quickly, so we’ve got a sweet server over at Amazon specifically tuned for our Magento sites. Yet the challenge of importing and managing data from Booksonix would prove pretty tricky.

Importing data from Booksonix

Early on we discovered that Booksonix didn’t have a live XML feed; instead they FTP an XML file to your server on a nightly basis. So instead of being able to use a nice XML import module already created for Magento, we had to write a custom importer that would pick up any un-imported XML files (and their associated jacket images) and push all that data into Magento.

Booksonix XML consists of a range of shortcodes to represent all the fields managed by their book management software. It took some time to determine which fields were relevant to the client’s store-front requirements, and with a little processing we were able to format the data and assign it to fields in our Magento site. Probably the hardest challenge was developing a way for the client to manage the store categories from within Booksonix. We were able to utilise the custom fields of Booksonix to allow the client to assign titles to categories. When the importer runs it will now create a new Magento category (if it doesn’t already exist) and assign all corresponding titles to it.

It took a little time to run the first few imports given the sheer volume of data (and image files) that had to be processed, but after a lot of tweaking our importer now runs daily, reading the files from Booksonix and importing anything that is new into Magento.

Managing the Magento Store

Whilst the importing of data into a Magento store dramatically enhances the efficiency of the site, there’s still a fair amount that needs to be handled manually within the Magento environment. Core Magento features allow the client to select featured titles and title of the month, leaving bestsellers to be driven by sales. Practical Action Publishing are also responsible for managing offers and processing sales, but the beauty of this solution is that now they can focus on selling the titles with reduced time managing the data behind the products.

This article doesn’t do the coding behind this solution any justice. We ourselves certainly underestimated the challenge of connecting Booksonix with Magento, but we are extremely proud of our achievements (and grateful to the client for bearing with us as we worked through the issues). The benefit of managing data in one place (Booksonix) and having other systems (Magento) use that data automatically is a model we try and replicate wherever possible, using technology to simplify the work of our clients (and ourselves).

FLUID7 have moved to Electric Wharf

We’re delighted to announce that over the summer the FLUID7 team moved into the Cable Yard at Electric Wharf. Our new studio provides us with the creative space to develop and grow the company as we begin to see the fruits of the partnership between WebJetty and FLUID7 (formed back in October 2010).

Eco-Friendly Studio, overlooking the Canal

We love the eclectic estate that we’re part of. Electric Wharf is an attractive (award winning) canal-side environment close to Coventry City Centre. Previously an early Victorian power station, the new development of modern offices fuses the industrial brickwork and steel of old with modern architecture, so there’s lots of recycled material both in the build and as art around the estate.

Our new studio is within The Cable Yard, which is a completely new build that complements the rest of the development. These eco-offices have been built with solar-boosted heating and hot water systems, rainwater recycling and energy efficient construction, with the aim of reducing energy consumption by 35% and water use by 25%. The stunning floor-to-ceiling windows give us a great view over the canal and the city beyond it, and because they are south-facing the studio really does benefit from the sun.

Moving wasn’t without its challenges

All the normal challenges of getting services installed (broadband, phone, electric) fade into insignificance against the actual moving experience for us. We decided to move overnight to minimise disruption to clients. It took two nights to move everything over and we were able to keep our old studio operational during the process.

However, on moving the final computers over to the new office ready for our first working day in the new pad, we decided things had gone a little too smoothly, so we dropped our server to spice things up. The casing took a beating, as too did the hard drives within it. So our first working day was spent trying to get the server to power up, then trying to get the hard drives to mount in a different machine… the saga continued, but we were able to work without the server while we got a new machine sorted and restored the data.

If that wasn’t fun enough, we arrived at the Cable Yard a week later to be met by the other tenants and a lovely policeman waiting outside the property. Some cheeky chaps must have seen us moving in and thought it would be a good idea to break in and pinch all our lovely Macs. All the offices within the building were affected, but it was a tough time saying goodbye to the beautiful beasts that had been the backbone of our design department for years. Fortunately the development team work from laptops which were all out of the office at the time, so work commenced as usual despite embarking on a delightful security exercise of resetting our online passwords (and client passwords) to ensure data couldn’t get into the wrong hands.

New pad, New Tech

We’re yet to properly kit out the studio as we’d like it and naturally as we’re preparing for growth there’s a lot of fun toys that we’d like to bring into the workplace (including the replacement kit from the break-in!).

To lay some foundations for this new tech, we thought long and hard about broadband and phone systems. We were keen to get away from BT as we’d struggled to get quick support from them in the past, and the nail in the coffin was their inability to let us take our old phone number with us half a mile down the road (as the new office was on a different exchange).

We finally settled on Spitfire for our broadband and they were really helpful setting everything up for the move. To sort the phones out we worked with Tino at Forza IT and settled on a Voice over IP (VoIP) solution from Babblevoice. The system we’ve got in place has allowed us to keep our old number and grow from only taking one call at a time to handling multiple calls at once. Better still, as well as physical handsets we can set up our iPhones as additional handsets, meaning the team can stay connected even outside the new studio.

Keeping it slim, the ultimate virtual office

The key advantage of our hosted VoIP solution is that we’re not restricted by physical hardware within our studio. We manage our calls via a web interface, and with the ability to make/receive calls from our iPhones (using the business phone number), we can literally operate from any internet connection. Add to this the concerns over our break-in, and we got to thinking: what if we could set things up to allow us to operate our entire studio from anywhere we needed to? In the instance of a break-in, or fire, we want to have the confidence that all our data is offsite and that we can literally plug in at home or in a different office and resume production without loss of time or data. We already use a lot of Software as a Service (SaaS) within the company, such as FogBugz & Trello for team management, Kiln (Mercurial) for code versioning, BrowserStack for testing etc. So we’re on a cloud-based journey to further slim down our reliance on the physical building we’re in and set ourselves up with the ability to respond quickly to whatever we face in the future. A key aspect of this new journey is to go paperless, which is quite a challenge but we’re excited by the flexibility it will bring.

If you want to know more about the cloud-based services we’re tapping into (including the powerful cloud hosting we offer from Amazon), or have experience you can share of using VoIP or going paperless, please get in touch!

The Talent Business Redesign

Designer Ben Merrington got us on board to deliver his designs for thetalentbusiness.com. This redesign for The Talent Business, provider of senior talent to communications agencies and creative businesses around the globe, needed to utilise the latest web tech. The result is smooth HTML5 & CSS3 transitions, some pretty special Ajax content loading, and it’s all built on our in-house CMS.

On the outside

This really was an exciting contract to work on. Not only are the designs great, but the requirement to make the experience as slick as possible allowed us to get really geeky with the latest techniques. Take a look at the site, www.thetalentbusiness.com, to check out some of the features:

  • Smooth fades between new content, smooth rollovers on the home page, and smooth menus in the main site. It’s all very smooth…
  • When in the main site, we load all content via Ajax, meaning the browser doesn’t have to reload the entire page each time. Instead, when you click a link, you’ll notice that only the content that is changing fades in and out; the header and menu remain visible at all times. Getting really geeky, we even update the web address as the new content loads, so you always know where you are in the site.
  • We also created some neat contact cards to appear when you hover over a member of staff, allowing the core layout to remain clean whilst providing some extra contact details when required.

As well as using HTML5, CSS3 and Ajax to create the required experience, we also made sure as much of the experience as possible could be recreated using jQuery (if the visitor’s browser doesn’t support HTML5).

On the inside

Building this on our in-house CMS helped us deliver the functional requirements of the clients, including:

  • Handling the profile images automatically. Our CMS auto-crops images to fit the designs and desaturates them to allow them to appear in greyscale. So staff don’t need to get into Photoshop to update a profile.
  • Staff profiles and other core content can also be created once and used many times, allowing TTB admin to reuse content across the site without the hassle of duplicating and managing multiple copies.
  • Our CMS’s multi-theme system also allowed us to create the CREAM microsite from within the same CMS, allowing all content to be managed through one system whilst looking pretty different on the outside.

The beauty of this project for us is that it uses advanced techniques to subtly improve the visitor’s experience without shouting too hard that, under the surface, it’s doing stuff pretty differently to conventional sites. If you like what you see and have a potential project, or want to get geeky and chat more about the tech involved, please throw your comments below!

jQuery live(‘submit’) and form serialize() issues in IE

OK, it’s been busy in the F7 camp, making it very difficult to find time to blog, hence this being my first development blog entry since before Christmas… MAD… Admittedly we are still as busy as house elves, but I came across some IE voodoo with jQuery and wanted to share it with you, or at least record it so I can refer back!

Using: jQuery 1.6.2

Right – I was trying to catch the submit event on a form. The form is loaded with Ajax, so I use the jQuery .live() method.

$("#myForm").live('submit', function() {
    $.post($(this).attr('action'), $(this).serialize(), function(data) {
         $('#dialog').html(data);
    });
    return false;
});

I use .serialize() to grab all the form data, convert it to a URL-encoded string and Ajax-post it to my PHP form handler. Worked great in good browsers, Chrome and FF etc. In Internet Explorer, the live(‘submit’) handler was hit, but serialize() returned an empty string!

I spent hours trying to find a solution, ripping through Google for an answer, and there were a few different suggestions I’d like to share that didn’t help me, but could help you:

  • Check your markup! Make sure you have closed your FORM tag
  • Also make sure your form isn’t embedded in another form, best to just make your markup valid really :) Although that still doesn’t solve all IE problems…
  • Also instead of returning false, you can use event.preventDefault();

This is all good to do anyways and seems to be the solution to everyone else that I’ve seen encounter the same symptoms, but here’s what sprung my form submit to life:

	$("body").each(function() {
		$("#myForm", this).live('submit', function(e) {
			e.preventDefault();
			$.post($(this).attr('action'), $(this).serialize();, function(data) {
				$('#dialog').html(data);
			});
		});
	});

I found the .delegate() method on the jQuery site, which pointed me to this way of using the live() method (as above) – delegate() is essentially a shortcut for it.

Essentially, wrap your live(‘submit’) inside a parent’s ‘each’ loop. Note the ‘this’ var in the selector:

       $("#myForm", this).live ....

Hope that helps someone!

Can open source web applications increase the ROI of your website?

Open Source web applications can respond quickly to changes in web trends and technologies, allowing the software to be widely tested and regularly updated – and all for FREE!

For those not familiar with the term open source, it describes practices in production and development that provide open access to the end product’s source materials. (Video: Stephen Fry introducing open source software)

In choosing the right technologies for your website you may have already come across some of the leading open source content management systems (CMS) such as Drupal, WordPress and Joomla. The continuing growth and success of these products is greatly attributed to the open nature of the projects, where any developer can contribute to fixing problems and build enhancements. So as well as the software being free to use, regular updates are also made available by the global community of developers to improve security and usability – allowing these products to respond quickly to changes in security and trends.

Despite these advantages of open source products, can we really trust them with our online sales? Magento is proving we can.

Magento is an enterprise level e-commerce and CMS application which has been designed with big businesses in mind – the likes of Nokia, Xerox, Adidas and Samsung have all adopted Magento at a commercial level. However, despite this commercial arm, Magento has actually been built on open source technologies and is available for free, making a very powerful system affordable to SMEs.

Unlike other older e-commerce platforms, Magento has been built from the ground up on modern technologies, meaning it not only offers improved customer and administrative usability but does so with more focus on SEO. The result is that the software proactively helps to organise products and content in a way that search engines can lock into, helping to organically drive new visitors to your shop.

We’ve had fun developing a few sites in the latter part of 2010 on Magento and are very excited by the results. Why not check them out:

  1. http://dinkyinc.co.uk
  2. http://reefjewellery.co.uk
  3. http://loose-fit.co.uk