Leaving HomeSeer for Home Assistant

In the ever-evolving landscape of home automation, myriad platforms beckon with promises of smarter, more efficient homes. Among these, my journey began with HomeSeer, a choice influenced by recommendations and its compatibility with my alarm panel, albeit at an additional cost. The initial setup was manageable, and with some effort, I successfully integrated the alarm panel, marking a modest start to my smart home experience.

Over time, my system expanded, incorporating Z-Wave, Sonoff switches, and sensors. However, this expansion was not without its challenges. The Sonoff devices were erratic, often miscommunicating their status to HomeSeer, while my foray into Z-Wave was fraught with issues – devices that would initially function well, only to inexplicably duplicate, rendering my carefully crafted automations ineffective. The lack of a backup option for Z-Wave devices in HomeSeer meant the only solution was a complete reset and reconfiguration of the Z-Wave network, a Sisyphean task that offered only temporary respite.

Compounding these frustrations was the fact that HomeSeer 3 required an active login on a Windows PC – a platform not renowned for its unwavering reliability. Daily restarts became routine, necessitating additional software for auto-login and HomeSeer 3 launch. Yet, even this workaround was insufficient to prevent random disconnections from the alarm panel, the heart of my system, requiring frequent manual intervention to maintain functionality.

The release of HomeSeer 4 brought promises of improvements and a modernized interface. Unfortunately, the reality fell short of expectations. The new version, burdened by a significant upgrade cost, brought more frequent crashes and compatibility issues with previously functional plugins.

By the summer of 2023, my patience had waned. Frustrations with unreliable automations and the tedium of manual overrides led me to explore alternatives. My search narrowed to Hubitat, Home Assistant, and OpenHAB. Hubitat, while competent, was limited by its closed-source nature and support for third-party integrations. In contrast, Home Assistant and OpenHAB offered more promising prospects. After delving into their codebases, I settled on Home Assistant, attracted by its robust third-party support and the empowering nature of its open-source framework.

The transition to Home Assistant was not without its pains. Deciphering the functions of numerous zones, sensors, and outputs from the alarm panel was a daunting task. However, integrating Z-Wave devices with Home Assistant was refreshingly straightforward – a stark contrast to my previous struggles. Similarly, incorporating my Sonoff devices, now running Tasmota, required some manual configuration, but the process was logical and well-documented.

Recreating my automations in Home Assistant was a time-intensive endeavor, yet the platform's power and flexibility facilitated complex, conditional automations with ease. The intuitive interface and support for scripting with Jinja2 allowed for sophisticated, dynamic interactions within my smart home ecosystem.

A week into using Home Assistant, the system's stability and the reliability of its third-party integrations were evident. The dashboards were intuitive to set up, and integrating custom MQTT protocols was straightforward. As I waited for the inevitable hiccup, it never came – Home Assistant operated flawlessly.

Emboldened by this success, I invested in a Home Assistant Yellow, Nabu Casa's flagship hub, to support their ongoing development and subscribed to their cloud service. The transition to this new hub was seamless, encapsulating the user-friendly ethos of Home Assistant.

With Home Assistant, my aspirations for my smart home were no longer hindered. Tasks that were once insurmountable with HomeSeer became feasible. Integrating Alexa for announcements and remote routine triggering, once a distant dream, was now a reality, accomplished with minimal effort.

Six months into my journey with Home Assistant, I felt compelled to share my experience. This narrative is not a critique of either platform but a personal account of my challenges and triumphs in home automation.


Outlook.com exceeds Google as a source of SPAM for a third month running

It's no surprise to anyone that Gmail has held the record for the most spam on a fairly consistent basis over the last few years, but in recent months Outlook.com has overtaken Google for the top spot.

Our anti-spam and antivirus product lets us track spam both from automated detection and from our customers' reports, allowing us to monitor trends in spam sources over time. Generally, when a domain, usually a company with a poorly secured mail server, is detected originating spam, we're quick to blacklist it until the volume of mail hitting the gateway drops back to a reasonable level. With providers like gmail.com and outlook.com, however, we can't just block the whole domain or the Help Desk would implode with tickets. In those cases we can only blacklist individual senders, and that's a race we can never win, since spammers can register new accounts quickly and will switch as soon as they detect their emails are bouncing. So we offer customers a choice: block both *@outlook.com and *@gmail.com (and their local variants) outright, or not; where not, we still block offending senders as quickly as we can.

As you would expect, neither Microsoft nor Google takes any notice of spam reports, so our blocking decisions have to stand in isolation. We do, of course, offer our block lists to anyone who wants to clone them, and we report spam to various public reputation and blacklist sites.

As an option, customers who do want gmail and outlook senders can use server-side rules to route mail from outlook and gmail domains into a 'spammy' folder, keeping the majority of spam out of the inbox, but it's a stopgap really.
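The server-side routing described above boils down to a decision on the sender's domain. A minimal sketch in Python; the function, folder names and domain list are illustrative only, not our actual rule syntax:

```python
# Hypothetical sketch of the server-side routing described above. The folder
# names, domain list and function are illustrative only, not our actual rules.

FREEMAIL_DOMAINS = {"outlook.com", "gmail.com"}  # plus local variants in practice

def route_message(sender, whitelist=()):
    """Return the destination folder for a message based on its envelope sender."""
    sender = sender.lower().strip()
    if sender in whitelist:          # individually trusted senders still arrive
        return "INBOX"
    domain = sender.rsplit("@", 1)[-1]
    if domain in FREEMAIL_DOMAINS:   # a *@outlook.com / *@gmail.com style match
        return "Spammy"
    return "INBOX"
```

A whitelist keeps individually trusted free-mail senders in the inbox while the rest of the domain is filed away, which mirrors the choice we offer customers.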

So, whilst the rest of the planet is working hard to reduce spam, Google and Microsoft between them account for 43% of it, and this won't change as long as 'free-mail' is a thing. Just for the record, GEN used to have a free-mail product back in the 90s called GENMAIL, but we retired it a decade ago for this very reason: abuse. Of course, Google and Microsoft aren't doing it for free. Google digests the email content its Gmail users send and receive to help target advertising at them, and it's likely that Microsoft does the same.

If you are a business and can't stretch to the few pounds a month it costs to have your own domain and email, seriously consider how many potential customers will never receive your replies because of blacklisting or spam detection, and make the smart choice.




© E&OE


Brexit and Your Data


I was onboarding a new customer today, a B2C company in Nottingham, and mentioned that their website and email were hosted and stored in the USA; since their customers create accounts on their website, it might be an idea to consider hosting that data within the UK. The customer had no idea this was the case, or indeed of the implications. They had been sold a "make it yourself" website from a well-known provider who threw in email for a dollar a decade, but never considered where the actual data was being stored, or from where the website was being served. In principle there's no problem hosting data in the USA, but for B2C businesses within the UK (and Europe) the General Data Protection Regulation imposes certain requirements on those businesses to protect customer data.

Since the introduction of GDPR, end users have a right to expect their personal information to be stored and protected within the framework of the legislation: that reasonable steps have been taken to protect it from undue exposure, and that rights of access, review and removal are granted. The regulator (the ICO) has the power to compel companies based in the UK and European Union to comply with this legislation or face severe penalties. A company outside the UK/EU has no such obligations, nor can it be compelled to do anything by the regulator.

This is where it all gets very muddy indeed. As a consumer within the UK you have rights under GDPR, but only in respect of companies operating within the UK. That is, if you purchase something from Fluffy Chicken Limited, they will receive and process your personal data such as name, address, phone number, card numbers and so on, and you would rightfully expect the processing of that data to be covered by GDPR. However, if Fluffy Chicken Limited's online shop is hosted in the USA, your data is stored in the USA and the order is fulfilled by an American company, then Fluffy Chicken Limited does not process your data at all, effectively removing any protection you may think you have under GDPR. Even if Fluffy Chicken Limited did process your data, they would have received it from an American company, not from you.

When the UK leaves the European Union in a couple of months' time, data hosted in Europe may no longer be in scope for GDPR. In all probability, GDPR will only apply to companies within the UK and data stored within the UK.

The thin line between storage, processing and regulation

I think everyone's aware that a company within the UK that serves consumers has to comply with GDPR, which makes total sense, but where do the obligations end?

Assume a UK company pays for an online shop service provided by an American company: customers visit that company's website and place orders, and those orders are then transmitted to the UK company. Is the American company under any obligation under GDPR? No. In fact, the UK company is only responsible under GDPR for the personal information it receives from the American company. Should the American company also provide order fulfilment (as is the case with Amazon), the UK company would have no data protection responsibility under GDPR at all.

In our experience, customers are on the fence on this one. Some have deliberately migrated data and fulfilment offshore, whereas others have chosen a blended solution.


We've been involved in several large-scale migrations from the UK to elsewhere in the EU ahead of the Brexit deadline. Companies that operate throughout the EU have to decide whether being based in the UK is the right choice, and for many it's not: there could be arbitrary tariffs, customs regulation, or other restrictions.

With our supply line's heavy dependence on Hewlett Packard, we're expecting stock shortages and shipping issues, since most of our spare parts come from the EU.

And for anyone with a .eu domain name: unless you have offices in the EU, you will lose it at the end of this year.






FreePBX as a route to intrusion and data breach


The History

FreePBX has been around for decades, was one of the three popular Asterisk GUIs, and was our choice for years. Asterisk itself has been around longer, and we've been providing and supporting it since version 1.6. FreePBX has been constantly developed and enhanced with functions and features, providing a framework for building an Asterisk dial plan configuration behind a nice GUI, and it provides include files that can be leveraged to add custom dial plans whilst maintaining general management via the GUI.

Up until 2015, FreePBX was in a constant development cycle, with regular updates, fixes and features primarily provided by Schmooze, a Wisconsin-based developer who offered very good commercial support and some commercial modules to monetise the operation. These commercial modules could (mostly) be purchased with a 25-year licence, and for many this was a great way to get some excellent commercial features for a sensible charge.

In January 2015, Sangoma acquired Schmooze, and from that point onwards development slowed, updates slowed, and commercially licensed modules stopped updating. Today, development on FreePBX seems to have slowed significantly, and even the blog postings on freepbx.org have stopped. Sangoma are still selling the same modules and support, but I'd hazard a guess that their commercial SwitchVOX product is taking priority, which is a shame.

The Breach

Back to the title. We were investigating a data breach at a company (not an existing customer), working backwards from the epicentre, their MySQL server, to the source. We had already identified that the MySQL server login had been 'discovered' and leveraged to select data from a range of tables, with the queries originating from the FreePBX box. We removed the FreePBX box, imaged it, and then returned it. Analysing the image, we could see activity under the root login using the mysql -u command to access the company's remote MySQL server.

I won't bore you with the nitty-gritty of the FreePBX box compromise; let's just say it was running PHP 5 on CentOS 6.5, as the majority will be, simply because the upgrade path is fraught with issues. The backup/restore function does most of the job, but there will almost always be some manual correction, or even a fresh install and reconfigure, which dissuades operators from upgrading unless it's absolutely critical. Combine this with the relatively complex setup of NATted SIP/RTP and you get the bad practice of putting FreePBX on the dirty side of the firewall, which this was.

Once the FreePBX box was compromised, there were numerous opportunities to pillage the configuration for upstream SIP credentials (stored in the clear), as well as extension and voicemail passwords, voicemails, and other data. The hacker had created an inbound route on the switch directing a DDI call to a DISA endpoint, allowing them complete system access. There was also evidence of numerous reconfigurations of inbound routes for unknown reasons. I fully expected the hacker to create or compromise an extension, pretend to be 'IT', and leverage credentials out of the staff, but instead they simply dumped the Asterisk database and found the MySQL server credentials stored in the clear in the superfectaconfig table. CallerID Superfecta is a module that, given a CID, can pull information from various sources (as plugins) and use it to enrich the caller ID information passed through to the endpoint. It's not a bad module, it works, and it's not the author's fault that it stores passwords in the clear (although there are more secure options, such as two-way TLS), but the clear risk is that any credentials you enter into it can be retrieved fairly easily and exploited.

For this customer, their MariaDB database contained their customers, contacts, quotes, invoices, contracts and pricing, all of which was sold on. This breach was highlighted by one of their competitors picking up the phone and notifying them that they were offered the data for a very moderate fee, which was a very honest and professional thing to do. 

The Risk

VoIP servers are often overlooked by risk managers because they are thought to be 'isolated' from the things that matter, but as we can see in this specific case, a simple CID lookup provided everything needed to compromise the main database server and export the lot. Some may comment that the MySQL login should have been restricted to a certain table or view, but in reality that just doesn't happen often in the wild, and even if a DBA created a view, created a user restricted to the PBX box, and granted it SELECT only on that view, you've still given the would-be hacker a valid, operational login to your MariaDB/MySQL database that could be exploited. ANY authenticated connection between server A and server B creates a possible route for compromise, and you should consider carefully the risk and reward of each.

I'm not sure what the future holds for FreePBX in the hands of Sangoma. We could see a community-supported fork, much in the way of MariaDB, or Sangoma could re-ignite development and clear some of the 802 open issues; we hope so.

IF YOU ARE RUNNING FreePBX and don't have an active support agreement then get one and ensure...

  • It's running the latest version of FreePBX.
  • It's running on Centos 7 or later.
  • It's behind a firewall with SIP/IAX NAT'ed, and firewalld is set up and configured.
  • Apache is restricted to the LAN.
  • Do NOT give CallerID Superfecta or CIDLookup credentials to your database server. If you MUST use caller ID lookup, then push a limited table of data to the FreePBX server's MariaDB database and query it there. This is not hard to do, and once set up it will function just the same as a remote lookup while maintaining isolation between the FreePBX box and the company database(s).
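The last point above, pushing a limited lookup table onto the PBX itself, can be sketched as follows. This is a hypothetical illustration using Python, with an in-memory sqlite3 database standing in for the PBX's local MariaDB; the table and column names are made up for the example:

```python
# Sketch of the local-lookup pattern: replicate only a limited table of
# numbers and display names onto the PBX host and query it locally, so no
# credentials for the main company database ever live on the PBX.
# sqlite3 stands in here for the PBX's local MariaDB instance.
import sqlite3

def build_local_cache(rows):
    """Create the pushed-down lookup table (number -> display name)."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE cid_lookup (number TEXT PRIMARY KEY, name TEXT)")
    db.executemany("INSERT INTO cid_lookup VALUES (?, ?)", rows)
    return db

def lookup_cid(db, number):
    """Return the caller name for a number, or None if unknown."""
    row = db.execute(
        "SELECT name FROM cid_lookup WHERE number = ?", (number,)
    ).fetchone()
    return row[0] if row else None
```

The push itself would be a periodic one-way export from the company database to the PBX; the PBX only ever holds the columns it needs, and a compromise of the box reveals no route back into the main database.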

If you want to be really comprehensive in your network security policy, then segregate the FreePBX box between two firewalls, creating a VLAN for it. This way you have SIP, RTP and IAX NAT'ed to the internet and your upstream providers, with specific firewall rules allowing traffic to and from those providers ONLY. The internal LAN firewall allows only SIP/RTP traffic between the FreePBX box and the network segment with your IP phones on it, and HTTP(S) traffic to the segment with your users on it. Everything will work just fine, but with some extra hardware (or even just iptables/firewalld) you've reduced the possible paths to compromise to a negligible level. Anything that talks to the outside and allows incoming connections, even if it's just your VoIP server, is a risk that needs to be managed. This isn't just a FreePBX issue; FreePBX is quoted here simply because it was the route to compromise in this case, but any VoIP server has the same risk factors and should be considered equally.

If you found this interesting, comment and/or like. If you need help and advice with your FreePBX server, use the forums for free community assistance or the HelpDesk for priority support.


Torrent Sites - The History, Mistakes and Failures

The History

Since the invention of the Internet, software piracy has been a staple activity online, and with broadband came media piracy of music, TV and movies. Before BitTorrent, both software and media were shared on download sites (many of which have since been shut down), but this was problematic: site owners quickly disabled links generating high traffic, so pirated downloads shifted rapidly from site to site, and downloaders were forced to search numerous links to find one that worked.

The appearance of Napster in 1999 promoted a distributed sharing scheme where files could be stored across many hosts and downloaded in chunks. Napster, now long gone, spawned a host of lookalike peer-to-peer sharing programs such as Gnutella, BearShare, LimeWire, Kazaa, Grokster and many more.

The appearance of BitTorrent in April 2001, thanks to Bram Cohen who designed the protocol, allowed large files to be shared in a different way. Instead of one site hosting the file and downloaders fetching it from there, BitTorrent splits the file into hundreds (or thousands) of chunks, and those chunks are spread over hundreds (or thousands) of hosts. Everyone who downloads the file then shares pieces of it with other users, quickly creating a distributed source for the file across many hosts.
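The chunking described above can be sketched briefly: a torrent's metadata records a SHA-1 hash for each fixed-size piece of the file, so a client can verify each piece independently no matter which host it came from. A minimal illustration in Python (the function is mine, not part of any BitTorrent library):

```python
# Illustrative sketch of BitTorrent-style piece hashing: split the payload
# into fixed-size pieces and record each piece's SHA-1 digest, as a .torrent
# file does, so pieces fetched from different peers can be verified.
import hashlib

def piece_hashes(data: bytes, piece_length: int) -> list[str]:
    """Split data into pieces and return each piece's SHA-1 digest (hex)."""
    return [
        hashlib.sha1(data[i:i + piece_length]).hexdigest()
        for i in range(0, len(data), piece_length)
    ]
```

Because every piece is independently verifiable, a downloader can pull different pieces from many hosts at once and still be sure the reassembled file matches the original, which is what makes the swarm model work.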

BitTorrent, however, had no way within the protocol to 'advertise' a searchable list of available files. This was deliberate, not a flaw in its design. To find files to download, websites sprang up offering searchable lists of links; each link points a BitTorrent client at a tracker (a service that maintains a list of the chunks available on various internet hosts), usually run by the same site, from which the file can be downloaded. These websites operated free-to-use, with ad revenue providing some income.

Whilst these BitTorrent index sites had a core following, the vast majority of internet users were completely unaware they existed. Fast forward a few years: by 2003, ThePirateBay.org had been founded by the Swedish organisation Piratbyran. After a year of operation the site indexed 60k torrent files, and by the end of 2005 it indexed 2.5M, although the actual availability of those files was significantly lower. Regardless, this file-sharing infrastructure did not go unnoticed by the Americans and their media companies, who, through the Motion Picture Association of America (the MPAA herein), began a legal assault on such sites even though they were outside the USA.

The Mistakes

In 2006, the raid on The Pirate Bay's servers by Swedish police, admittedly after pressure from the USA, hit the global news headlines, and millions of new users were introduced to the world of torrents. The Pirate Bay, after having its servers seized, was down for a full three days before reappearing online with a significantly increased user base thanks to the media coverage. If there was ever a moment when BitTorrent became mainstream, this was it.

Having failed miserably to make any significant impact on The Pirate Bay, the MPAA and others began a campaign targeting home users for downloading the content, many of whom were children, but this proved totally ineffective as well as being a source of major embarrassment.

In the years that followed, the MPAA and associates targeted various torrent indexing websites and pursued the operators until the sites eventually closed, most notably Suprnova, TorrentSpy, LokiTorrent, BTJunkie, What.cd, Mininova and many others. The remaining, and probably most popular, sites are still active, which at the time of writing are Demonoid, KickassTorrents and, of course, The Pirate Bay.

In the next, clueless assault by the American-led coalition of media conglomerates, governments were leveraged to enforce DNS blocks on the domain names of sites like The Pirate Bay. These court orders were imposed on major ISPs, requiring them to block lookups for the domains and instead show a page telling the user why the site was blocked. This was not only a clueless waste of everyone's time and money, it drove torrent indexing sites into a huge array of mirrors, each simply replicating the content of the source. The Pirate Bay and many similar sites also started hosting anonymous onion sites on the Tor network, which circumvented any blocks and gave visitors anonymity.

This action had zero effect on visitors to their chosen torrent sites, because lists of valid mirrors sprang up to direct traffic, and it created a network of mirrors, and mirrors of mirrors, all sharing and distributing torrent magnet links, effectively rendering any further DNS blocking pointless.

In July 2016, KickassTorrents, probably the second most used torrent index site, was shut down, again by actions from the USA, soon to reappear as katcr.co, run by the same people as the original site. Once again the change was widely publicised on the internet, and users' searches were quickly redirected to the new site. The site moved again to katcr.to shortly after, and this seems to be its new home. There were, of course, a stream of court orders forcing ISPs to block these and other KickassTorrents domains, but an impressive list of mirrors immediately sprang up to carry traffic around the blocks.

The Landscape

Tor, The Onion Router project, began in 2006 as a way to browse the internet anonymously. All other ways to browse the internet are certainly not anonymous: your ISP tracks every site you visit, every port you open and every file you download; various government entities further track your activities, both by leveraging ISPs and by sniffing traffic at interconnects; and apps and software are equally guilty of rampant privacy violations. Tor provides a simple framework for anonymous browsing by routing your traffic securely to an endpoint far away, often in another country. Tor, which can be downloaded for free from www.torproject.org, easily and efficiently circumvents any DNS-level blocking forced on ISPs by court orders, and the sustained growth in Tor users suggests this is not unknown.

Regardless, at the time of writing The Pirate Bay, KickassTorrents and their many mirrors are still online, still serving requests a decade later. Knowing that BitTorrent and its index sites are here to stay, we should consider why they are popular, the motivation for piracy, and the real-world effect it has on content creators.

The MPAA and the other interested parties generally allege in court that every movie shared on BitTorrent is a loss to the industry, but to anyone with common sense that's simply untrue. For many, probably the vast majority of downloaders, if BitTorrent weren't available they simply wouldn't watch the movies, or would wait until they could buy bootleg copies. Given this, the actual loss to the industry is small, and in fact BitTorrent is diminishing the market for copied DVDs.

BitTorrent users who download copyright material divide into two distinct groups: those who cannot afford or cannot access the media they are downloading, and those who can afford to buy it but choose to download instead.

You will not be surprised to learn that the second group, those who can afford to buy but download instead, is a tiny minority, and this minority is the only group actually depriving media producers and content creators of their duly deserved revenue, which is a relatively small amount. So why the chaotic and disproportionate attack on these sites and their operators? The question has yet to be answered, but you could theorise that it's a general lack of understanding of how the internet works and what the actual target is, combined with organisations desperately trying to appear to be doing something, even if it's completely ineffective. A wise man once told me that the only winners in any litigation are the lawyers, and there's probably an element of that here too.

The Recap

Torrent sites like The Pirate Bay and KickassTorrents don't host any files; they simply catalogue a list of links, and those links point to trackers, which in turn point to the actual files. Trying to close down these sites is now impossible, because the approach thus far has simply fragmented them into thousands of mirrors. Trying to intimidate the downloaders results in nothing but embarrassment. Trying to block sites using DNS is pointless, because they are all accessible via mirrors and, of course, Tor.

The Solution

The solution is a blindingly simple one: make content available legally and affordably. In the UK, for example, our 'network' TV is poor by any standard, carrying only out-of-date content with endless repeats, and our broadcast TV is even worse. Our Netflix and Amazon Prime are vastly cut-down versions of those available to USA viewers, and networks like HBO, SyFy, ABC, etc. are simply unavailable, full stop. So if you want access to current content, what options do you have? BitTorrent satisfies a demand simply because it exists.

In the future, I hope that ALL video content will be available through aggregated networks like Netflix and Amazon Prime, to all countries and people, at monthly rates they can easily afford. Sites and networks that want users to 'buy or rent' a movie or show will slowly die out, as will the use of peer-to-peer for sharing copyright media. You only need look at Spotify or Amazon Music to see that this model works in the real world: listen to any music you like, at any time, for a small monthly cost.

There will doubtless be 'groups' who disagree with that assessment, but those 'groups' will also be aligned with the organisations who feel they are somehow wronged by BitTorrent and peer-to-peer. Content creators and producers are going to have to re-think the way media is distributed and licensed, instead of desperately trying to hang on to a system that is no longer fit for purpose in the 21st century.

It should be noted that neither the company nor I wish in any way to promote the distribution of copyright material; there are laws in place making such activity illegal in some countries. This article simply explains, to those with an interest, how and why it happened, and possible solutions.

The Wreckage

A list of UK court orders to date forcing internet service providers to block websites at the DNS level. Can you imagine the amount of money paid to lawyers to prepare, apply for and execute these orders, and then the cost to independent ISPs of changing their systems to implement them?

Date of Court Order: 27/04/2012
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: The Pirate Bay

Date of Court Order: 05/07/2012
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: Newzbin2

Date of Court Order: 28/02/2013
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: KAT or Kickass Torrents websites

Date of Court Order: 28/02/2013
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: H33t

Date of Court Order: 28/02/2013
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: Fenopy

Date of Court Order: 26/04/2013 and 19/07/2013
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: Movie2K and Download4All

Date of Court Order: 01/07/2013
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: EZTV

Date of Court Order: 16/07/2013
Identity of parties who obtained the Order: The Football Association Premier League Limited
Blocked Websites: First Row Sports

Date of Court Order: 08/10/2013
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: Abmp3, BeeMp3, Bomb-Mp3, eMp3World, Filecrop, FilesTube, Mp3Juices, Mp3lemon, Mp3Raid, Mp3skull, New AlbumReleases, Rapidlibrary

Date of Court Order: 08/10/2013
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: 1337x, BitSnoop, ExtraTorrent, Monova, TorrentCrazy, TorrentDownloads, TorrentHound, Torrentreactor, Torrentz

Date of Court Order: 30/10/2013
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: Primewire, Vodly, Watchfreemovies

Date of Court Order: 30/10/2013
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: YIFY-Torrents

Date of Court Order: 30/10/2013
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: Project-Free TV (PFTV)

Date of Court Order: 13/11/2013
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: SolarMovie, Tube+

Date of Court Order: 18/02/2014
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: Viooz website, Megashare website, zMovie website, Watch32 website

Date of Court Order: 04/11/2014
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: Bittorrent.am, BTDigg, Btloft, Bit Torrent Scene, Limetorrents, NowTorrents, Picktorrent, Seedpeer, Torlock, Torrentbit, Torrentdb, Torrentdownload, Torrentexpress, TorrentFunk, Torrentproject, TorrentRoom, Torrents, TorrentUs, Torrentz, Torrentzap, Vitorrent

Date of Court Order: 19/11/2014
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: Watchseries.It, Stream TV, Watchseries-online, Cucirca
Demonoid, Torrent.cd, Vertor, Rar BG, BitSoup, Torrent Bytes, Seventorrents, Torrents.fm, YourBittorrent, Tor Movies, Torrentz.pro, Torrentbutler, IP Torrents, Sumotorrent, Torrent Day, Torrenting, Heroturko, Scene Source, Rapid Moviez, Iwatchonline, Los Movies, Isohunt, Movie25, Watchseries.to, Iwannawatch, Warez BB, Ice Films, Tehparadox

Date of Court Order: 20/11/2014 (expired on 11/11/2018)
Identity of parties who obtained the Order: Cartier International AG, Montblanc-SImplo GmbH, Richemont International S.A.
Blocked Websites: CartierLove2U, IWCWatchTop, ReplicaWatchesIWC, 1iwc, MontBlancPensOnlineUK, MontBlancOutletOnline

Date of Court Order: 5/12/2014 (expired on 05/12/2018)
Identity of parties who obtained the Order: Cartier International AG
Blocked Websites: Pasmoldsolutions, PillarRecruitment     

Date of Court Order: 17/12/2014
Identity of parties who obtained the Order: Members of BPI (British Recorded Music Industry) Limited and of Phonographic Performance Limited
Blocked Websites: Bursalagu, Fullsongs, Mega-Search, Mp3 Monkey, Mp3.li, Mp3Bear, MP3Boo, Mp3Clan, Mp3Olimp, MP3s.pl, Mp3soup, Mp3Truck, Musicaddict, My Free MP3, Plixid, RnBXclusive, STAFA Band

Date of Court Order: 29/4/2015
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: afdah.com, watchonlineseries.eu, g2g.fm, axxomovies.org, popcorntime.io, flixtor.me, popcorntime.se, isoplex.isohunt.to, eztvapi.re, eqwww.image.yt, yts.re, ui.time-popcorn.info

Date of Court Order: 7/5/2015
Identity of parties who obtained the Order: The Football Association Premier League Limited
Blocked Websites: Rojadirecta, LiveTV, Drakulastream

Date of Court Order: 21/5/2015
Identity of parties who obtained the Order: Members of The Publishers Association
Blocked Websites: Avaxhm, Ebookee, Freebookspot, Freshwap, Libgen, Bookfi, Bookre

Date of Court Order: 25/2/2016 (expired 31/01/2019)
Identity of parties who obtained the Order: Cartier International AG and Montblanc-SImplo GmbH
Blocked Websites: Perfect Watches, Purse Valley, Montblanc Ebay, Montblanc.com.co, Replica Watches Store

Date of Court Order: 5/5/2016
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: Couchtuner, MerDB, Putlocker, Putlocker Plus, Rainierland, Vidics, Watchfree, Xmovies8

Date of Court Order: 14/10/2016
Identity of parties who obtained the Order: Members of the MPA (Motion Picture Association of America Inc)
Blocked Websites: 123Movies, GeekTV, GenVideos, GoWatchSeries, HDMovie14, HDMoviesWatch, TheMovie4U, MovieSub, MovieTubeNow, Series-Cravings, SpaceMov, StreamAllThis, WatchMovie

Date of Court Order: 08/03/2017 - (expired on 22/05/2017)
Identity of parties who obtained the Order: The Football Association Premier League Limited (“FAPL”)
What is blocked by the Order: Various Target Servers notified to Virgin Media by FAPL or its appointed agent from the date of the Order for the duration of the FAPL 2016/2017 competition season

Date of Court Order: 25/07/2017 (expired 13/05/2018)
Identity of parties who obtained the Order: The Football Association Premier League Limited (“FAPL”)
What is blocked by the Order: Various Target Servers notified to Virgin Media by FAPL or its appointed agent for the duration of the FAPL 2017/2018 competition season.

Date of Court Order: 20/11/2017
Identity of parties who obtained the Order: Twentieth Century Fox Film Corporation, Universal City Studios Productions LLP, Warner Bros. Entertainment Inc., Paramount Pictures Corporation, Disney Enterprises, Inc., Columbia Pictures Industries, Inc.
What is blocked by the Order: Couchtuner.fr, Couchtuner.video, Fmovies, MyWatchSeries.ac, SockShare, WatchEpisodeSeries.com, WatchSeries.do, WatchSeries-Online.pl, YesMovies, Yify-Torrent

Date of Court Order: 21/12/2017 (expired 26/05/2018)
Identity of parties who obtained the Order: Union des Associations Européennes de Football (“UEFA”).
What is blocked by the Order: Various Target Servers notified to Virgin Media by UEFA or its appointed agent from the date of the Order for the duration of the UEFA 2017/2018 competition season.

Date of Court Order: 18/07/2018 (expired 13/05/2019)
Identity of parties who obtained the Order: The Football Association Premier League Limited (“FAPL”)
What is blocked by the Order: Various Target Servers notified to Virgin Media by FAPL or its appointed agent for the duration of the FAPL 2018/2019 competition season

Date of Court Order: 24/07/2018 (expired 12/07/2019)
Identity of parties who obtained the Order: Union des Associations Européennes de Football (“UEFA”)
What is blocked by the Order: Various Target Servers notified to Virgin Media by UEFA or its appointed agent for the duration of the UEFA 2018/2019 competition season

Date of Court Order: 20/09/2018
Identity of parties who obtained the Order: MATCHROOM BOXING LIMITED, MATCHROOM SPORT LIMITED
What is blocked by the Order: Various Target Servers notified to Virgin Media by Matchroom or its appointed agent up to and including 1 October 2020.

Date of Court Order: 28/11/2018
Identity of parties who obtained the Order: QUEENSBERRY PROMOTIONS LIMITED
What is blocked by the Order: Various Target Servers notified to Virgin Media by Queensberry Promotions Limited or its appointed agent up to and including 1 December 2020.

Date of Court Order: 15/07/2019
Identity of parties who obtained the Order: The Football Association Premier League Limited (“FAPL”)
What is blocked by the Order: Various Target Servers notified to Virgin Media by FAPL or its appointed agent for the duration of the FAPL 2019/2020 competition season

Date of Court Order: 16/07/2019
Identity of parties who obtained the Order: Union des Associations Européennes de Football (“UEFA”)
What is blocked by the Order: Various Target Servers notified to Virgin Media by UEFA or its appointed agent for the duration of the UEFA 2019/2020 & 2020/2021 competition seasons



Why not having a real switch matters

Many people believe that network traffic is just like water down a pipe: it's all data and it can only go back and forth. Actually that's nowhere near reality. In fact, every packet of information that traverses your network consists of at least two layers, a packet within a packet, with most data being three layers: a packet within a packet within a packet. The reason for this is the separation of protocol and transport. To clarify that a little, the transport is most likely "Ethernet" and the protocol is most likely "TCP/IP", but it doesn't have to be. When you purchase network equipment you'll be buying an "Ethernet Card" for your workstation, an "Ethernet Switch" for the network and so on, so it's clear that the transport is "Ethernet", but even here there are differentiations. 

The standard for Ethernet

The current standard is IEEE 802.3 within which we can have a selection of physical connectivity with different cable types and wiring requirements. Let's look at a few common ones. 

10Base5 was the first real "Ethernet" implementation, over thick RG-8 co-ax cable and commonly known as "Thicknet". This transport was the standard for DEC and other mini/mainframe systems from the early days. Thicknet required an external transceiver for every connection, which converted the RG-8 co-ax into a D-type AUI connector that then connected to the computer or equipment. 

10Base2 was where "Thin" Ethernet first came to the market. Using the much thinner and easier to work with RG58 co-ax cable, Thinnet quickly became the standard for local area networks back in the '90s. There were no 'switches', only 'hubs' (sometimes called concentrators), which were just dumb connections between runs of co-ax. Towards the end of the life of Thinnet a few manufacturers such as 3Com did release more intelligent routing equipment, but with 10BaseT on the horizon uptake was limited. 

10BaseT was the first structured cabling specification, providing 10Mbps maximum over two of the four pairs in the cable, and introduced the Category 3 specification (often just called CAT3). In the past, 10Base2 and 10Base5 consisted of a long run of co-ax with computers hooked into that long cable. When something on the network broke, everything broke, and fault finding a problem on a long run strung through several offices was a real pain in the backside. Structured cabling did away with that: instead, every computer had its own cable back to the hub or concentrator, which were now a little smarter. Fault finding was now as simple as unplugging each computer until everything worked again - much better. 

100BaseT(X) was an upgrade to 100Mbps over CAT5, but it also ushered in the 'network switch', which was like a hub but actually had some intelligence. In the world of the hub, each packet sent to port 1 was sent out on port 2, port 3, port 4, port 5..... port 24, and the computers on those ports simply ignored the packet if it wasn't for them. The switch did away with all that nonsense and instead watched packets on its ports and built up a table of the devices on each, often called the MAC (or CAM) table. Using this table, the switch could now send the data ONLY to the port that hosted the destination computer. This magic also greatly reduced the traffic on other segments (connections between switches), so the whole network was significantly more efficient. The more expensive, so-called "managed" switches allowed network engineers to log in to them and see traffic statistics, errors, activity, etc, which greatly reduced the need for engineers to be standing in front of equipment in order to monitor its operation. 
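The learning behaviour described above can be sketched in a few lines of Python. This is a toy model, not a real switch, and the port numbers and MAC addresses are made up for illustration:

```python
# Toy model of a learning switch: it watches the source MAC of each
# incoming frame to build its table, then forwards only to the port
# that hosts the destination, flooding like a hub when the destination
# is still unknown.

class LearningSwitch:
    def __init__(self, num_ports):
        self.num_ports = num_ports
        self.mac_table = {}          # MAC address -> port number

    def handle_frame(self, in_port, src_mac, dst_mac):
        """Return the list of ports the frame is sent out on."""
        self.mac_table[src_mac] = in_port            # learn the sender's port
        if dst_mac in self.mac_table:
            out = self.mac_table[dst_mac]
            return [] if out == in_port else [out]   # one port, not all of them
        # unknown destination: behave like a hub and flood every other port
        return [p for p in range(self.num_ports) if p != in_port]

sw = LearningSwitch(4)
print(sw.handle_frame(0, "00:11:22:33:44:55", "66:77:88:99:AA:BB"))  # flood: [1, 2, 3]
print(sw.handle_frame(2, "66:77:88:99:AA:BB", "00:11:22:33:44:55"))  # learned: [0]
```

The first frame floods because the switch has never seen the destination; the reply goes straight out of port 0 because the switch learned that address from the first frame.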

100BaseFX carries the same 100Mbps traffic, but instead of using copper wires it uses fibre. 

1000BaseT, the Gigabit Ethernet standard, provides 1Gbps at full duplex and requires CAT5e or later. This is probably the most common implementation in use today. 

Other Standards

The 802.5 specification introduced "Token Ring", a 4 or 16Mbps network that operated as a long loop where each computer would relay data that wasn't destined for it. Token Ring was big with IBM and at the time was probably one of the more reliable infrastructures, but it came with a stiff price tag. 

The 802.11 specification is for wireless transmission and has a number of sub-sections such as A, B, C, F, G, M and AC. The actual physical requirements of sending data over the air are different from those of wired networks, and yet the original "Ethernet" took its name from the 'ether', a nod to the radio networks that inspired its design. That aside, 802.11 is a complex and evolving specification providing ever-increasing transmission speeds and distances. 

Back to the Data

So now we know that different physical connectivity, such as a long string of co-ax, a token ring, or wireless, all use different methods to send and receive data, and yet they can all carry TCP/IP, or Vines, or NetBEUI, or IPX, so how does that work? Well, it's really simple: the OSI model names the physical data transport as Layer 2, Layer 1 being the actual voltage/frequency/waveforms of the signals used on the wires or over the air. The network cards, hubs and switches all send and receive 'data' using Layer 2. In Layer 2, all devices on the network have an address, but it's not an IP address; it's a MAC (Media Access Control) address. You will see this MAC address shown on the back/bottom of any router, managed switch and network card. The MAC address is the physical address of the device, and if you were to tap into the network and monitor traffic you would see nothing but Layer 2 data: from physical address 01-23-45-67-89-AB to 01-23-45-81-22-C4, for example. The switches keep track of these physical addresses and deal only with these packets. 

The actual 'data', which can be TCP/IP, is then encapsulated (enclosed) within this physical layer data, and it's the job of the network endpoints (computers and servers) to translate the TCP/IP address into the MAC address, stuff the data into the Layer 2 packet and then send it. Likewise, upon reception the endpoints will extract the TCP/IP packet from the Layer 2 packet and then pass that on to the operating system. This may seem over-complicated but in fact it's essential. TCP/IP is not the only protocol in use today AND Ethernet is not the only physical transport. By separating the physical and the data we solve a world of problems and can have TCP/IP travelling over Ethernet, WiFi, fibre, CDMA or GSM without having to care how it gets there. Likewise, we can have a range of protocols co-existing on the physical network without any impact on its operation. To use an analogy, consider sending a letter to a friend. Layer 1 would be the roads, the postbox, the postman's van, the sorting office, various hands and machines. Layer 2 would be the envelope, and the address on the envelope is the physical address; the contents of the envelope would be the Layer 3, or TCP/IP, data. The envelope, being Layer 2, doesn't care how it gets to the physical address written on it, and the note inside, being TCP/IP, has no concept of how it physically gets there; it enters the envelope at your house and emerges at your friend's house. 
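The envelope analogy can be shown in code. This is a deliberately simplified sketch of Layer 2 encapsulation: the "letter" (a pretend Layer 3 payload) is wrapped in an Ethernet-style header of destination MAC, source MAC and EtherType, and the receiver strips it off again. A real frame also carries a frame check sequence, which is omitted here:

```python
import struct

ETHERTYPE_IPV4 = 0x0800  # EtherType value that marks an IPv4 payload

def mac_bytes(mac):
    """Convert 'aa:bb:cc:dd:ee:ff' into its 6 raw bytes."""
    return bytes(int(part, 16) for part in mac.split(":"))

def encapsulate(dst_mac, src_mac, payload):
    """Stuff the Layer 3 payload into a Layer 2 'envelope'."""
    header = mac_bytes(dst_mac) + mac_bytes(src_mac) + struct.pack("!H", ETHERTYPE_IPV4)
    return header + payload

def decapsulate(frame):
    """Open the envelope: return (ethertype, payload)."""
    ethertype = struct.unpack("!H", frame[12:14])[0]
    return ethertype, frame[14:]

frame = encapsulate("01:23:45:67:89:ab", "01:23:45:81:22:c4", b"pretend IP packet")
ethertype, payload = decapsulate(frame)
print(hex(ethertype), payload)   # 0x800 b'pretend IP packet'
```

The payload goes in at one end and comes out unchanged at the other, exactly like the note in the envelope; nothing in the payload needs to know about MAC addresses at all.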

Back to the "real switch matters"

A real managed switch brings some intelligence to the landscape and is able to not only route packets more efficiently across the network but also monitor the network for issues that could cause problems. Most good switches these days are able to do Layer 1 hardware diagnostics of the cables attached, as well as monitoring network events such as collisions, errors, storms and floods, and having this oversight can be invaluable when dumber hardware is connected to your network.

The reason for this article was a long-term hardware issue with a broadband router that would intermittently lose its connection for no apparent reason. Engineers would be on site, monitor the broadband circuit and couldn't see anything wrong with it. Extensive broadband diagnostics showed a line in perfect health, and when we left network test gear connected to it over a weekend we could see absolutely no issues. Reconnect the router and within a few hours it would be out of service again. 

Ha! it's the router! Well we'd replaced that already, twice in fact and no change, so what the fluffy fruit is going on here? 

After a lot of what-ifs and a smattering of customer frustration, we sent in the Level 3 guys with their laptops and the issue was finally identified. "Jefferson's Jellies!" exclaimed the Level 3 tech, "It's the network!". More specifically, a network cable that had been crushed behind a rack. This crushed cable was causing intermittent cross-connection between several pairs of the CAT5e cable, and this was causing garbage to be transmitted. A smart network switch would have noticed this and taken action to resolve it, but the customer only had a dumb switch, and the router (a Draytek 2862) again had no intelligent network interface. The garbage on this crushed cable created what is technically known as a packet storm, which quickly saturated the network and, more interestingly, caused the Draytek 2862 to drop its PPP connection. 

The moral here is a simple one

Why a packet storm on the network port of a router would cause the PPP connection to drop is something I can only speculate about, and this behaviour sent us looking in totally the wrong place. Had the customer spent a little more on a good switch then (a) it would have dealt with the packet storm, and (b) we could easily have asked it where the problem was. 

In all fairness to the technical teams, we rarely provide a broadband-only service; we would normally manage the network and, if not already installed, we'd install managed switches. But this was a tiny remote office for a small business customer, with a network installed by Pete the Plumber, and all they wanted was a good *reliable* broadband for SIP. 

Get your network installed by a professional and ensure you receive CERTIFIED test results. They will look something like the report on the right, with a page for every port on the network. Any good network professional will provide these, but Plumber Pete can't and won't. If you are moving into a new building that already has structured cabling, or you suspect your cabling was installed by Plumber Pete or his mate, then having it certified is a simple and cheap process. As for a good managed switch, I'm not going to start recommending brands because, to be honest, most brands are ok for most networks, and it's not a case of the more expensive the better, although some will tell you differently. 




Protecting Your Synology NAS from Internet Threats


Today we were informed by Synology that large-scale brute force attacks are targeting Synology NAS devices accessible from the internet. Whilst these are fairly easy to thwart, the default configuration (depending on the version of DSM when you first installed your NAS) does not. These changes will protect your NAS from the majority of internet threats but not all, and we'll deal with those later. First, let's look at the initial steps we need to take. 

The admin user

When you first install your Synology NAS the administrative user is called admin. This is bad because an attacker doesn't even need to guess it; it's always admin. But we're not stuck with it: we can resolve this by creating a new administrative user and then disabling the admin user. To do this, login to your NAS as admin and then go to Control Panel, and User. 

Be sure to select a GOOD password. That is a password of at least 12 characters and containing at least one upper case, one lower case, one number and one symbol. An example would be S!n0LoG6nAs% but please don't use that one, pick your own. 
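The password rule above (at least 12 characters, with upper case, lower case, a number and a symbol) can be expressed as a quick check. This is only an illustration of the policy; DSM applies its own rules internally:

```python
import string

def is_strong(password):
    """True if the password meets the policy described above:
    12+ characters with upper case, lower case, digit and symbol."""
    return (len(password) >= 12
            and any(c.isupper() for c in password)
            and any(c.islower() for c in password)
            and any(c.isdigit() for c in password)
            and any(c in string.punctuation for c in password))

print(is_strong("S!n0LoG6nAs%"))   # True - meets all four rules
print(is_strong("password1234"))   # False - no upper case or symbol
```

Note that length does most of the work here; the character-class rules mainly stop users padding out a dictionary word with repeats.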

Next you need to select the group and in this case you MUST pick administrators or there will be real problems later on. As long as you've selected Administrators as the group then you can safely NEXT through the following screens since the administrative group has access to all things. 

Now that's done, logout of your NAS and then login using your newly created user. 

Assuming that works, go back into the Control Panel, select User, select admin and then Edit. Now place a tick in the "Disable this account" box, and make sure "Immediately" is selected below it. This will disable the admin account so that it can no longer be used to login. Press OK. From this point onwards you will not be able to login to your NAS using 'admin', but you should continue to use the new account you're currently logged in as. *IF* there is more than one administrator, create a second login and make it a member of Administrators rather than sharing logins, which is bad practice on many levels. 

*IF* you have other users logging onto your NAS, you may want to take a look at the Advanced tab under User and set some password strength rules along the lines of the above. Whilst a non-administrative user has greatly reduced access, a compromised account can still do long-term damage to your business, so ensuring passwords expire periodically, and making sure they are strong, is best practice. 

2-Step Verification is a good idea, but unfortunately Synology do not support common two-factor authentication tools like YubiKey. So for now, unless you want to have a mobile phone with you all the time, leave this turned off. 

Account Protection

Your Synology NAS has some powerful features to protect accounts from compromise, but you need to turn them on in some cases. You will find these in the Control Panel, Security and Account Tab. 

Enable Auto Block - This is probably the most important feature here and will block or ban IP addresses that fail password authentication a set number of times. The best practice setting here is Login Attempts = 4, within 60 minutes, with "Enable block expiration" disabled. 

This does mean that from time to time a real user will lock themselves out, but you can remove that block from the Allow/Block list button just beneath the settings. 
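The auto-block behaviour amounts to a sliding-window counter per source IP. This sketch models it (4 failures within 60 minutes triggers a permanent block, matching the settings above); the real enforcement of course happens inside DSM, not in your own code:

```python
import time

class AutoBlock:
    """Model of auto block: ban an IP after max_attempts failed
    logins within a window of `window` seconds (60 minutes here)."""

    def __init__(self, max_attempts=4, window=3600):
        self.max_attempts = max_attempts
        self.window = window
        self.failures = {}   # ip -> timestamps of recent failures
        self.blocked = set()

    def record_failure(self, ip, now=None):
        now = time.time() if now is None else now
        # keep only failures inside the sliding window, then add this one
        recent = [t for t in self.failures.get(ip, []) if now - t < self.window]
        recent.append(now)
        self.failures[ip] = recent
        if len(recent) >= self.max_attempts:
            self.blocked.add(ip)   # no expiration: stays blocked

    def is_blocked(self, ip):
        return ip in self.blocked

ab = AutoBlock()
for t in (0, 10, 20, 30):          # four quick failures from one address
    ab.record_failure("203.0.113.9", now=t)
print(ab.is_blocked("203.0.113.9"))   # True
```

Because old failures age out of the window, an occasional typo over a day never trips the block, while a scripted attack does almost immediately.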

Untrusted and Trusted Clients

One really powerful feature of your Synology box is the ability to differentiate between clients and set different limits for each. In our example (left) we're giving trusted clients 10 login attempts within 5 minutes, but untrusted clients only get 8 attempts in 999 minutes. Unfortunately Synology won't allow you to set 1440 (i.e. 24 hours), since 999 is the maximum number of minutes you can select. Feel free to change these as needed, and if your genuine users screw up then you can manage their restrictions using the "Manage Protected Accounts" and "Manage Trusted Clients" buttons by each section. 


The Synology NAS firewall can be unnecessarily complex to set up, but it's also very powerful when used correctly. Open the firewall configuration from the Firewall tab in Control Panel / Security, and select the currently active firewall profile. In the next dialogue you can configure a number of rules (and, in the case of larger NAS units, the interface upon which they apply). The Synology Help does a good job of explaining the firewall rules but we'll give a brief overview here. 

Firstly, you need to understand how the firewall works and what it actually does. The firewall inspects each incoming request to the NAS and looks at the SOURCE ADDRESS, DESTINATION ADDRESS, PORT and PROTOCOL for each. The firewall then compares that against its allow rules to determine if the request should be honoured or rejected. An example would be to allow access to DSM on port 5001 via TCP for your LAN only. In this case, assuming your LAN is then we would set up an allow rule for 

Ports 5001, Protocol TCP, Source 

By default if no rules are matched for the source, destination, port and protocol the packet is rejected, so please do not change this unless you know exactly what you're doing. 

Synology makes it easier to add rules by allowing you to select applications instead of ports, but behind the scenes it just converts applications to ports and protocols for you. The source address can be a single IP, a subnet, multiple subnets, or a location. Be careful with location, because whilst it's pretty good it's not foolproof, and we have had experience of users allowing GB (Great Britain) but clients being rejected even though they are in the UK. Remember you can add multiple rules here, but every rule you add is another risk, so don't expose applications (ports/protocols) to the internet unless absolutely necessary. 
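The first-match, default-deny evaluation described above can be sketched as follows. The subnet, ports and protocol here are illustrative values for a LAN-only DSM rule, not anything Synology-specific:

```python
import ipaddress

def allowed(rules, src_ip, port, proto):
    """First matching allow rule wins; anything unmatched is
    rejected (default deny), as on the NAS firewall."""
    src = ipaddress.ip_address(src_ip)
    for rule in rules:
        if (src in ipaddress.ip_network(rule["source"])
                and port in rule["ports"]
                and proto == rule["proto"]):
            return True
    return False   # no rule matched: reject

# Example rule set: DSM (5000/5001) over TCP, LAN sources only
rules = [{"source": "192.168.1.0/24", "ports": {5000, 5001}, "proto": "tcp"}]

print(allowed(rules, "192.168.1.42", 5001, "tcp"))   # True: LAN client reaches DSM
print(allowed(rules, "198.51.100.7", 5001, "tcp"))   # False: internet source rejected
```

Note how the default-deny lives in the final `return False`: the safety of the whole configuration depends on that fall-through, which is why the NAS warns you against changing the default action.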

The Security Advisor


Synology have empowered your NAS with a tool which can analyse your security configuration and make suggestions on how to improve it. Not all suggestions must be acted upon but having the analysis is really handy. Open up the Security Advisor and then select RUN. 

After a few minutes you should see an overview of recommendations. Hitting Results on the left gives you detailed suggestions and some guidance. 

Don't panic: there may be lots of red triangles, but these are just warnings. For example, we have unencrypted FTP enabled because some dumb devices still need it, and that gives us a red triangle, but we know it's ok and we can ignore it. Do the same for each: understand the warning or suggestion and take action. 

Evolving Environment

Unfortunately, flaws and exploits are being discovered daily, and whilst Synology is very good at releasing fixes and patches for discovered vulnerabilities, you should never rely on them being infallible. Instead, ensure you have a good, tested and working backup strategy for your NAS so that in the event your NAS is compromised, or data is lost or damaged, you can swiftly recover with minimal loss. 

At this point it's worth mentioning that GEN not only provide Synology technical support, but we also host Synology RackStations in our datacentres, and unless you want to DIY it, we'll take care of all these things for you, including disaster recovery. 


Breaking the Digital Chain

I think everyone knows that we specialise in high-encryption data links, anything from single-layer IPSec right through to triple-layer tunnelling and disparate routing, but for some customers that's not enough. 

The internet is a collection of many separate networks connected together in a way that traffic can move freely from point A to point B via the most efficient route (most of the time). This routing is automatic and can only be influenced slightly from the A or B end. Interception of data travelling across the internet in its encrypted form is easy, but it's virtually impossible to decode. You can, however, determine the points of origination and destination, so to snoop on encrypted tunnels you would need to compromise one of these. We make this far harder by using relay nodes in disparate jurisdictions, so that the packet journey, instead of being A -> B, is now A -> Node1 -> Node2 -> Node3 -> Node4 -> B. This means any sampling of traffic between Node1 and Node2, for example, would reveal ONLY Node1 and Node2, but not A or B. In order to locate the traffic source or destination, ALL the relay nodes would need to be compromised, upstream or downstream. That is, from a sample between Node1 and Node2 you would need to physically compromise Node1 to find A, Node2 to find Node3, Node3 to find Node4, and Node4 to find B. In reality we don't use 4 relay nodes but a minimum of 10, usually more. The nodes themselves are a mixture of cloud hosting (free and paid), virtual and physical servers, Tor network nodes and public proxies. Some of the 'free' services are less reliable, but a little smart routing solves this problem. 

You can see, however, that there IS a way, given enough resources and physical access to the locations holding nodes along the path, to discover the A and B ends and then compromise them. We needed a way to break this digital chain for one customer, so that there was no internet 'connection' between A and B. This method of breaking the chain is known as out-of-band routing and often utilises either private networks or non-data networks. A good example of out-of-band routing was to use modems and have part of the route travelling over analogue phone lines, but in recent years this has become less secure because, as technology evolves, it's easier to discover and monitor calls between countries and determine start and end physical addresses. We can make this harder by placing the originating or receiving modem in countries with antiquated telephone systems, but then you run into data issues and a reduction in the already low bandwidth. 

A new customer provided the impetus for finding a new out-of-band transport to break the chain, and the solution was ingenious, even if I do say so myself. Whilst the job has now been completed and the route is no longer in use, I will not be sharing the location, country or any media from the actual connection, because that would be idiotic. The image above is a library shot of a Marvel digital display board, not the manufacturer used, and is there for illustration only. What I will do is describe the adopted solution to the problem.

After much discussion and consideration of point-to-point laser, microwave, hijacking local radio and piggybacking satellite TV, we were in the unusual position of being at a loss. We knew what we wanted to do, just not how to achieve it, and this continued for about a week. One idle Tuesday during lunch, one of our techs was browsing live CCTV feeds on YouTube and, like a clap of lightning, the idea was born. We would utilise a live CCTV stream and some form of transmission to act as the out-of-band route, but how would we transmit the data? Luckily the data rate required was very, very low, bytes per minute instead of the usual Mb/s, and unidirectional (one way), so it could be something very analogue, but it took another full week to come up with the transmission medium. 

Whilst browsing hundreds, if not thousands, of live CCTV feeds we noticed in one an unremarkable street with an equally unremarkable bus stop, with what looked like a digital sign on the end. We spent some considerable effort trying to identify the company name at the bottom of the sign by enhancing the imagery and applying various filters, only to later notice that one of the ads loosely translated as 'advertise here', call this number. 

Calling the number, finding a translator, calling the number again, this time with the translator, and after some negotiation we purchased ad space on this specific sign for a period of time. And no, we didn't use a credit card. After receiving the credentials we were able to access the sign's http interface (yes, http) and upload some sample ads, which were in fact just images played in a loop with various transitions. Some PHP code later and we had the auto-upload working. 

Now we needed to figure out a way to transmit data from the sign and pick it up again in the live CCTV feed. After many attempts we settled on an ad which consisted of a number of white squares on a black background, with some unimportant company branding which we totally made up. Streaming the CCTV image was simple; converting it from frames to separate images and removing all the duplicates, leaving only the changes, was a fairly simple if laborious job. Parsing the images to determine the actual data took a little more time, but again wasn't hard to do. From a technical point of view, we broke the image into separate areas, calculated the average intensity of each area and then converted that into a simple 0 or 1. I know some of you may be thinking that we could have gone with barcodes, QR codes and others with much higher data density, but this specific project required very low data rates.
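The decoding step described above can be sketched in a few lines. This is a simplified illustration of the idea, not the production code: the grid size, the threshold and the tiny 4x4 "frame" are all made up for the example:

```python
def decode_bits(frame, rows, cols, threshold=128):
    """Split a greyscale frame (2-D list of 0-255 pixel values) into a
    rows x cols grid, average each cell's intensity and threshold it
    into a bit: bright square -> 1, dark square -> 0."""
    h, w = len(frame), len(frame[0])
    cell_h, cell_w = h // rows, w // cols
    bits = []
    for r in range(rows):
        for c in range(cols):
            cell = [frame[y][x]
                    for y in range(r * cell_h, (r + 1) * cell_h)
                    for x in range(c * cell_w, (c + 1) * cell_w)]
            avg = sum(cell) / len(cell)
            bits.append(1 if avg >= threshold else 0)
    return bits

# A toy 4x4 "capture": four 2x2 cells - white, black, black, white
frame = [[255, 255,   0,   0],
         [255, 255,   0,   0],
         [  0,   0, 255, 255],
         [  0,   0, 255, 255]]
print(decode_bits(frame, 2, 2))   # [1, 0, 0, 1]
```

Averaging over a whole cell rather than sampling single pixels is what makes the scheme tolerant of compression artefacts and the slightly soft focus of a streamed CCTV image.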

We tested the solution over several days, sending images to the display panel in batches of one, recording them from the live stream, decoding them and comparing the data. A few changes were made to the layout, and intensity was lowered during the dark hours to improve capture clarity. We did occasionally have people standing in the way, but we wrote code to detect this by including a persistent layout of squares that we verified before decoding the data and sending it on. 

Next we needed to set up the rest of the route: 14 nodes between the A end and the remote server that took the bit data, converted it into an image and uploaded it to the sign, and another 10 nodes from the remote server that captured the live stream, decoded it and forwarded the data on to the B end. 

As we had no access to the A or B end we were unable to test further ourselves, but our customer tested the connection and was satisfied with both the performance and the security strategy. 

It's important to state that we do not know the purpose of the data transmitted, or indeed even the customer, as we were dealing with an agent (sometimes called a proxy), but we do know the customer was happy with the solution. Remember that this is an extreme case with uncommon prerequisites; for most, a string of relay nodes and two-layer encryption is more than sufficient to prevent unauthorised interception. If you can think of a better way to achieve out-of-band communications then please leave us a comment, and if you found this interesting then please rate it. 


Amazon Prime & eBay for Business?


Both eBay and Amazon have been major players in the online retail market for many years, but how are they to use for business? We've been using eBay for business over the last decade, mostly to obtain hard-to-find replacement parts for laptops, desktops and servers that we still have on maintenance long after support has been withdrawn by the manufacturer, but Amazon is a new voyage of discovery. Amazon offers a service known as Prime, which for a fixed monthly charge (currently £7.99) gives free next-day delivery on a large range of products, which in itself can be an attractive proposition for businesses who spend a fair amount on carriage annually. In fact, our carriage bill for the last 30 days is £134.98, so if we could source everything from Amazon we would save £126.99 over the month. We cannot of course purchase everything from Amazon, but we're looking to see if the Prime deal is workable and viable. 

In order to offer a fair comparison we will select a few business consumables and then order them from Amazon, eBay and an online store and review the whole journey. 

Shopping Time

Item                                        Amazon Prime        eBay                Online
Duracell CR2032 lithium battery, 2-pack     £4.99 (2 days)      £1.69 (next day)    £2.89 Currys (next day)
Epson WF-3620 extra-large black cartridge   £33.32 (next day)   £30.00 (2 days)     £37.59 Cartridge Discount (3 days)
HP 24f display                              £99.00 (2 days)     £109.00 (5 days)    £99.00 AO (next day)
HP premium photo paper A4 x20               £13.98 (next day)   £13.96 (2 days)     £19.76 Photopaperdirect (2 days)
Hulker 3-way power strip                    £15.99 (next day)   £23.00 (3 days)     £18.99 BlockCube (7+ days)
TOTAL                                       £167.28             £177.65             £178.23
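The totals in the comparison table above can be recomputed as a quick sanity check; the prices are the ones quoted in the table.

```python
# Prices in pounds, taken from the comparison table above.
prices = {
    "CR2032 battery 2-pack":      {"amazon": 4.99,  "ebay": 1.69,   "online": 2.89},
    "Epson WF-3620 XL black":     {"amazon": 33.32, "ebay": 30.00,  "online": 37.59},
    "HP 24f display":             {"amazon": 99.00, "ebay": 109.00, "online": 99.00},
    "HP premium photo paper x20": {"amazon": 13.98, "ebay": 13.96,  "online": 19.76},
    "Hulker 3-way power strip":   {"amazon": 15.99, "ebay": 23.00,  "online": 18.99},
}

def total(channel):
    """Sum one channel's column, rounded to pence."""
    return round(sum(item[channel] for item in prices.values()), 2)
```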


Amazon Search me not

The first thing you notice about Amazon search is that even if you select PRIME ONLY, and even if you sort by LOWEST PRICE FIRST (both of which you have to do every time you search, because they reset to non-Prime and "Featured"), you are presented with a bunch of items that are not Prime and are not what you searched for. In our first search for the Duracell CR2032, with Prime and lowest-price-first set, we first get AA cells, chargers, and a "Homidy Digital Hygrometer Indoor Thermometer, Xiaomi Mijia Rare 360°HD E-ink Display Room Humidity Monitor Swiss Sensirion Industrial Grade High Accuracy Temperature Humidity Meter", which is so far out of scope it makes no sense at all. HP Premium Photo Paper A4, again with Prime and lowest price set, gives you HP Office A4 80gsm, Kodak Premium Glossy Paper and HP Everyday Glossy Paper. See the issue here? Not what we searched for. This makes purchasing from Amazon awkward and time-intensive. Notwithstanding the unreliable search, Amazon used to be country-specific, that is, amazon.co.uk was for UK buyers and sold UK goods, but those days seem to have passed and now many of the items in search results are from overseas. It's impossible to know which, because the fact is nowhere to be seen and there is no way to filter 'UK only'. Ordering from a non-Prime, non-Amazon supplier is virtually impossible to avoid, and of course you'll soon learn that the Prime delivery and return benefits only apply to goods sold by Amazon. Ordering from overseas unintentionally is again almost impossible to avoid, and I guarantee you will at some point be waiting four weeks for something you expected next day, only to find it arrives with a customs charge attached. 


Unlike Amazon, eBay's search results actually contain the items you searched for, but you MUST select UK Only and sort lowest first for nearly every search or you will unwittingly end up buying something from China. That notwithstanding, eBay is fully loaded with fraudulent items, from the electrically unsafe and non-compliant right through to outright illegal knock-off copies that underperform, all seemingly in the UK; yet when your item arrives it's invariably from overseas and took four weeks to get to you. I don't believe eBay has any intention of dealing with this, since it has been happening for at least a decade and they've done nothing about it so far. Users can report listings as inappropriate, fraudulent or otherwise, but this doesn't seem to have any effect; I've reported obviously fraudulent listings and they never get removed.  


Amazon may be a more sanitised marketplace, but only if you select Prime, and even then be aware that not all Prime items are actually delivered by Amazon with guaranteed next day; even the guaranteed next day rarely arrives next day in our experience. eBay is the wild west of Internet shopping, but as long as you select and reselect UK Only and check the seller's feedback, its functional search greatly speeds up the experience. With Amazon, eBay and other online stores you need to be sharp, because anyone can now throw up a believable storefront and list believable items when in fact the seller is in the Far East and the items are counterfeit or dangerous. You may think that Trading Standards are hot on the heels of these fraudsters, but no: they show no interest or activity in this area, instead focusing their time on bootleg DVDs at car boot sales and other dangerous goods that threaten our way of life. 

Paying: eBay accept most payment methods but clearly prefer PayPal; Amazon only accept credit/debit cards, with no PayPal. This isn't a huge problem for most, and I can understand Amazon's approach given the volume of issues that comes with PayPal, but if you're a PayPal Business user and use it to control and manage your spending then you'll need to think again. 

Price & Performance

As you can see, the price performance on these few items swung marginally in favour of Amazon Prime. Don't forget we've paid a monthly fee for the free one-day delivery, and that one day usually means two or more, but it's still less than eBay and online stores overall. There are, however, other issues to consider when using Amazon and eBay as opposed to online stores and retailers. With both you are isolated from the seller, more so with Amazon, and anyone who's used eBay for a while soon learns that the phone numbers provided in listings are rarely real. Both eBay and Amazon provide a method to communicate with sellers via messages, and eBay has its "Resolution Centre", which is actually quite usable as long as you're prepared to wait 40 days for your refund, but Amazon will only get involved with Prime items; otherwise you're on your own. 

Help me Amazon

Today I'm going to focus on Amazon, simply because this was a new journey into procurement, there was a promise of free next day, and because their communication was certainly lacking. 

Taking our first item, the Duracell CR2032: these were required for part of a presentation system that was being shipped the next day. I ordered them on a Sunday for delivery Monday (as is the promise with Prime), but by late afternoon they were still a no-show. I sent someone down to our local supplier to get them for our shipment, but I then thought I'd contact Amazon and enquire as to where they were. You're first greeted with the Orders page; you find the item, hit Track, and you'll see something like the screen to the left. Pretty uninformative, but it gives some hope that it's going to arrive today, even if it is long after the office has closed. 

Then you go on a veritable treasure hunt to find the contact-us page hidden within the depths of the amazon.co.uk website. Even for this article I've spent another 10 minutes trying to find it again. For anyone else being sent around endless help pages looking for it, it's at https://www.amazon.co.uk/gp/help/customer/contact-us

Now you're here, and ignoring Track Package because you've already done that, you select "Where's my stuff" then "check status of my order", and moments later you're given the options of e-mail, phone or chat (which is recommended). So I enthusiastically clicked Chat and was presented with the screen to the right. Great news: I'm blocked for some reason, and this is the first time I've tried to contact Amazon about anything. Never mind, I'll use the phone option; it's probably just as quick. 

Amazon offer a callback service, which makes total sense and saves you from an endless queue of poor-quality music. I entered my desk phone number and clicked "Call Me Now", only to immediately receive the message to the right. I have no idea why my number would be blocked, as I've never contacted them before, but there is a pattern forming here. 

Given that attempts 1 and 2 have failed, let's see if e-mail is any better. Following the e-mail option, you are quickly reminded that e-mails take 12 hours (normally) to respond to, which is of no use today then. I did call the 0800 number and after some time got through to a very nice person who told me simply that it wasn't going to be today and *should* be tomorrow, a fact of which I was already aware since it was long after 17:30 and I was the last chicken in the henhouse. Is this journey indicative of Amazon, or did I just pick a bad day? I don't know; all I can show is this day. 

What can we take away from this article? 

Amazon is certainly an option worth considering and overall may save some money, at least in the shipping department, but the awkward search and lack of any assistance is a significant downside for business supplies. eBay is great providing you're not in a rush for anything and are happy to live with a proportion of orders arriving from China simply because you forgot to click "UK Only" every time you searched. Online stores, I think, are going to be increasingly rare in the future, with Amazon and eBay (and their proxies) taking up ever-increasing search engine results with their listings. Ignoring cost, independent online stores' only winning features are accurate search results, being contactable, and not inadvertently buying junk from China, none of which I believe are sufficient to draw market share away from the big two. For us, we are committed to purchasing from distribution in quantity, and that won't change, but for one-off items such as cartridges, where we would normally pay £10 in shipping, Prime could be beneficial in the long term. 

As a hardware supplier, the goods we provide are quoted very competitively, with margins between 2% and 3%, but we are occasionally told "I can buy that cheaper on eBay", to which our reply is always: go ahead. This is always going to be a risk/reward decision for any business; if the customer can save £5 and get it from eBay, that's great, but when it breaks, who you gonna call? That's right: no one. 

As always comment if you have something to add, and please take a moment to rate the article. 





© 2019 GEN E&OE


CDN's and the recent trend of Blacklisting Genuine Customers


There has been a recent shift towards using Content Delivery Networks (CDNs) to distribute content rather than hosting it in a conventional way, and this brings with it a mixture of good and bad. One of the regular issues we receive at the HelpDesk is primarily generated by Cloudflare, which offers 'free' content delivery, making it a popular choice for smaller websites. The most common complaint is the "One more step" screen (right), which prevents the customer from visiting the website without completing the infamous Google ReCaptcha. Given the serious privacy concerns surrounding Google ReCaptcha, would it be Cloudflare or the website owner who is responsible for *not* highlighting this to the end user? Regardless, our standard answer in this case (and it's a canned response now) is: "Unless this website is business critical, close the tab and select another website". There is some suggestion that these messages are generated in an attempt to rate-limit or reduce load on either Cloudflare or the vendor's website, but this is unconfirmed. 

So what causes Cloudflare to blacklist business customers from visiting their vendors' websites? Cloudflare will claim that they blacklist IP addresses that exhibit unusual traffic, as well as those on commercial blacklists. That sounds great in theory, but with the vast majority of client IPs being dynamic (including mobile devices), this blacklisting simply prevents customers reaching vendors for no technically good reason. If the blacklisting weren't inherently flawed then we would not see the volume of HelpDesk requests on this very issue, with genuine customers trying to reach genuine vendors, and it's for this reason that we no longer offer Cloudflare as an option on our hosting services. 
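When triaging these HelpDesk tickets it helps to confirm quickly whether a failed page load is actually a Cloudflare block rather than a vendor outage. A rough heuristic, sketched below, is that Cloudflare challenge and denial pages typically return HTTP 403 or 503 with a `Server: cloudflare` header and a `CF-RAY` trace header; treat this as indicative only, since header details can vary.

```python
def looks_like_cloudflare_block(status, headers):
    """Heuristic check: does this HTTP response look like a Cloudflare
    challenge/denial page?  Header names and values are normalised to
    lower case before comparison."""
    h = {k.lower(): str(v).lower() for k, v in headers.items()}
    return (
        status in (403, 503)            # challenge and ban pages use these
        and h.get("server") == "cloudflare"
        and "cf-ray" in h               # Cloudflare's request trace ID
    )
```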

Error 1005
Another example of Cloudflare blacklisting, this time suggesting that the website owner enabled the block, is the message to the left with "Error 1005". In this case we're shown that the network AS8560 is blocked from accessing the site. This HelpDesk ticket was raised by a customer who was in fact in Germany, using a tablet in a coffee shop, and who wanted to check who had been blowing up their mobile. There are of course other sites which I'm sure satisfied their curiosity, but the customer was concerned that the message might indicate a problem, because quite honestly, to the end user it is a little intimidating. 

In the "Access Denied" message to the left we again have a genuine customer who was trying to access their account on a vendor's website, and yet again we're told that it's not going to happen, this time suggesting that the client is somehow responsible for an online attack. They are of course not responsible for anything except trying to access their vendor's site, but again this sort of message just generates HelpDesk requests and takes time and effort to explain to the customer that they've done nothing wrong and that they should consider another vendor in future. In this particular case, "Error 1020" indicates that the website operator has established the block as a firewall rule, which you would think was intentional, but I can't speak for the site or the site owner. 

That's enough of Cloudflare, which is after all a free service for most, and you cannot really complain about a free service if you knew this was happening. The real issue is that the vendors operating these websites are in most cases unaware that customers are being turned away or impeded from visiting. The prevalence of Cloudflare means that once a customer's IP is blacklisted, a good few of the sites in their daily browsing will meet them with the same resistance. You could say 'contact the vendor', but how do you do that when you can't access their website? 

Cloudflare is not alone: there is a growing number of alternative Content Delivery Networks, all bringing their own flavour of issues to the market and preventing customers from visiting vendors, and there can be nothing worse for a growing e-business. I understand that protecting the business from 'attack' sounds like a good idea, but in reality we're not protecting the vendor from anything; what is actually happening is that the content delivery network is protecting itself from excessive load, at the vendor's expense. 

One effective but equally concerning way around this is to use a free proxy server, and the internet is full of them; just search "free proxy server". These servers, whilst for the most part safe, have the ability at the protocol level to intercept your traffic, even traffic over HTTPS, which presents a clear danger. Whilst it's beyond the scope of this article to discuss the technical ramifications of HTTP proxies, our recommendation is: please do not use them. 


The idea of CDNs is great and has a mostly positive effect on content delivery and site speed, but when your CDN starts blocking customers from visiting your site, either by itself or due to (mis)configuration, you need to assess the overall benefit to the business. In other words, what is the likelihood of your website being 'attacked' (and by 'attack' we mean an attack that a CDN can actually block, which is very few) versus the potential lost business due to customer rejection? It's a hard call, and as CDNs become more popular I think it will be increasingly relevant. 

Looking through three months of tickets raised in our Support/Web/Browsing channel and selecting a few from the list, I find: 

  • analog.com (analog devices) access denied - customer was looking for components for project - went elsewhere. 
  • semrush.com : various - customer was trying to access account - gave up trying. 
  • moneysavingexpert.co.uk : one more step - customer was following link from google - filled box still rejected. 
  • fiver.com : one more step - customer was trying to buy services because we are 'too expensive', got to love tickets - customer went to seoclerks.com instead. 
  • yell.com : forbidden - customer was trying to find business phone number - directed them to alternative website. 
  • yelp.co.uk : sorry you are not allowed to access this page - customer was trying to check reviews - went elsewhere. 
  • scottishpower.co.uk : one small step/not a robot - customer trying to contact company - agent found phone number for customer and advised them to compare prices. 
  • rswww.com : permission denied - customer trying to purchase components - customer went to another supplier.
  • royalapplications.com : An error occurred in retrieving update information - this took 4 hours of HelpDesk time to determine that the update URL "royaltsx-v4.royalapplications.com" is a Cloudflare URL and was being blocked. 
  • rigol.com : one more step - customer was trying to compare equipment specifications - customer attempted to complete captcha but was then told they were blocked. 
  • talktalk.co.uk : Request Rejected - customer was trying to report a fault on their service - customer was persuaded to source bb elsewhere. 

There are many more, and a lot of tickets don't actually specify the website, but you get the idea: from our small subset of customers, 46 of them gave up and were advised to go elsewhere. There's no way to tell how many customers were able to access these sites successfully versus how many were presented with stupid rejection messages, so our sample set is the only indicative data we have, but it is telling nonetheless. 







A VPN is Unlikely to Protect You


It seems that the Internet and social media (especially YouTube) are full of advertising for VPNs so you can somehow access the internet covertly, but what they don't tell you is that for most people a VPN does absolutely nothing except empty your wallet. VPN stands for Virtual Private Network, and VPNs have an important role when you want information encrypted between two endpoints. GEN uses a highly secure VPN (our SAS service) built on Juniper Pulse Secure, which enables our customers to connect to our Intranet and from there access their companies' private networks. GEN SAS provides three important roles: (a) it authenticates the end user, (b) it encrypts all traffic from that end user to the Intranet, and (c) it provides privilege enforcement so that some users can only access some of their company's resources. End-user VPNs such as HMA, NordVPN, SuperVPN, UltraVPN, SafeVPN, CyberGhost, ExpressVPN, IPVanish, SaferVPN, PrivateVPN, Hotspot Shield, StrongVPN and many more advertise unlimited bandwidth, zero logging and a plethora of misleading technical claims to entice the uninformed into parting with their hard-earned cash for the promise of anonymity. 

Will a VPN protect me?

That's very simple: as long as you don't use it on the same device you regularly use for internet access, then possibly, but it's unlikely. To understand why, let's first look at what the VPN is actually doing for you. 

How a VPN works

When you access the internet, traffic from your devices (PCs, tablets, etc.) goes to your router, which has the job of forwarding your requests to the internet, receiving data back and relaying it to your devices. Your router will appear on the Internet as one IP address (usually), and this IP address will either be fixed (static) or will change from time to time (dynamic). Your ISP knows which IP address you are using at any point in time because your router authenticates with the ISP when it first connects; from the ISP's point of view, your router is assigned an address from its pool (either the same one every time, static, or a random one, dynamic). Because your ISP knows which IP address you are using at any one time, and because most ISPs use traffic shaping, they can prioritise or delay traffic of certain types, as well as maintain logs of what you access and when. As a business ISP, we don't prioritise or delay anything, but for the purposes of this article we're going to assume the majority of our audience are domestic users. 

A VPN establishes a software 'tunnel' between your device and a server on the internet managed by your chosen VPN provider. All traffic destined for the internet is instead sent through this tunnel, and the IP address originating your traffic becomes the IP address of the VPN provider's server. Likewise, traffic for you is routed back through the same tunnel to your device. A software VPN provides optional encryption of varying strength, and different providers use different methods and strengths. 

I want to draw your attention to the image on the right, which came from a site advertising VPN services for a price. I used this image for three reasons. Firstly, it's a good image and, whilst mildly entertaining, does show how a VPN works. Secondly, the site is generated almost entirely from JavaScript which then builds the HTML page from resources; this isn't completely unusual considering it's a WordPress site, but I found the method they used to obscure images interesting, though easily overcome. Finally, the image highlights, to me at least, two glaring weak points in this setup: YOU and the VPN server. Compromising either gives the game away, and it's not impossible to do. 

Using a Browser via the VPN

When you visit a page such as google.com, your browser is kind enough to share with google.com the contents of any cookies stored in your browser; these cookies are created and updated every time you visit a particular website. Google, for example, uses six different classifications of cookies, many with multiple cookies each, and spreads them over 17 Google domains listed in their privacy policy. These cookies identify you explicitly: every time you log in to any Google service such as YouTube, Google or Gmail, your identification is stored in cookies. Using a VPN has zero effect on Google tracking you via its cookies, so even though your IP has changed, and may even be in a different country, Google knows who you are. This is not limited to Google; pretty much every website you visit will keep some sort of tracking data in cookies. You can of course clear these cookies manually, but the first time you log in to a Google service, Facebook, Twitter, Instagram, Pinterest and so on, the game is over and you're identified. 

Some browsers are also leakier than others. Many browsers today have plug-ins or built-in features that send every website you visit to the browser developer or a third party (such as your antivirus provider) to 'check' for phishing or fraudulent sites, and with that data goes personally identifying information. If you're using your VPN, the same data travels over the VPN, thereby identifying your new IP address. Turning all this off is not a simple process, but it's doable in most browsers. Additionally, browsers and operating systems exhibit a range of security vulnerabilities that are regularly exploited by carefully crafted JavaScript, a plug-in or extension, or a downloadable application, which can access not only cookies but identifying data such as serial numbers and licence numbers, and with very little effort your real IP address. The technical strategy to achieve this is way beyond the scope of this article, but trust me, it can be done and it's not that hard to do. 

Using an Application via a VPN

So you've decided that you're never going to use a browser over your VPN, and that's a great start, but you should know that on Windows your operating system is communicating with Microsoft almost constantly, your antivirus product is communicating back to base constantly, and even your keyboard driver could well be calling home to check its version. Your identity is being given away on an almost constant basis to a wide and varied range of companies. Stopping this is pretty much impossible on Windows and macOS, but it is doable on Linux with some effort. 

Using email via a VPN

Using email requires two things to happen. Firstly, your device needs to connect to the mail server which stores your email; for our customers that server is probably mail.genzone.net. This server records the fact that you have logged on to your mailbox, along with your current VPN IP. For GEN this information is kept for only 36 hours, after which it is purged, but the majority of other email providers such as Microsoft (Office 365, Hotmail, etc.), Google (Gmail, G Suite, etc.) and many more will keep this information considerably longer, and of course they will share it internally to connect your IP to your identity. 

DNS Leakage

DNS is the Domain Name System and is used to convert a domain name, like www.gen.net.uk, into an IP address. When using a VPN, DNS queries SHOULD be intercepted and handled over the tunnel by the remote server, but this is often not the case, leaving DNS queries to be sent to your ISP. This allows your ISP to see every website you visit, though not the actual content, which goes over the VPN tunnel. 
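One way to reason about a DNS leak is to compare the resolvers the operating system is actually configured to use against the DNS server(s) the VPN tunnel is supposed to provide. The sketch below assumes a Linux-style resolv.conf and an illustrative tunnel DNS address of 10.8.0.1; any configured resolver outside the tunnel's set will receive your queries in the clear.

```python
def nameservers(resolv_conf_text):
    """Extract the configured DNS servers from resolv.conf-style text."""
    servers = []
    for line in resolv_conf_text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "nameserver":
            servers.append(parts[1])
    return servers

def leaking(configured, tunnel_dns):
    """Return any configured resolver that is NOT the VPN's own DNS --
    queries to these go to the ISP outside the tunnel (a DNS leak)."""
    return [s for s in configured if s not in tunnel_dns]
```

On a real system you would feed in the contents of /etc/resolv.conf (or the platform equivalent) and the DNS address published by your VPN provider.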

Using a VPN to bypass GeoIP

Some commercial services such as video-on-demand will check the country associated with your IP address and reject those outside their coverage. In most cases this occurs with US networks such as HBO, Syfy and Discovery, and using a VPN that allows you to connect to a server in the USA may temporarily bypass this restriction, assuming, that is, you have a billing address and bank account in the USA to set up the account. Even then, performance is often so poor that watching video-on-demand from the USA over a VPN is problematic even when it works at all, and of course these companies are actively working to blacklist VPN service IPs. 

Google, Facebook, Twitter and pretty much all commercial websites are actively working to add VPN servers to lists of banned IPs. Google, for example, rarely works from a VPN, instead complaining that 'unusual traffic' has been received, and services like video-on-demand are also quick to blacklist VPN servers. The company MaxMind commercialises a maintained list of VPN IPs, noting: "Anonymizers can cause headaches for companies attempting to identify who is visiting their website. The GeoIP2 Anonymous IP database provides insight into your traffic by identifying IP addresses which are used as various forms of anonymizers".
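Mechanically, this kind of blacklisting is just a membership test of the visitor's IP against known anonymizer address ranges. A minimal sketch using Python's standard ipaddress module is below; the ranges shown are reserved documentation networks standing in for a real (commercial) anonymizer list such as MaxMind's.

```python
import ipaddress

# Hypothetical anonymizer ranges -- real lists are commercial products.
# 203.0.113.0/24 and 198.51.100.0/25 are reserved documentation networks.
VPN_RANGES = [ipaddress.ip_network(n)
              for n in ("203.0.113.0/24", "198.51.100.0/25")]

def is_listed(addr):
    """Check a visitor's IP address against the anonymizer ranges."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in VPN_RANGES)
```

A site doing this at scale would precompile the ranges into a radix tree or use the vendor's lookup database rather than scanning a list per request, but the decision being made is exactly this one.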

How can I be covert online?

There are certainly ways to do this, but they require some discipline and structure. Firstly, the Tor Project provides a complete package of browser and anonymising network that's free to use and very secure (I recommend you make a small donation to the project if you use it regularly). You must still absolutely NOT log in to any websites using this service or, once again, you're identified, but you are otherwise reasonably covert. Applications and your email client will not use Tor by default, so they will not give away your ID. (There are situations where you can set up Tor to route all traffic, but this is not the default configuration, requires some work, and is definitely NOT recommended.) 

Using a virtual machine, preferably Linux, can provide you with a 'covert' presence, provided you ONLY access the VPN via this virtual machine, and again providing you DO NOT log in to any websites or use any applications on the virtual machine that are shared with your local machine.

Breaking the VPN

A VPN by default is point-to-point, which means that you have a tunnel from your device to a remote server managed by a company. This presents an inherent weakness in your protection, because by compromising the server you're connected to, both your identity and your traffic can be exposed. VPN providers will tell you that there's zero logging, but that's rarely true: if there were no logging, how could they validate your credentials and respond to support requests? Even without logging, many of these providers are buying transit from an ISP who certainly does log, and probably captures, traffic. Should an agency need to identify a user, they would only have to compromise one physical endpoint server to do so, and we know this has happened in the past. 

In Summary

Using a VPN service like the many listed above will give you some limited protection, provided you are using a virtual machine and NEVER use credentials to connect to any website unless those credentials were created specifically on that virtual machine and never used elsewhere. It's hard work, and I'm not sure anyone going about their lawful business would want to put this much effort into being covert online. Servers operated by VPN providers are blacklisted constantly, so never pay for your VPN service more than a month in advance or you could find it no longer works for the purpose you intended. 

Anyone serious about operating covertly online should consider (a) multiple VPNs traversing several jurisdictions and (b) using burn-boxes to perform online activity. Both solutions, again providing you NEVER use the same credentials, browser, email or applications in both your local and VPN/burn-box environments, can give you covert protection, but I must point out that it only takes one slip-up and you will be exposed and identifiable. 




© 2019 GEN, E&OE


How to annoy your visitors with Google ReCaptcha


For many years now there has been a steady proliferation of Google ReCaptcha, a free service provided by Google which is used to verify that a human is actually filling out your form. It was annoying when it first arrived on the internet, but the latest rendition takes annoyance to a whole new level with poor-quality images, multiple pages to select and more. So why do so many websites choose to irritate their visitors with Google ReCaptcha?

Well, firstly it's free and readily integrates with most hosting platforms. Secondly, it's thought to be effective. And finally, for whatever reason, people think it's a good idea. In reality that's not at all the case: it is free, but there are serious privacy concerns; it's not effective, as it can be bypassed easily with a browser plug-in or broker service; and finally, I don't think there's a complete understanding of just how annoying it is, especially for those on small screens or those with imperfect vision or hearing. But first let's talk about privacy, as that's a hot topic these days. 

Privacy Concerns

If you click Privacy or Terms from the Google ReCaptcha box then you're taken to the generic Google Privacy or Terms pages, which make no reference to ReCaptcha or what it will collect. This odd behaviour could only be by design. If you dig deeper into the Privacy Policy for ReCaptcha, which is nearly impossible to find, you discover the following. 

  • reCAPTCHA is a free service from Google that helps protect your website and app from spam and abuse by keeping automated software out of your website.
  • It does this by collecting personal information about users to determine whether they’re humans and not spam bots. reCAPTCHA checks to see if the computer or mobile device has a Google cookie placed on it. A reCAPTCHA-specific cookie gets placed on the user’s browser and a complete snapshot of the user’s browser window is captured.
  • Browser and user information collected includes: all cookies placed by Google in the last 6 months, CSS information, the language/date, installed plug-ins and all Javascript objects.

Blimey, who knew? After reading that do you still believe Google Re-Captcha is a good idea for your website? 

  • The Google reCAPTCHA Terms of Service doesn't explicitly require a Privacy Policy. However, it has the requirement that if you use reCAPTCHA you will “provide any necessary notices or consents for the collection and sharing of this data with Google”.

But this is often, if not always, overlooked by website owners; in fact I cannot think of a single website using ReCaptcha that actually notifies you, prior to its use, that you're going to be sharing a bunch of data with Google just by clicking "I'm not a Robot". Let's review and expand on the Privacy Policy and what is collected...

  • A complete snapshot of the user's browser window, captured pixel by pixel
  • All cookies placed by Google over the last 6 months are captured and stored, and an additional cookie is stored. 
  • How many mouse clicks or touches you've made
  • The CSS information for the page, including but not limited to your stylesheets and third-party stylesheets. 
  • The date, time, language, browser you're using and, of course, your IP address. 
  • Any plug-ins you have installed in the browser (for some browsers)
  • ALL Javascript, including your own custom code and that of third parties. 

So at this point you, as a website owner, are obligated to disclose to your users that by clicking on the "I'm not a robot" ReCaptcha they AGREE to all the above being shared with Google. This is not only an inconvenience; pretty much no one does it, because in most cases they don't fully understand what data is being shared. This can be a real problem, especially in the EU, where GDPR has caused many websites to display mandatory and equally annoying cookie confirmations, and even restricts access to a large number of really useful sites from within the EU.


In a recent survey conducted by GEN with our business customers we included a question about Google ReCaptcha and asked users to rate how annoying it was from 1 to 10, with 10 being the most annoying: 94% rated it the most annoying. Now, it's a small sample set of a few thousand users, but it does indicate a general appreciation of the inconvenience it presents. Personally, when I see the 'I'm not a Robot' box, unless it's absolutely critical I'll just close the page and move on to something else, and this is a view shared collectively at this office, as it probably is at most. 

For those outside of the USA, a crosswalk is what Americans call a pedestrian crossing; in the pictures it's the white lines across the road, but of course in most of the rest of the world these are black and white or black and yellow. This is a regular misunderstanding, as are palm trees, which are the trees with the leaves at the top and never seen in many countries. 

If you're not a robot, and I am certainly not, it's easy to wind up with the dialogue to the right after getting a couple of images incorrect, after which you're stuck and cannot continue to submit your form without closing the browser, re-opening it and filling the whole thing out again. That is really, really annoying. 


There is a whole myriad of alternatives to Google ReCaptcha, most of which are self-hosted and have none of the privacy issues associated with Google ReCaptcha. The general trend these days is that a captcha isn't required anymore, since form submission mechanisms have evolved to use a hidden captcha, which is in fact a generated seed on the form that is passed and validated server-side on submission. A robot (or bot) will want to POST the form without filling it in, which this hidden captcha easily defeats. Further validation of field types can pretty much eliminate bot POSTing and removes the need for anyone to click traffic lights, fire hydrants, store fronts or any other collection of images whilst providing Google with your personal information. 
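This hidden-captcha idea can be sketched in a few lines of Python. The names below (SECRET_KEY, make_token, validate_submission, and the "website" honeypot field) are illustrative assumptions, not any particular framework's API: the server embeds a signed, time-stamped token in the form, and on submission rejects anything with a missing, forged, expired or suspiciously fast token, or with the hidden honeypot field filled in.

```python
# Sketch of a "hidden captcha": the server embeds a signed, time-stamped
# token (the generated seed) in the form, plus a CSS-hidden honeypot field.
# A human never sees the honeypot; a bot POSTing blindly tends to fill it.
import hmac
import hashlib
import time

SECRET_KEY = b"change-me"  # server-side only, never sent to the client

def make_token(now=None):
    """Value for the hidden form field: '<timestamp>:<HMAC signature>'."""
    ts = str(int(now if now is not None else time.time()))
    sig = hmac.new(SECRET_KEY, ts.encode(), hashlib.sha256).hexdigest()
    return f"{ts}:{sig}"

def validate_submission(form, max_age=3600, now=None):
    """Server-side check on POST: honeypot empty, token genuine and sane."""
    if form.get("website"):   # honeypot field, hidden from humans via CSS
        return False
    try:
        ts, sig = form.get("token", "").split(":")
    except ValueError:
        return False          # token missing or malformed
    expected = hmac.new(SECRET_KEY, ts.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False          # token forged or tampered with
    age = (now if now is not None else time.time()) - int(ts)
    return 2 < age < max_age  # humans need a few seconds to fill a form
```

The token is rendered into a hidden form field alongside the honeypot; on POST the handler simply drops anything for which validate_submission returns False, with no images, clicks or third parties involved.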


  • Google ReCaptcha is not infallible and can be defeated by browser plug-ins or brokers. 
  • Google ReCaptcha has serious privacy issues, especially in Europe. 
  • Google ReCaptcha is annoying to visitors and deters customers. 
  • Google ReCaptcha can present images of such poor quality (to the left) that no one can accurately guess them. 

If you are using Google ReCaptcha on your website then look for alternatives; there are many out there, and many of those will not require the customer to enter anything, working silently in the background. If you have a GEN-hosted website and would like assistance in replacing your Google ReCaptcha then please raise a ticket at the HelpDesk and we'll do our best to assist you. 

In writing this article we rely on sources from Google's website and others. We make every effort to ensure accuracy, but things do change, especially terms and policies, so be sure to check the current status. 


The Food Delivery War (Deliveroo, Just-Eat, UberEats) Comparison and Analysis


There is no doubt in my mind that being able to order food online and have it arrive at your home or business half an hour later is a wonderful thing, but not all services get it right, and some get things badly wrong. We collectively decided in our office that we'd use each service daily for a month and review the performance and shortcomings of each on this blog. This isn't a conventional tech-related article, but we think it's something of interest. We'll go through each service provider in the order we tested them and provide an insight into the strengths and weaknesses of each service; finally we'll summarise the three and give our views. If you find this article useful and interesting then please rate it. 


Now, Deliveroo is probably the best-known provider in the market, and I do like the branding even if it's a little juvenile, but how about the actual service?

Deliveroo has a phone App, and a website, both of which work fine. You are required to supply your email address and phone number (which can be a landline which is great). The entire registration journey was simple to follow and easy to do.

The selection of outlets available via Deliveroo is reasonable (we're in the city centre here) and the general layout and operation of the website is good. 

Placing an order is a simple matter of selecting the restaurant, selecting food by adding it to a basket and then checking out. Some restaurants allow changes to food items, such as adding/removing sauces, toppings and so on, but some don't, and that's more the restaurant's fault than Deliveroo's. 

Once the order is placed, you're taken to a map showing the outlet and your home/office, which updates every few seconds. There can be a significant delay between ordering and the assignment of a rider, the rider arriving at the restaurant and any changes on the map, because delivery agents (riders) can pick and choose which deliveries they will take, meaning restaurants further out can be waiting literally HOURS for someone to transport your food. Regardless, Deliveroo keeps you informed of the process, so you know when someone has taken the job and when the food is actually collected, after which the map will update showing the location of the agent (rider), and this is really helpful in judging arrival time. 

The competence of the delivery agents is extremely variable, with some unable to read street signs and house numbers whilst others quickly arrive at the correct premises. In our test period we found that around 80% of agents found the property quickly and easily (it is very obvious and clearly marked), with the other 20% ranging from wandering around, going to the wrong premises and even just dumping the food and running after marking it delivered. There is no way on Deliveroo to rate the agent (rider) or even the restaurant, so Deliveroo has no way to track performance and penalise those who fail miserably, and this I think is an area that needs urgent improvement. 

In the event that the agent cut and ran, or delivered the wrong order, Deliveroo were quick to respond and simply refunded our order, which was great for us, but I'm not sure if that information is fed back into the network to penalise the rider or outlet for their respective cockup. 



A latecomer to the food delivery business, UberEats seeks to capitalise on its taxi business by using the same resource to deliver food, and why not. The UberEats website rejected our email address as 'invalid', even though of course it isn't, and further demanded a mobile phone number before it would proceed. We used an iPad Pro with a SIM card as the mobile number, and had to register a Gmail account to get past the invalid-email nonsense. Poor design and coding aside, we eventually managed to get registered; a text message was sent to our iPad with a code to verify, and we were up and running. This 'verification code' isn't a one-off: you'll be hassled to enter it time and time again for some unfathomable reason, and this is a real pain. 

The ordering process is very similar to Deliveroo's, with a matrix of restaurants to select, food items to select and then the old basket-add before checkout. One thing you do notice with UberEats is the multiple entries for the same restaurant at different locations. For McDonald's, as an example, we have 6 different listings for 6 different locations, and we have to choose which one we want. That makes no sense. Surely we should have one listing, and UberEats decides which outlet to order from based on distance?

After we've checked out we're presented with a similar screen to Deliveroo's, showing the outlet and delivery agent, and again this map updates periodically. As with Deliveroo, UberEats suffers the same long delays on some deliveries, simply because they don't have enough resource and allow delivery agents to pick and choose what they collect and deliver; but unlike Deliveroo, UberEats doesn't keep you informed of the process, and you're just left watching the map with the expected delivery time shifting further into the future with each update. In one instance we were waiting just over 2 hours for a delivery, with no way to cancel it and no indication as to the holdup. This can be frustrating, especially when your dinner break is an hour between 12:00 and 13:00. 

When considering delivery agents and their competence, UberEats was slightly better than Deliveroo, with approximately 90% of agents finding the location and delivering the food quickly and easily. The remaining 10% just drove into the street and tried to call the mobile number that we'd been forced to use during registration; this, as I said before, is a SIM card in an iPad Pro, so it's not going to ring no matter how many times you call it. Some agents eventually prised themselves out of their cars and came to the gate, whereas others just marked the food as delivered and drove off. UberEats DOES have a system to rate the delivery agent AND the restaurant, and that's awesome, but you don't get to choose who delivers the next order. When ordering you're shown the rating of the delivery agent, but whether it's 50% or 99% is pot luck and you don't get a say in it. The rating is, however, quite accurate, and those with a low rating were indeed the ones who didn't show up or delivered our food elsewhere. One guy actually refused to come through our gate, claiming he had a phobia of gates, but seriously, how can you deliver food to the door when you can't get through a gate?

When there was a cockup, UberEats was nearly impossible to reach, with us eventually having to leave a message via their website, and even then just trying to convey the issue presented many challenges. Of the three, UberEats has by far the worst support should you ever need to contact them. If you lose access to your email address or phone number then you are literally stuck, as UberEats will only contact you on those, and you can't change them without being able to reply to an email FROM the address you're trying to change. Given this level of stupidity from UberEats, you may find yourself burning through a few accounts, as it's easier to just register another account than try to fix the one you have. 

One point to note here: UberEats has absolutely no facility to change the mobile number you used when you signed up. We would have loved to change it to the office landline so we'd be able to receive calls, but we can't, and we're stuck with a number from an iPad Pro. 



Just-Eat has been around for a while now and tends to offer restaurants that are further out of town and not available on the other two, which is nice. Just-Eat, unlike Deliveroo and UberEats, is not limited to city-centre restaurants, and for that we're grateful. 

The sign-up process was painless, and unlike UberEats it accepted our email address and allowed us to enter a landline. The range of restaurants was reasonable, and accessing them was also fine. The ordering process is a little clunkier than the other two, but it's certainly doable once you get used to having to 'Add' a subtraction to an order. The checkout process was fine, but the post-order tracking was less comprehensive: there was delivery tracking once the order left the restaurant for some outlets, and that seemed to work well, but not for all. Each restaurant uses its own people to make the delivery, so Just-Eat is simply the order taker, not the deliverer. 

Delivery times were rarely what was quoted, with an hour being the norm, but Just-Eat does allow you to enter a delivery 'note', into which we could enter "Press door phone and side entrance"; this was a neat feature and meant that some delivery agents actually came direct to us without going to reception first. Just-Eat has a rating system allowing us to rate both the food and the delivery time, but not the delivery agent, and it's not immediately obvious how to get to this screen. 

Just-Eat does allow you to have more than one address, which we found especially useful as we could use the same account for both office and home, whereas the other two needed a separate account for each, which was awkward to use and couldn't be used with their apps.

Just-Eat provide online chat and a number to call when it all goes wrong; they were fairly quick to respond and issued a refund where needed. 

Notwithstanding the delivery times and lack of tracking, we felt Just-eat did ok and we'd certainly use them again. 


Price Variance

In order to correctly study the price differences between the services, we found a restaurant that is on all three and ordered the exact same items on each; here's how they compare...

Service     Food Cost   Delivery Charge   Total
Deliveroo   £21.00      £3.15             £24.65
UberEats    £19.35      £2.50             £21.85
Just-Eat    £21.00      £2.50             £23.50


On a single order you're looking at a saving of £2.80 (or 11%) when selecting UberEats over Deliveroo, and over a year of ordering, assuming you're spending £50 a week on deliveries over 48 weeks, you would save £264. It's worth noting at this point that Deliveroo offers a monthly payment plan at £11.49 which then gives free delivery on all orders (delivered by Deliveroo), and UberEats has for months been suggesting it's going to offer something similar. If you're a regular buyer then this may work out in your favour, but we didn't take this option and it's not included in the table above. If you are considering such an inclusive delivery option then check the small print, because there could be restrictions that make it less economic. 
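The saving arithmetic above can be reproduced in a few lines; the totals come from the table, and the £50-a-week-over-48-weeks assumption is the one stated in the text:

```python
# Reproduces the UberEats-vs-Deliveroo saving arithmetic from the table above.
deliveroo_total = 24.65  # £ for the identical order via Deliveroo
ubereats_total = 21.85   # £ for the identical order via UberEats

saving_per_order = deliveroo_total - ubereats_total   # £2.80 per order
saving_pct = saving_per_order / deliveroo_total       # roughly 11%

annual_spend = 50 * 48                                # £50/week over 48 weeks
annual_saving = annual_spend * round(saving_pct, 2)   # £264 at 11%
```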


Summary and Thoughts

Some studies we've read suggest that 70% of restaurant business will be via delivery, but there's no guarantee, and the services above are going to be the ones leveraging that change. Is it all good news? Well, not for the local pizza, Chinese or Indian takeaways who traditionally dominated the home delivery market with their own drivers, now relegated to the sidelines by the big three; and we're hearing of restaurant owners who are being pressured into paying the big three to deliver their food over and above the delivery fee that we're paying. But for us as consumers it can only be good. 

Whichever you prefer, you may well have to use all three because of the exclusive deals done by each. For example, McDonald's is exclusively UberEats, Burger King is UberEats and KFC is Just-Eat, and this is unlikely to change anytime soon. Independents are often represented on two or more, as this makes most sense. 

From an ecological perspective, Deliveroo is mostly riders on bicycles, whereas UberEats is mostly cars claiming to be bicycles. Just-Eat is almost always vehicles. I would hope that in the future the use of bicycles and electric vehicles would be an order option, or be otherwise highlighted as an initiative. Likewise, all three should do their best to leverage a reduction in plastic packaging and waste, highlighting those restaurants who comply, etc. 

If you have a problem, Deliveroo were quick to address it, Just-Eat were slower but always responded positively, and UberEats simply isn't worth your time, so let it go. 

None of these services allow ordering from more than one restaurant at a time. In a city-centre environment like our office, we often found some people wanted food A and others food B, but we could only order one. This wouldn't seem to be an impossible issue to solve and would give one provider a lead over the others, but there's no sign of it yet. 

We also found the 'delay' before anything was delivered annoying, but understandable. A suggestion here would be to show a realistic delivery time based on capacity, with an option to cancel the order if it's too far in the future. 

We sincerely hope you find this article of use and would appreciate your comments and ratings. 


November 2021 Update

One of the USPs for Deliveroo is their extensive use of bicycles, which from an ecological point of view is good and fits with our green profile. We used Deliveroo Plus extensively for our city-centre offices. However, this also means the food isn't always hot when it arrives; not often, but occasionally. Now, however, for no other reason than to increase profit, Deliveroo have started this "Rider is delivering another order first" nonsense. This means that a 5-minute ride from a city-centre outlet to our office now takes 30 minutes plus, and the food is stone cold when it gets here. We asked a few riders what street they had delivered to before us and found that it's not even on the way, sometimes in the opposite direction. Uber has had this for a while, but (a) it's usually cars, so quicker overall and warmer, and (b) you can opt out by paying a tiny fee. Needless to say, we've cancelled our Deliveroo Plus and will be switching back to Uber for all future orders. 

March 2022 Update

Over the last 6 months UberEats has seen a significant decline in quality and service. Their 'support', which was excellent, has now clearly been outsourced somewhere in Asia, where there is a clear language barrier and the agents seem unable to resolve anything. Not only that, but the majority of deliveries need two or more 'delivery partners' to arrive at the outlet, wait, stop waiting and drive to the other side of the city, then cancel, meaning we're now waiting HOURS for deliveries instead of minutes, even at peak times. 

So, as this is a comparison: Deliveroo was good, then Uber was great, then Deliveroo started delivering cold food, and now Uber just isn't supervising its delivery agents and doesn't care about its customers' needs. Just-Eat, for what it's worth, has been consistent in its service, which isn't the best by far but at least can be relied upon. We've significantly cut down on our use of delivery services and now instead use a local cafe who deliver themselves; this way it's never wrong, never late and never cold. Perhaps the age of delivery is coming to an end, or maybe the companies just aren't interested anymore, but with such a clear drop in service there has to be a root cause. 

May 2022 Update

In only two months UberEats has gone from bad to terrible, whilst Deliveroo has stepped up its game somewhat. UberEats is now charging extra for 'priority', which isn't shown to the rider, nor does it mean anything at all, since most riders these days will collect more than one order for more than one service and you'll always wind up being the second or third delivery regardless. There seems to be no supervision of this, since the additional deliveries can be clearly seen on the tracking; and where support was problematic in March, it's virtually non-existent today. If you have a problem, tough seems to be the current thinking over at Uber. Deliveroo, on balance, has stepped up, and the quality of their riders seems to be improving, maybe through better training or selection, but there is still no way to feed back a bad experience or rate/tip a rider, which is a shame. Just-Eat, for what it's worth, is still hanging in there with a fairly consistent record and comes in a close second to Deliveroo.


The curious case of Traffic Exchanges


Traffic exchanges are not a new phenomenon; they have been around for at least 10 years, if not longer, though they do come and go, each rarely surviving more than a few years. The concept is a simple one: you browse someone else's website and they'll view yours. At this point we break these down into two groups, auto and manual. With a manual traffic exchange, each user selects sites to view, usually for a set time, and earns points for this. Those points are then spent on other users viewing your site(s). With auto, a browser (either a program, plug-in, extension or just Javascript) cycles through websites automatically, usually for fewer points per view. Some of the more advanced exchanges allow specific geo-targeting, referrer spoofing, and even an attempt at search-engine-to-site simulation with varying degrees of reliability. In most cases you can of course pay for points, which are then consumed by users. None of the research and testing done for this article involved paying for anything. 

The aim of all this is threefold. Firstly, website traffic from human visitors can be an opportunity to convert into sales, providing what you're selling is of interest, but with auto-surf there is absolutely zero chance of this.

Secondly, some advertising networks pay per view rather than the more normal per click or conversion; these networks, whilst generally immune to fraud, can be fooled by traffic exchanges, generating negligible income for site owners.

Thirdly, search engine positioning, as well as the ordering of ad banners on networks, is driven by complex algorithms, some of which may (or may not) be influenced by the increased presence of visitors to a site. In reality I can find almost zero data to support this belief, but the persistence of such traffic exchanges would tend to suggest there must be something to it. Many of the sites listed below use different terminology for the actual 'points', such as minutes, tokens or credits, but for the purposes of this article I shall just call them points, as the principle is much the same. 

During the research I've searched, located, signed-up and tested as many traffic exchanges as I could find and will list them here together with my observations which I hope will be helpful.

Traffic Exchange Websites


A bright and clearly produced site from Australia that looks maintained. This site is purely a traffic exchange, and it does it very well. Auto-surf is done with a browser plug-in, and there's one for both Firefox and Chrome. Whilst it doesn't show every visit, it does show the number of visits per day, and with that we can match visitors fairly accurately. This site offers a few more options for a recurring fee of $10, allowing limits to be set as well as geo-targeting, referrer spoofing and several others. For an extra $10 you can buy 12k points. If you're just starting out then this one is fairly easy to get on with and won't cost the earth. 


A fairly recent entry to the market, Followlike.net provides some features not often seen, such as OK.ru, Vimeo, ReverbNation, ask.fm, VK, Mix, Diigo, Pocket, Folkd, Reddit and 9Gag to name a few. I've tested a few of these and found them to work, and it provides accurate tracking of your accumulated points. The auto-surf works, although it views sites in a pop-up window which you then can't easily mute in most browsers; Firefox has a plug-in that auto-mutes all new windows, so that was an easy fix. Interestingly, this site is hosted in the UK on Webfusion. If you're looking to have a shot then this one is definitely worth checking out, with recurring plans starting at approximately $5 for 5k points.


Very clean and modern layout and simple easy to understand operation from Hong Kong. Auto-surf is by an application and this works well in Windows (Linux version also available). You can have up to three websites listed. Geo-targeting and bounce reduction are available at a cost starting at $15 approximately per month which includes 10k points. Certainly worth a try if this is something you want to explore.  


Another good example of how to do it, this time from France; clearly laid out, and with manual approval of sites it's a safe bet that it's regularly maintained. Even on the free account you can have referrer spoofing, user-agent overriding, click simulation, scrolling simulation and geo-targeting, and in our tests it works exactly as advertised. Auto-surfing is again via an application, and this works well. There isn't a recurring fee, but points can be purchased for as little as 2 euros (about $3) for 1.5m points. OtoHits also offers an API for integrating your applications, but I haven't explored this option. 


Nice clean design from the USA. In testing this site works as described and has fairly accurate tracking of visits and for a recurring fee of $29 you can have geo-targeting, referrer spoofing as well as 200k points, 45 websites and more. Auto-surfing is done via an Application (Windows) that you can set and forget. Certainly a serious competitor in this marketplace. 


This exchange has been around for a good number of years and offers far more than just traffic exchange; in fact the traffic exchange is quite poor, but it's still worth listing, especially if you want human Facebook, Twitter, YouTube, Pinterest or SoundCloud likes and follows. The actual tracking of where your points are going is non-existent, but tracking visits to a site that has zero normal visitors does indicate that it is working as advertised. There are paid plans starting at £199 per WEEK, but these apparently give unlimited points, although I'm not sure how that actually works. 


Another long-lived exchange which offers much the same, including blogger posts and some others, but this site has some non-functioning features and almost constant service outages that would suggest it's no longer actively maintained. The auto-surf is also broken and only surfs a few sites before stopping, mostly due to connectivity issues with the site; in our tests we often found that the auto-surf thought we had multiple windows open when we didn't, and whilst we viewed the sites no points were accumulated. If you set no-referrer it still uses linkcollider.com as the referrer in the web requests. Apart from these few issues it's certainly worth a look, with recurring monthly plans starting at £20 for 5k points. 


Manyhit.com, hosted in the USA, is unlike the other players in that it suggests you could actually earn real $ by surfing sites, but in testing this wasn't the case: no matter how many sites I surfed, the 'account' still showed $0. Judging by the "This banner URL is incorrect" everywhere, this site may well no longer be maintained, but it is listed for completeness. 


Whilst this site from the USA promises much, its complexity and the reliability issues with its 'auto-surf' software move it down the list. Because of the issues I wasn't able to successfully test this site and can't say if it's maintained or not. They have a recurring monthly package at $14.95, and I'm sure for that you'd find things work as expected, but for this article I was only testing the free account. 


I've included this one because I really like the site design, the best I've seen so far, and it seems to have some sweet features such as macros. Unfortunately the only way to auto-surf is with their application, and the Windows version requires Flash, which I'm simply not prepared to inflict on my PC. Recurring monthly premiums are only $6 for 100k points, which is very reasonable, but again I couldn't test due to Flash. I suspect this one is worth keeping an eye on. 


Another site that's been around for a while; it only supports manual web surfing. Points can be purchased at a rate of $10 for 3k without any recurring charges. After adding a site and accumulating some points there were trackable visitors, so this one works as advertised. This site also offers completing surveys as a way to accumulate points, but I didn't try any of those. 

The Dangers

So, you've got a PC somewhere running an auto-surf, or a browser on your PC viewing sites in the background, but you don't know what those sites are going to be. They could be malware-infested sites, bitcoin-mining sites, denial-of-service sites, sites that attempt to deposit files on your PC, or even sites with illegal pornography, and it's all traceable back to YOU (see our blog article on Tracking). Whichever way you throw it, you're trusting these exchanges to monitor and vet all the links listed with them, which they simply don't do. In the tests we ran there were a few instances of porn and a few more of malware, but nothing we couldn't handle because we were monitoring it. 

Social Media Purchasing

As you will no doubt find, the majority of these sites also provide an option to 'purchase' social media metrics, such as followers and likes, using your points. This is not a great idea, because whilst search engines do use your total followers as a positive metric, they are more focused on the quality of your social media proponents. If the reputation or footprint of a social media profile is based on that entity's posts, likes, dislikes and follows, then your reputation is based on the reputation of those who post about you, follow you and like you. For a normal, real person's account that's great, because those accounts have reputation and normal activity, but the ones you're buying on these sites will have thousands (or more) of likes, follows and shares that are clearly fake, unrealistic and ignored by Google, so don't waste your time and money. 

Backlink Purchasing

It is no secret that quality backlinks can greatly enhance your site's appearance in search engines, but likewise poor-quality backlinks can greatly damage it. Your site will rank far higher with just one good-quality backlink than with 1,000 poor-quality ones, but how can you judge quality? Simple: anything you BUY on sites like the above is POOR, no matter what they tell you, and it will damage your ranking. The only way to obtain QUALITY backlinks is with effort and perseverance. As a point of interest, it has long been a weapon in SEO that competitors can be knocked off their spots by spamming their sites with thousands of poor-quality backlinks, so please don't spam yourself out of search engine existence. 


I am not sure of the actual benefit of traffic exchange, and there's no guarantee that it's not going to hurt your rankings rather than improve them. Website ranking, especially in Google, is something that takes months to affect, and the timeframe for this article's research was about a week. I can see some benefit assuming the users browsing your sites have the Alexa Toolbar or similar plug-ins (which send every domain you visit back to a server somewhere, which personally I think is a ridiculous idea), as those metrics would be influenced by the increased traffic. As for having any effect on search engine ranking, I cannot see how, unless the surfers first went to the search engine page, entered some keywords, paged through until they found your site and then clicked it, all of which is quite complex and unpredictable.

In order to track the effectiveness of these traffic exchanges, I used a different URL with each, then dumped the log files and compared them to the reported figures. All the ones I could test came out with about the right number of hits, give or take, but be aware that tracking social media likes and follows is far more complex. Just considering Facebook, registering Likes accurately would require these sites to have linked your Facebook account and been granted access to an App on it, which none of them seem to do. Google, Twitter and LinkedIn all have similar tracking methods.  
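Since each exchange was given its own tracking URL, the log-file comparison reduces to counting successful hits per URL. Here's a minimal sketch of that check, assuming Apache/nginx combined-format access logs; the /tx-* paths are hypothetical stand-ins for the per-exchange tracking URLs:

```python
import re
from collections import Counter

# Matches the request and status fields of a combined-format log line, e.g.
# 1.2.3.4 - - [17/Apr/2019:11:12:00 +0000] "GET /tx-linkcollider HTTP/1.1" 200 512 "-" "Mozilla/5.0"
LOG_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def hits_per_tracking_url(log_lines, tracking_paths):
    """Count successful (2xx) hits against each per-exchange tracking URL."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        path, status = m.group("path"), m.group("status")
        if path in tracking_paths and status.startswith("2"):
            counts[path] += 1
    return counts

sample = [
    '1.2.3.4 - - [17/Apr/2019:11:12:00 +0000] "GET /tx-linkcollider HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [17/Apr/2019:11:13:00 +0000] "GET /tx-manyhit HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '9.9.9.9 - - [17/Apr/2019:11:14:00 +0000] "GET /tx-linkcollider HTTP/1.1" 404 0 "-" "Mozilla/5.0"',
]
print(hits_per_tracking_url(sample, {"/tx-linkcollider", "/tx-manyhit"}))
```

Failed requests are excluded, so the totals line up with what a visitor actually saw, which is what you want when comparing against the exchange's claimed figures.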

I'm going to leave it running for a few more weeks with a couple of these sites, see what, if anything, happens to ranking or placement, and update the article with the results. 

I'd be very interested to hear if anyone has another take on this; please leave a comment. 


  6540 Hits
  1 Comment


© 2019 GEN

Recent comment in this post
Guest — Frankie Neesman
So what’s the point? Still don’t get it
Wednesday, 17 April 2019 11:12

Spring Clean your Personal Computer


Modern operating systems like Windows and MacOS generate vast volumes of needless data during normal operation, by design. For almost as long as these operating systems have existed there have been tools to clear down that data and optimise files, tables and configuration to speed up operation. One of the first was "Norton Utilities", created by Peter Norton in 1982 for MS-DOS (and sold to Symantec in 1990). In the intervening 35 years operating systems have become ever more advanced and demanding, and the number of competing tools has increased. We, as a solution provider, have used most of these tools over the years, but recently we've focused on a powerful and lightweight tool from MacPaw. 

MacPaw, a Ukrainian company, has been producing "CleanMyMac" for many years and is now very much the market leader in this space. Recently MacPaw entered the Windows market with "CleanMyPC", and with their proven track record we adopted this product as our core offering for Windows users. This article reviews both products and highlights the key features of each. 


For as long as we've been using Apple products, CleanMyMac has been a prerequisite, and it brings a comprehensive toolbox to the platform. The most significant feature is junk removal, which can free up a significant volume of space on each run by removing cache files, system logs, broken preferences, unused universal binaries, unused language files and localisations, deleted mail attachments, iTunes junk, browser cache and history, and of course your trash bins. The whole process is automated, and after a few minutes of processing a storage figure is given. Running it on this very workstation whilst writing this article, CleanMyMac found 1.7GB of junk to remove. 

The uninstaller is one feature still missing from MacOS even today. Some apps, once installed, are complex and awkward to remove, and most require a return to console commands to remove everything. CleanMyMac allows complete removal of applications, including preferences and local data, and can even uninstall multiple applications at once, which is really handy. 

MacOS has a number of tools only available from the command line, such as flushing DNS, rebuilding the launch database (Launchpad) and repairing permissions, and CleanMyMac brings these to an easy-to-use menu. 

Privacy is becoming more important now that the nefarious practices of some websites are becoming public knowledge. CleanMyMac gives a simple click-and-go approach to clearing this data and ensuring privacy, and also includes a secure file deletion tool which promises to eradicate all traces of a selected file or files. 

Extensions are pluggable add-ons for MacOS browsers and applications such as Finder and Launchpad. CleanMyMac lists all these add-ons and allows you to simply point and click to enable or disable them. Especially useful is removing Launch Agents, which load automatically when you log in and can be really annoying. 


With the same clean and modern interface, CleanMyPC brings the same toolset to Windows and focuses on the key issues that still affect Windows PCs daily. The "Cleanup" feature clears cache files, logs and of course trash. Running it on a PC in the office a moment ago, we freed up 1.5GB of space automatically. 

The Windows Registry is the store for all settings for Windows and most applications. The Registry is a database, and it suffers from fragmentation as well as junk. CleanMyPC swiftly cleans the junk and optimises the Registry files to speed up access and keep it relevant. 

Windows does have an uninstaller, but you can only uninstall one application at a time and there are often issues with uninstallation. CleanMyPC brings the same multi-application uninstaller, with added clean-up. 

Windows extensions are, again, plug-ins and add-ons for browsers and Windows Explorer, and these too can become damaged and require repair or removal. 

Autorun is the Windows equivalent of Launch Agents on MacOS: a list of applications that will automatically be started when your PC boots up and/or you log in. CleanMyPC gives you a simple click-to-enable/disable list to easily manage these.

Privacy is again a concern, perhaps more so for Windows users, and CleanMyPC not only gives you a list of data to be cleaned but also indicates the data's "safety", or risk. CleanMyPC also brings the same secure-erase functionality, ensuring any trace of the selected files is removed and rendered unrecoverable. 

The NEW Version of CleanMyMac

CleanMyMac X takes the product to a whole new level with CPU & Memory monitoring, Malware Protection and performance tracking all of which just add to an already awesome toolset. 


Both tools offer a wide range of really useful services and perform flawlessly. This is not free software and there's a price, but it's very reasonable, it's FREE to try, and on a cost/reward basis it's well worth the money. If you're looking for more than 5 copies then please contact us for a corporate license. 



  4212 Hits


© 2019 GEN. E&OE


Royal TS/TSX - The perfect tool for connecting to everything


There are some tools that you work with so often that they become invaluable. Anyone who spends their days connecting to different systems and servers will know that the tools generally available are system specific: Windows desktop = Microsoft RDP Client; Linux box = native SSH or Telnet; FTP server = FileZilla, Cute or WinSCP, and the list goes on. Each tool has its own quirks and issues, but we learn to live with them in order to get the job done. 

A few years ago I was looking for a better SSH client, because in my job, with many SSH windows open, it's easy to lose track of which is which, and I downloaded the first beta of Royal TSX (for Mac). It was a work in progress, but I loved it. Now I can have my SSH clients in tabs instead of separate xterm windows, and I can name the tabs so it's clear which is which. I can even automate the login by scripting, so I no longer waste time looking up passwords or leave sessions open way longer than needed just because I'd have to look the password up again. Royal TSX, even in its early stages, was a well-thought-out tool that instantly made its way to my quick-launch bar. 

The first beta could connect to SSH, Telnet and RDP and I quickly found time to add all my regular connections and never looked back. 

Now that was a good few years ago and today Royal TS for Windows and Royal TSX for Mac are well polished and comprehensive toolsets with connectivity options to just about everything you could ever want...


Telnet, SSH: With full control over credentials, session, scripting, emulation and much more. 

RoyalTSX Screen

You can clearly see the tabs showing connections to multiple servers with varying connection types. 

File transfer, whether FTP, SFTP or SCP, can be a bind to manage, but no longer.


Simple drag-drop file transfer. But there's more, much more...

RDP: for connecting to windows workstations and servers. 

TeamViewer: For anyone that still uses it. 

VNC: For your GUI based connections to Windows, Mac, Linux, IP KVM's and more. 

File Transfer: Over FTP, SFTP, SCP and more. Simple drag-drop functionality.  

VMWare: List instances, control on and off, connect to the console and more.

Hyper-V: Instance control, data and connections. 

Serial: Yes, even Hardware Serial over USB is a click away for those serial console moments that blindside you on an idle Tuesday afternoon. 

An all-in-one tool: one screen, one set of configs! The organisation of connections allows you to create folders and move connections into them, so finding a connection is logical. I have a folder for each customer, then a folder for each site within the customer folder, which really helps. Royal TS/TSX stores all your connections and configuration in a 'phonebook' file which can easily be migrated or even synced between workstations. I, for example, sync between my main desktop, laptop and mobile phone (yes, there is a mobile/tablet companion product!). 

But that's not all. How about Windows Events? We all hate those, and monitoring can be a pain, especially with multiple servers over multiple domains. Royal TSX cuts through all the nonsense with direct connections to Windows Events.

RoyalTSX VMWare

Windows Services? No problem. Windows Processes are a click away, simple as anything, and of course PowerShell is too. 

If you're not already looking for where to download this tool then I'd be surprised. Did I mention it's FREE for up to 10 connections, and that after that the full product is only €35 or $46 for an individual license, which is seriously undervalued in my opinion? If I add up the thousands of hours I've saved over the years, the true worth of this product would be five figures plus. 

When I first started using TSX it would spend a good part of the day on my screen, and co-workers, visitors and even customers would ask, "What are you using to do that?" The product literally sells itself through its smart, clean look and feature set. 

The developer, Royal Applications, is an Austrian company with a tight focus on their core product line. The product is actively developed, and updates with new features and connections arrive regularly. Support is outstanding, with quick responses and assistance, and there's comprehensive documentation available too.

It's important to note that Royal Applications are not paying for or influencing this review in any way. I genuinely love the product, use it every single day, and paid willingly for my licenses. I strongly recommend anyone not already using it to give it a try; it's FREE, remember. 

You will find their product at www.RoyalApplications.com and a quick link to their download page would be https://royalapplications.com/ts


If you found this review useful and I managed to save you hours a day then drop us a comment... 

  10693 Hits
  1 Comment


© 2018 GEN, E&OE

Recent comment in this post
Guest — Dave Schincelli
What I've always been looking for.... Perfect.
Thursday, 25 October 2018 13:22

SocialMedia, Google, Bing, Yahoo, Amazon, ISP's, Government Tracking and Personal Data Leakage

After our post 'In defence of social media', which was itself a response to the disproportionate news coverage of Facebook specifically, there have been many responses generally accepting that it should have been common sense that nothing is 'free', but also revealing a clear misunderstanding of how people are tracked online and what exactly is collected, and by whom. This isn't unreasonable, because the whole tracking and collection industry is shady and insidious; and just for clarity, I was correct when I said GDPR would make absolutely no difference. So, let's look at a few specific examples of data capture from some big players in the market...

Let's start with Facebook, purely because it was the subject of recent news stories. 

Facebook of course collects everything you feed into it: your name, address, date of birth (if anyone actually uses their real date of birth), phone numbers, email addresses and so on. This data forms the root record (the record to which everything else is attached). 

To the root record we then add everything you view, everything you like or dislike, everything you post (Images, Text, Links), every message you send and receive and every ad that is displayed or clicked. 

Associations are also added, that's "Friends" and the interactions between you and your "Friends" are also logged and common interests or appearance in common photographs are also recorded. 

If you use the Facebook app on your mobile device then your location (unless you deliberately disable it) is recorded and stored. 

If you are unfortunate enough to have used your Facebook 'login' to log in to third-party websites, then a record of each site, when you use it and for how long, is also included. 

Facebook was reportedly paying people to give up their privacy by installing an application that sucks up huge amounts of sensitive data, and explicitly sidestepping Apple's Enterprise Developer program rules. This has now been brought to a shuddering halt by Apple, so thanks Apple. More information on this one HERE.

As you can see, Facebook stores pretty much everything you do and that's their business model, you get to waste hours of your life that you'll never get back and Facebook sells the data they collect from this activity. There's nothing wrong with this business model, it works and has been around for decades. 

Pinterest, Instagram (which is now owned by Facebook), Tumblr and so on

These sites, which are generally 'image' sites, record everything you add to the profile, and to that they add everyone you follow and every image you view (and for how long). Some of them go further, scanning the images uploaded, recognising faces, and then forming internal relationships between images and users. There's nothing wrong with this business model either, of course, except perhaps the fact that the moment you upload your image it's no longer your image, but that still doesn't stop people using these services. 


Now Twitter has been around for a few years and is basically a 'feed' service where you follow topics and people and receive updates from them. It's a simple model, yet an effective one. Twitter records your posts, reads, follows and followers. It also records every link you follow from posts. Twitter inserts 'ads' into your feed, which is annoying but not a show-stopper, and these are of course paid for by advertisers. The rest of Twitter's revenue comes from selling your data to third parties, which is again a good, sustainable business model. In the early days Twitter was wide open to abuse: 'fake' accounts were created in celebrities' names, causing unsuspecting followers to be duped and directed to 'donation' or malware sites, but Twitter put a stop (mostly) to this by 'verifying' some celebrities to remove any confusion. Twitter also allows the embedding of links, audio and now video into the feed, which is great but brings a new set of challenges around protecting users, while also providing additional tracking metrics. 



Google is a huge company with many 'services', most of which are 'free' to use. Let's look at probably the most common service, the search engine. There's no denying that Google.com is a great search engine, and if you're looking for something a little obscure then it's your go-to engine, but let's look at what's captured. 

When you search on Google, the search term is recorded along with the results, which results you click on, and the time taken for that click. This simply builds associations of interest against your Google profile (if you created one, or against a unique identifier if you didn't). This in itself isn't really bad, and you would surely expect them to capture this information. Your search history is further used to focus future searches, so the more you use it, the more applicable your results become; that, at least, is the official line. And don't ever believe that Google is the only search engine; it's not. Because of the way Google adds sites to its index, sites with large budgets and resources always find their way to the top results even when they aren't applicable at all. Moreover, Google adjusts the results of political, social, personal or controversial searches to add its own bias to what you see, and many would argue that this bias, which most don't even realise is there, is wrong on many levels. Other search engines, such as DuckDuckGo, often produce more evenly weighted results without adding bias, which some may prefer. 

Getting back to Google the company, we need to talk about Google Analytics, yet another 'free' service allowing website owners to get insights into visitors, which is actually really useful. But for that to work Google needs to be able to connect YOU as a person to that site, which it does easily. This gives Google not only your search queries, results and clicks, but now also most websites you visit, when you visit them, for how long, and what you do on those sites. Now we're starting to collect some seriously valuable data, and this is of course the business model again: you get lots of free services, and Google makes money from advertisers and the data. Google also allegedly purchased shopper data from MasterCard, which, when combined with your online profile, adds a wealth of additional behavioural data. 

That incredibly annoying "I'm not a Robot" checkbox? That little thing captures a vast collection of personal data, and all you have to do is click some pictures and be annoyed by it. 

Other Services (Gmail, Google Docs, Groups, Google+, Google Drive, and so on)

Google offers a bunch of other 'free' services, all quite useful, but to use them you'll need to provide your mobile phone number, which you are forced to verify by entering a code from a text message. Each of these services brings yet more data to the profile they maintain on your behalf. Every email you send and receive via Gmail is scanned, stored and linked. Every document you add to Google Docs is scanned, stored and added; any file you store on Google Drive is scanned, stored and added. Are you seeing a pattern here? Nothing you do on any Google service is private. How about Google Maps? A very useful tool if you want to find somewhere, but yet again everything you look at is recorded and added to your profile. If you have an Android phone then your location data is also added, along with your messages, apps installed, app usage, contacts and so on. Google Home is a voice assistant and speaker for your home, but again anything you ask it is stored and added to your profile data. 

YouTube (now owned by Google) again stores the videos you watch, the channels you follow, the comments you make and so on. 

Android, the phone operating system developed by Google as open source, has its own class of information leakage: every app you install and use is tracked, and unless you specifically disable it (and there's still debate over whether you can) your location is tracked using your phone's GPS. Mapping this allows Google to track all the places you visit, the shops you go to, and for how long. 

Google Chrome is a web browser developed by Google and is again free to download and use. Within the browser there are options to 'store' your credentials and bookmarks in the cloud, which of course gives Google this data to further add to the profile. We also noticed that Chrome (unlike other browsers) creates several local files storing your search history, browser history and so on, for reasons unknown. The files are unprotected, meaning that we (or any malicious or other software) can easily read them to obtain this information. At the time of writing we also noted weak protection of stored passwords, but this isn't specific to Chrome; several other browsers are also easy to crack. 
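To illustrate just how readable those local files are: Chrome's browsing history is typically kept in an SQLite database (a file named "History" in the profile directory) with a `urls` table holding each address, title and visit count. The sketch below builds an in-memory stand-in with a minimal version of that schema rather than touching a real profile; the exact path and columns vary between Chrome versions, so treat them as assumptions:

```python
import sqlite3

# Chrome's history lives in an SQLite file (typically named "History") inside the
# profile directory, e.g. ~/.config/google-chrome/Default/History on Linux. Any
# local process running as the same user can open it. For a self-contained demo
# we create an in-memory database with a minimal version of the real "urls"
# table and query it the same way.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE urls (id INTEGER PRIMARY KEY, url TEXT, title TEXT, visit_count INTEGER)"
)
conn.executemany(
    "INSERT INTO urls (url, title, visit_count) VALUES (?, ?, ?)",
    [
        ("https://example.com/", "Example", 3),
        ("https://gen.net.uk/blog", "GEN Blog", 7),
    ],
)

def read_history(db):
    """Return (url, title, visit_count) rows, most visited first."""
    return db.execute(
        "SELECT url, title, visit_count FROM urls ORDER BY visit_count DESC"
    ).fetchall()

for url, title, visits in read_history(conn):
    print(f"{visits:4d}  {title}  {url}")
```

No password, no permissions prompt: a plain SELECT is all it takes, which is exactly the weakness described above.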

So Google knows what you search, what you view, for how long and how often, what you buy, what you look at but don't buy, how often you buy something, what you read, what you post and what posts you read, what pictures and videos you view, how often and from what websites, which is what everyone expected. But wait: Google was recently exposed by the EFF for using methods to bypass Apple's protection and capture users' screens. Read the linked article HERE for more details. 

Bing & Yahoo

Bing is a search engine that is pretty useless in fact, and is even more unfairly weighted towards sites with money; subsequently it doesn't have any significant market share (about 7% at the time of writing), but that doesn't mean Microsoft don't store your searches, links clicked and so on, which they do. There's a 'relationship' between Microsoft and Yahoo going back several years which brings Yahoo results into the Bing search engine, which is probably a good thing, but this also brings Yahoo's free services, such as Yahoo Messenger and Yahoo Groups, into your search footprint. Yahoo itself has been bought and sold several times and the actual ownership is hard to pin down, but we do know that the majority is owned by Oath Inc. (part of Verizon) at the time of writing. 

Generally speaking, the use of Bing and Yahoo is fairly limited these days, with about 4% combined market share (at the time of writing), since Bing's search results are limited and Yahoo's reputation has been shredded by past data breaches. The use of Yahoo Mail brings with it the same issues as Gmail: your emails and everything in them are scanned and stored. Microsoft's Hotmail is exactly the same, and why shouldn't it be; it's free, after all. Yahoo's GeoCities, which is pretty much dead now, and Yahoo Groups, if anyone still uses them, bring yet more profile cross-linking, with group 'members' being associated by topic and post, and of course you must have a Yahoo account to participate.


Pretty much ANY app on your mobile device, on Android at least, is able to track your location using your device's built-in GPS. On Apple devices it's harder, but still perfectly doable. Collecting this GPS data would, as you may suspect, enable the processor of that data to track your movements throughout the day. On modern laptops running Windows there is also a leak of GPS data to installed programs, and even to web pages under certain circumstances. Apple laptops are by default prevented from leaking GPS data, but this can be overcome, especially in earlier versions of MacOS. Your car, if it has satellite navigation, records your start, end and route in their entirety, and the more upmarket vehicles ship that data over the cellular network back to base. Combine this GPS data with detailed mapping information and you can easily link GPS co-ordinates with places (shops, schools, etc.). 
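Linking a GPS fix to a place is trivial once you have a list of known locations: compute the great-circle (haversine) distance to each and take the nearest. A small illustration; the coordinates and place names are invented for the example:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical points of interest near a recorded GPS fix.
places = {
    "supermarket": (51.5007, -0.1246),
    "school": (51.5033, -0.1195),
}

def nearest_place(lat, lon, places):
    """Name of the known location closest to the given fix."""
    return min(places, key=lambda name: haversine_km(lat, lon, *places[name]))

print(nearest_place(51.5008, -0.1245, places))
```

Run this over a day's worth of fixes and you get exactly the kind of movement profile described above: which shops, which schools, and how long between them.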

Internet Service Providers (BT, PlusNet, Virgin and so on)

Some reading this may not be aware that your Internet Service Provider has access to every website you visit. They do this via DNS, the system that converts a domain name into an IP address. Unless you specifically override it, your ISP will route your DNS requests to their servers, which then accumulate your website requests against your 'session', which is your current IP address linked to your account. Using SPI (Stateful Packet Inspection) your ISP can also record what you actually do online, such as listening to music, watching video, making phone calls, instant messaging and so on. All this data is accumulated and stored indefinitely, and in this country at least it is made available to law enforcement without a warrant. 
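As a rough illustration of how easily a resolver operator could turn ordinary query logs into per-customer browsing profiles, here's a sketch; the log format and the IP-to-account mapping are invented for the example (a real ISP would join IPs to accounts via its DHCP/RADIUS session records):

```python
from collections import defaultdict

# Hypothetical mapping of the customer's current IP to their account,
# as a real ISP would derive from DHCP/RADIUS session records.
session_to_account = {"81.2.69.10": "account-1234", "81.2.69.11": "account-5678"}

# Each resolver log entry: (source IP at the time of the query, domain queried).
query_log = [
    ("81.2.69.10", "news.bbc.co.uk"),
    ("81.2.69.10", "mail.google.com"),
    ("81.2.69.11", "gen.net.uk"),
    ("81.2.69.10", "news.bbc.co.uk"),
]

def build_profiles(log, sessions):
    """Accumulate per-account domain visit counts from raw DNS query logs."""
    profiles = defaultdict(lambda: defaultdict(int))
    for ip, domain in log:
        account = sessions.get(ip)
        if account:
            profiles[account][domain] += 1
    return profiles

profiles = build_profiles(query_log, session_to_account)
print(dict(profiles["account-1234"]))
```

Every domain your browser resolves lands in a structure like this, which is why overriding the ISP's default resolvers (or using an encrypted DNS transport) matters to anyone concerned about this logging.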


The Amazon ecosystem is slightly different to the general model in that there are no 'free' services: you need an account to buy online, download books, listen to music or watch videos. But that doesn't mean the company won't collect your data, because they do. Everything you search for on Amazon is stored and kept; everything you listen to, read or watch is stored and kept; and all this profile data is used to target search responses and advertisements to your specific interests. Amazon doesn't make any guarantee not to sell your data (that I can find), so it's safe to assume they probably do. Amazon also has Alexa, which further augments the profile by storing what you ask and do with the devices, but this in itself isn't bad and can be used to tailor responses based on your history. The Amazon Ring doorbell, on the other hand, is nothing but a storm of privacy issues. The doorbell continuously records what it sees from your front door, and that video is stored at Amazon. You, as the purchaser of the device, have no rights to the data, and it clearly states in the T&Cs that Ring and its licensees have an unlimited, irrevocable, fully paid, royalty-free, perpetual, worldwide right to re-use, distribute, store, delete, translate, copy, modify, display and sell the footage taken from your front door, and to create derivative works from it, and you paid for the privilege. Whilst there is no law against recording your street in the UK, giving your live video to a third party who can do whatever they like with it would certainly seem to be unwise, if not unlawful. With the application of face and number-plate recognition, those third parties could potentially identify people walking and driving on the street, which takes this to a whole new level. Can you stop it? Nope: this doorbell only works when the internet works, and when the internet works it's uploading your video to who knows where. 

 Local Government & Agencies

You may or may not know that your local council is at liberty to sell your personal data to anyone willing to pay. They call this the electoral roll, but in fact it's just a dump of all the people registered to vote plus council tax payers. Combine this with data from a company like Cameo and you introduce affluence and net worth; link that with Experian or Equifax and you now have credit worthiness, loans, mortgages and bank accounts, and the list goes on, all freely available to purchase.

The DVLA is now also selling your details to companies so if you own or are the registered 'keeper' of a vehicle that data is now also up for grabs. 

And of course the Census data, which you are legally required to complete, is made available for sale to anyone who wants it, and this is of course why the Government is exempt from GDPR, along with the Police, the Military, and anyone else you might actually want GDPR to apply to. 


PayPal, the payment provider, enables easy transactions on many websites and with many vendors. PayPal collects the product, price, location, currency and store, and records these at the point of sale. Whilst collecting this information can easily be justified, PayPal is at liberty to sell this data to anyone else, which further complements your online profile with validated purchases. 


There is an ever-increasing number of VoIP providers, most of which are just reselling someone else's service, actively pushing Voice over IP to anyone who will listen. There's no doubt that Voice over IP will become the norm in the future, but currently there are significant risks to its uptake. In an earlier article we showed just how easy it is to intercept voice traffic as it passes through the internet, and this of course makes it really easy for anyone, government or otherwise, to capture and record telephone calls. There are unconfirmed rumours that our own government is already capturing our internet traffic for analysis, and of course voice traffic would be part of that. If you're familiar with the abilities of modern voice analytics then you'll know that your conversation can be quickly converted into a transcript and searched and/or archived. If you've taken up VoIP then ask your provider if they are using SRTP (Secure RTP); you'll either be told no, or they will lie to you. As it stands, in the UK marketplace we are the ONLY VoIP provider offering voice encryption, but be aware that even our voice encryption is only applied up to the point the call leaves our service, meaning we can ONLY guarantee voice security between GEN VoIP customers/sites. To many this shouldn't be a concern, especially considering how much of your data is already in the wind, but for some it is a serious unmitigated concern. 

The Cloud

There are two distinct flavours of "The Cloud". Private Cloud is business-class internet-based storage and services provided by a myriad of providers, and with enterprise-class providers you can be assured that your data, servers, containers and systems are secure and protected. Public Cloud, which is often 'free', covers the sort of services provided by Microsoft (OneDrive), Google (Google Drive), Amazon, DropBox, Apple (iCloud Drive), Datablaze, Box, FlipDrive, HiDrive, iDrive, JumpShare, Hubic, Mega, pCloud, OziBox, Sync, Syncplicity, Yandex.Disk etc., and these services are absolutely NOT SECURE. This is not only because they are frequently compromised, but because there is zero accountability: it's 'free' and provided 'as-is'. NO business should ever use Public Cloud services for storing business-critical data. If it's important to you then use a service that you PAY for and that has a degree of accountability. 

Cross Contamination

Since tracking to your personal profile is done via fragments left on your computer, cookies/sessions left by websites, or even by your browser screen size and, in a recent discovery, by your sound card, allocating your activity to you is fairly accurate. There are some cases, though, especially in companies where internet access is proxied and only a few people 'log in' to accounts, where one person's activity can be falsely attributed to another's profile. I have personally seen this whilst writing this article: when I requested all my activity from Google and dug through it (remember, I never use Google), I found a bunch of searches performed as recently as earlier in the week that were from other users on the network but somehow wound up in MY profile. I have no idea how common this is in the real world.
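To make the mechanism concrete, here's a minimal sketch of how fingerprint-style tracking can work: a handful of device attributes are combined into one stable identifier, no cookie required. The attribute names and values below are illustrative assumptions, not any real tracker's schema.

```python
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Combine browser/device attributes into a single stable identifier.

    The keys are sorted so the same set of attributes always yields the
    same hash, regardless of the order they were collected in.
    """
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Illustrative visitor attributes (not a real tracker's schema)
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "Europe/London",
    "audio_hash": "a41f...",  # real trackers derive this from AudioContext quirks
}

print(fingerprint(visitor))  # same machine, same attributes -> same identifier
```

As long as those attributes stay stable, the identifier survives cleared cookies and private-browsing sessions, which is exactly what makes it attractive to trackers.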


There are some claims on social media that Google, Facebook and others are always 'listening' using the microphone in your equipment, but this has largely been disproved by researchers at the time of writing. That doesn't mean it categorically does or does not happen, simply that the evidence to date suggests not.


Services such as VPNs and of course the ever-popular Tor Browser are ways to obscure your real identity online, but you'll discover fairly quickly that the services above either don't work at all or are deliberately crippled. Google, for example, returns a made-up message about unusual traffic. As VPNs come and go there will always be a short window before a service gets blacklisted, but this will never be a viable long-term solution, and as you'll discover in our article "A VPN will not save you", following this approach requires strict discipline and limitations.

The sale of data and the data market

All of the above can produce fairly detailed and valuable profiles of your online AND offline activity, but when the separate data collections are combined you start to have very complete profiles linked directly to an individual. This is what worries people more than Facebook and Google. Given that your data is bought and sold on a daily basis, some of these companies have a complete record of pretty much everything you do. Let's see what the total footprint of an average teenager today looks like:

  • Your Name, Address, Race, Religion, Ethnicity, Phone Number(s), Email Addresses, family members, friends, loved ones, and associates. 
  • Your bank accounts and balances, credit cards, loans, and payment history. 
  • Your vehicle, make, model and registration, current tax and MOT status and how much you owe on it if anything. 
  • All Google/Bing/Yahoo searches, Clicks and All Sites visited, comments and posts.  
  • Every instant message you've ever sent or received and the content of all. 
  • All your photos and the date/time and location they were taken, along with everyone who can be identified in them using face recognition. 
  • Your location to within 5m at any time of day, where you've ever been, for how long, how often and with whom. 
  • What music, sports, products, services and videos you like, dislike, watch, download or buy. 
  • Anything you've ever purchased or sold online, be that clothes, shoes, groceries, electronics, etc. 

I think now you must be starting to understand how the data business works and how you're pretty powerless to stop it without some radical changes to your lifestyle, and even then it's too late for most people. It's important to be aware that these companies have done nothing wrong, nothing illegal or even shady; they are all businesses and their business is your data. I personally like Facebook & Twitter and Google is a good search engine, but YOU need to make informed decisions on what services you use online and what information you surrender to those services, because changing a few settings on their website will make ABSOLUTELY NO DIFFERENCE.


Apple

Whether you believe it or not, Apple has taken a fairly adversarial approach to data protection, committing to protecting your data not only on your devices but also online with anti-tracking features in their browser (Safari). In the scale of things, though, and despite Apple's best intentions, it's not going to make very much difference in the end. The only way for Apple to make an effective dent in the data collection market would be to block all social media and search engines from users' devices, which they won't do for obvious reasons, and in the real world everyone has to make their own decisions on what they do and don't use.


The near future

There's no doubt that data collection and dissemination is a business model that's here to stay, and you have to look at both sides of the argument. Imagine how much easier it is for our police to be able to tell exactly who was where and when, imagine how pattern analysis of messages and movements can identify possible crimes before they are committed, or imagine a world where your every move is recorded, analysed and reported. There are always two sides to it.


Although GEN VoIP encryption can only secure voice communications between GEN VoIP Customers/Sites, we also offer VoIP encrypted to mobile phones using a local app, so for Company Site <-> Company Mobiles we can guarantee voice security.


© 2018 GEN. E&OE


In defence of Facebook and Social Media


There's a lot of hysteria in the news around Facebook and personal data, and that's fine, it's a slow news week, yet the real truth is that Facebook did nothing wrong.

Facebook, like all social media, is a business, plain and simple. Their business model is to provide a free service to you, and from that collect information and then sell that information to third parties for the purposes of advertising, marketing, market research, and analysis. A wise man once said in relation to internet services,

"If you don't pay anything for a product, then you are the product"

and it's true of Facebook just as it is for Twitter, Pinterest, Instagram, Snapchat, WhatsApp and so on. You use the service for free, and the company running the service and spending significant sums to develop and maintain it gets free and unrestricted use of your data. Sounds like a fair deal to me.

Facebook will tell you it's in the agreement you accept when you set up an account, and it is, but it's also just common sense. So, delete Facebook if you wish, or keep using it in the knowledge that they will collect and sell your data as part of their business. This same framework applies to all social media, the majority of 'free' apps you can download for your phone, and other free services such as Google, Gmail, Yahoo, Bing and so on.

If, for whatever reason, you object to any of these business models and do not want your personal data scanned, analysed, sold and so forth, then that's your right, but don't whine about it on the very service you're complaining about!


To those still outraged at the idea that Facebook sold their data: Facebook is just one of many services that you will undoubtedly use, and they are all doing what Facebook does, so singling out Facebook does indicate a certain online naivety. For anyone who uses 'free' email, did you know every email you send and receive is read and analysed by the company operating the service? Did you know that every time you use Google to search for something they track not only what you search, but how long you spend looking, what you click on and for how long? Did you know that every picture you've ever uploaded to a photo service such as Tumblr, Pinterest, Instagram, and so on is then scanned, faces recognised and cross-linked between users? Did you know that the Chrome browser stores everything you've ever searched for in a file on your PC?

I could go on and on, so get with the programme, understand the model at work here, and then make informed choices about what you will and won't participate in.

Loss of control

One question that has been asked a few times recently is how you withdraw your consent for your data to be used, and the short answer is that, beyond some 'settings' that change very little, you cannot. Whilst you can write to some companies and express your wish, they have no obligation to take it into account, and since they've already sold your data many times over, the chances of you being able to track down all copies and withdraw them all are zero. If you've used social media, search engines or free email then it's simply too late, but you have an opportunity to educate your children and ensure they make informed choices.

This article generalises the business model, although it is understood that each company may vary its model for its users. There is no complaint or blame here, just education. E&OE.


The 2017 Toyota Prius PHEV


We recently selected the Toyota Prius PHEV for our 2017-2020 fleet, and after 6 months it's time for a real-world review. The new Prius PHEV comes in two flavours, the Business and the Excel. The former lacks many of the refinements yet has an optional solar roof, whereas the latter is probably the only sensible choice but cannot have the solar roof.

The Toyota website quotes "Fun to drive" as one of the USPs for the Prius PHEV, and indeed it is much more fun to drive than the regular Prius. In electric-only mode it's fast and sporty, so much so that even in damp conditions it's hard to keep the front wheels stuck to the road. In hybrid mode it performs pretty much as the regular Prius does. The quoted range is 30 miles and we can achieve that if driven very carefully and without any heating, but in the real world you can expect 21-26 miles, and in the winter it's more like 18-20. When pushed, the traction control doesn't seem to control anything and you're left with the same understeer issues you would expect from most front-wheel drives. It would have been nice to have seen a rear motor, as in the Estima, for even more go and some four-wheel handling, but sadly not.

The city drive is really good, very sedate and comfortable, especially in traffic, and you have to believe this scenario is the real purpose of the PHEV. Motorway driving is good, but there is significantly more engine and road noise, which requires an adjustment in expectations; again, it's a city car for sure. You have full control over EV or HV modes, allowing you to mix and match to obtain maximum fuel economy on longer journeys. A good example would be a 40-mile round trip that involves around 50% at 50mph and the rest slower in the city: select EV for the city driving and HV for the longer, faster runs, and this works great. You can even 'charge' the battery whilst in HV mode should you need it.

Once the battery is empty, you're back to hybrid mode, and this seems to regularly achieve 50-55mpg, which is very respectable, but overall performance is severely diminished. One point to note is that Toyota seem to have failed to match the relative throttle position of the EV and HV modes, so when switching back and forth you're required to adjust the throttle, which takes a little getting used to.


The exterior style is unique and truly stunning, and was a large component of our purchasing decision. With its quad LED headlights and its sleek aerodynamic profile, this is one of those vehicles that stands out from the rest. The alloy wheels are also fairly unique, although I would have preferred some alternative options. The vehicle is available in only 4 colours, and black isn't one of them, which is a shame; more options here would certainly not go amiss. The rear boot glass is elegant and expensive, but because of it the car lacks a rear wiper, and it could do with one.


The interior, when compared to the previous Prius Plugin, is a significant upgrade, and everything feels a little more upmarket. Comfortable leather seats further enhance the experience, and the cabin is quite spacious even for the larger occupant, all of which enhances the driving experience. There are however a few complaints to consider: the dash decor that sweeps to the left from the infotainment system is just a crap trap, and with the sweeping dash the windscreen is hard to reach and clean, but these are generally very minor issues. The cup holders are generous and easily accessible, as is the Qi charging tray, but there is a definite lack of somewhere to put your crap, which now tends to occupy one of the cup holders. The storage area between the front seats is OK, but the lid opens sideways rather than backwards, making it very awkward for the passenger to access and quite awkward for the driver. The steering wheel is smaller than most, but with the power assist it's more than acceptable.

The heating however is utterly worthless. I know it's an EV, and I also know that EVs have poor heating, but this vehicle seems to excel at poor heating. There is an option to pre-heat from the key fob before a journey, but that just steams up all the windows and defrosts nothing; when you get in the vehicle you then need to use de-mist, which starts the petrol engine, so what possible benefit that is I'm not sure. At 0C outside I ran the pre-heat three times and it didn't even clear the frost from the front window, let alone the rear ones. Even on FAST mode, heat set to HI, driver only and in Power mode, the heating still struggles to heat the cabin in moderate exterior temperatures. It's so bad in fact that the back and rear windows permanently steam up, meaning you need the rear de-mist permanently on, which is also underpowered. There are heated seats in the front, but those also seem underpowered and were definitely an afterthought judging by the ridiculous location of the switches.

But climate aside, the interior is a pleasant environment in which to spend your day. The infotainment system is covered separately in our Toyota Touch 2 & Go review, so I'll skip over that for now and focus on something that caught us by surprise a little: the boot.

As you can see from the picture, a large part of the boot is taken up by the batteries, leaving a greatly reduced cargo area. We didn't initially see this as a problem, but once you start loading it up with equipment you soon find that the back seats are lost to overflow, so consider this carefully.


The vehicle comes with a charger for a normal 13A socket, which takes 4 hours to charge. Additionally, you can have a hard-wired charger installed at your property that will charge at 16A, reducing charging time to 2 hours 10 minutes. Unfortunately, that's the fastest it will charge, even though most properties can supply 40A, which would charge it in less than an hour. This makes charging on the go a no-go, unfortunately, but charging at work is still doable.

You are able to set up charging schedules so that your daily charge can be taken at off-peak times on cheaper electricity, and when you turn off the vehicle you have the option to bypass this scheduling and charge immediately if required, which is nice.

Driving Features

The new Prius PHEV comes with a wide range of driving features, which I'll address individually here, but collectively it's a nice package that is rarely seen on a vehicle at this price point.

HUD (Head Up Display)

The Prius has featured a head-up display for many years and generations, but in this model the display is further enhanced and very visible. It's also a colour display, which is great, even though what it normally shows is monochrome; I assume a single colour is used to keep it as clearly visible as it is. The only downer I can see is that the SATNAV is *not* replicated to this display, as it is in most, if not all, other vehicles with a HUD.

Automatic High/Low Beam Headlights

A well-thought-out system that works in the majority of cases, even if it's a little slow to react sometimes, and it only works above 40mph, which can be annoying. The system is activated by a switch located near your knee, which is unfortunate, making it a distraction to turn on and off. Overall, however, it's a good system as long as you understand its limits.

Radar Cruise Control

Not so well thought out, and the sort of system that seems to work great right up to the point where it quits, which it does, as you're approaching the vehicle in front at speed. Further, when you are trying to engage it, it sometimes just won't engage and gives no feedback or reason why. It seems to work well in queuing traffic but again occasionally quits working for no reason. When it quits, the warning is tame and often missed, leaving you to discover that it's not going to brake for you at the point when you're thinking 'why isn't it braking?'. Another annoyance is that it constantly feels the need to display pointless images and messages on the dash, obscuring key information, and you cannot turn that off. On roads with corners, not that we have any of those in the UK, it seems to regularly lose track of the car in front and accelerate, then spot it and brake again, usually in the corner, which can be worrying and is just bad implementation. So overall it works, but you've got to supervise it at all times and prepare for its failure.

Road Sign Recognition

It does, but it doesn't. Road sign recognition is probably a good idea and I'm sure it works great in Japan but here it either gets it wrong or misses the signs altogether. Turn it off and move on. 

Collision Protection

Well, this kinda works, and if you're using the radar cruise control then you're going to get a chance to test it from time to time. The only problem is that when it detects an imminent crash it displays BRAKE in red on the far left of the dash, accompanied by a fairly feeble beep that serves no purpose. For such a function to be effective it should beep loudly and flash everything so the driver is immediately aware that they need to take action.

Lane Departure

This works most of the time, although it can become very annoying after a short while, especially on country roads where the road markings are not so clear. On the motorway, however, it seems to work great. There is a button on the steering wheel to switch it on and off, which makes managing the feature very easy.

Automatic Parking

Well, this is one of those features that does work if you have the patience to let it, or if you're not able to park yourself. For me it's a gimmick that will never get used except to test it, because I can park, and I can do it much quicker and more accurately, but some may find this feature of use. The vehicle does have all-round sonar, so parking by ear is easy to do yourself.

Driver Information

The Prius boasts two 7" displays that form the digital dashboard, and it has everything the regular Prius has, but it seems severely lacking in driver information for EV mode. It does show the average MPG and average kWh, but for a single journey you cannot get the kWh used or regenerated, nor can you get the kWh remaining. Furthermore, on the infotainment display you can get regenerated power whilst in hybrid mode, but in EV mode it shows nothing. The 'battery gauge' is confusing, and the Toyota manual does a bad job of trying to explain it.

It's as if the software was tweaked slightly to make it work with the EV, but they couldn't be bothered to add the key functionality and data that you or I might want, which detracts from an otherwise good vehicle. To take it further, all this data that's collected cannot be downloaded or exported anywhere, even though there's a USB port, which for a business makes it hard to track mileage performance metrics. Ideally you would want to download a record of kWh used, kWh regenerated and fuel used, which would give everything needed. I know Toyota don't expect to sell that many PHEVs, but for the price they could at least dedicate some time to driver information.

The Economics

There's a lot of talk around the economics of EVs versus conventional fuel vehicles, but it's really down to your driving requirements, and some maths has to be done to work out whether it's going to be worth the extra cost, so let's do that now.

Assuming that we take the purchase cost, grant, servicing, MPG, etc. from the official Toyota website and throw in servicing and tyres, we get a total cost of ownership over 5 years of £31670 for the Plug-in vs £30470 for the standard Prius, excluding any finance charge (finance varies significantly, so we're going to assume here that you purchased it outright).

Next we need to know the driving patterns for the year. Initially we're going to consider 15k miles per year with an average journey of 30 miles; that's around 500 journeys per year. I'm going to take the EV range as 25 miles for a year-round average, the cost of fuel at £5.50 per gallon and electricity at £0.13/kWh. Given that, we can calculate the fuel and electricity costs for your 500 journeys:

£412.50 per year for the Plug-in vs £1586.54 for the regular Prius, which is £2062.50 over 5 years for the Plug-in and £7932.70 for the regular Prius.

That brings the cost of travel over your 5 years to £33732.50 for the Plug-in and £38402.70 for the regular Prius, a saving over 5 years of £4670.20.

So, if you're a 15k-a-year driver running an average of 30 miles per journey, then you're going to be a winner with the Plug-in. For business, however, we'd need to consider an average mileage of around 60k and an average journey of 150 miles, so let's do the maths.

£5130.00 per year for the Plug-in and £6346.15 per year for the regular Prius. Again we'll add in the cost of ownership to give a 5-year travel cost of £57320.00 for the Plug-in vs £62200.77 for the regular Prius, giving a net saving of £4880.77.

So, on a scale of economy the Prius Plug-in is a clear winner for both domestic and business travel, with the benefit being significantly greater if you can keep your average journeys to 25 miles or less; of course, be aware that we're using Toyota's values here and these may not be real-world applicable. I'll put these figures into the table below to make them easier to see.

Vehicle                 | Miles/Year | Average Journey | Cost of Ownership | Fuel Costs/Year | Total Cost of Travel/5 Years
2017 Prius Plugin Excel | 60000      | 150             | £31670            | £5130.00        | £57320.00
2017 Prius Excel        | 60000      | 150             | £30470            | £6346.15        | £62200.77
2017 Prius Plugin Excel | 15000      | 30              | £31670            | £412.50         | £33732.50
2017 Prius Excel        | 15000      | 30              | £30470            | £1586.54        | £38402.70
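The arithmetic above can be sketched in a few lines. One caveat: the electricity figure depends on an efficiency assumption (we use 4 miles/kWh below, which is our assumption, not a Toyota figure), so the Plug-in numbers come out somewhat differently from the article's, while the regular Prius figures match exactly.

```python
FUEL_PRICE = 5.50     # £ per gallon
ELEC_PRICE = 0.13     # £ per kWh
MPG = 52              # implied by the regular Prius figures above
EV_RANGE = 25         # miles per charge, year-round average
MILES_PER_KWH = 4.0   # our assumption, not a Toyota figure

def annual_fuel_cost(miles_per_year, avg_journey, plug_in):
    """Yearly fuel + electricity cost for the stated driving pattern."""
    journeys = miles_per_year / avg_journey
    # The Plug-in covers the first EV_RANGE miles of each journey on battery
    ev_miles = min(avg_journey, EV_RANGE) * journeys if plug_in else 0.0
    hybrid_miles = miles_per_year - ev_miles
    fuel = hybrid_miles / MPG * FUEL_PRICE
    electricity = ev_miles / MILES_PER_KWH * ELEC_PRICE
    return round(fuel + electricity, 2)

# Domestic pattern: 15k miles/year in 30-mile journeys
print(annual_fuel_cost(15000, 30, plug_in=False))  # 1586.54, matching the article
print(annual_fuel_cost(15000, 30, plug_in=True))   # 670.67 under our electricity assumption
```

The structure makes the key point plainly: the shorter your average journey relative to the EV range, the larger the share of miles driven on cheap electricity, which is exactly why the 30-mile pattern benefits more than the 150-mile one.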

Final Thoughts

I personally like the car and I like driving it, especially in electric-only mode, but some may find the greatly reduced cargo area combined with the lack of colours and options too much of a stretch. It is in my opinion a far better option than the Ampera/Volt (which we had before the PHEVs) because it's more fun to drive, more comfortable and more stylish. You will also find some incentives available at your local Toyota dealer which can make the relative premium more manageable.

There is a wealth of information on the Toyota.co.uk website, but be aware that certain parts of it do not work, like 'My Toyota', which just gives you a blank page when you try to log in.





Synology Hyperbackup and Certificates

Hyperbackup is a backup system provided by Synology on their Diskstations and Rackstations, and it's a good product, as is the hardware, but as with most things Synology, the term "set it and forget it" does not apply, as this customer found out to their detriment.

The Synology NAS has a web interface, which is in fact very good and well designed; amongst other things it allows you to set up an SSL certificate to encrypt web traffic. This can be a self-signed, purchased or Let's Encrypt certificate, and in the latter case the renewal process is automated, which is nice.

The problem comes when your SSL certificate changes, which it would normally do annually for a purchased cert or every 90 days for Let's Encrypt, at which point everything breaks, including Hyperbackup, and the cause isn't immediately clear. The dialogue above indicates that the destination for your backup is offline; you would of course check the backup server and find it online and running. You would check the firewall settings, probably restart the services, maybe even reboot the server, but nothing is going to make this work again until you go into Settings and get as far as Target, at which point you notice...

Yes, seriously: because your certificate renewed, and even though you've specifically not enabled transfer encryption, the backup process crashes to a halt. You are required to press "Trust Server Certificate" to continue, after which the backup will resume until the next certificate change (90 days for Let's Encrypt, a year for purchased). Why? What possible purpose can there be in halting the backup every time a certificate renews? And why is there no way to prevent it?

As a side note, other things that break are all the iOS applications, Cloud Station Backup, Cloud Station, and probably more. If you are going to use a Let's Encrypt certificate, and I would encourage you to do so, then every 90 days you need to make a note in your diary to go to all the servers and click all the buttons, or stuff will stop working.

Update 19/09/2018: Just had another new customer today who's had a volume crash; his Hyperbackup stopped working because of this about 6 months ago, so we're now in the position where he's shipping the unit back to us and we're going to have to attempt volume recovery. PLEASE CHECK YOUR HYPERBACKUP IS RUNNING REGULARLY.

Update 20/01/2019: Synology released an update that effectively FIXES this whole scenario by allowing you to ignore certificate errors/always trust. We're briefing this out to our base and recommend you revisit your Hyperbackup client and make the change. Nice one Synology!
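Until that update is applied everywhere, one mitigation is a small watchdog that records the NAS certificate's fingerprint and warns when it changes, so you know to re-trust it in Hyperbackup before the backups silently stop. This is our own sketch, not a Synology tool; the hostname and port in the usage comment are placeholders for your own Diskstation.

```python
import hashlib
import ssl

def cert_fingerprint(pem: str) -> str:
    """SHA-256 hex digest of a PEM-encoded certificate string."""
    return hashlib.sha256(pem.encode("ascii")).hexdigest()

def cert_unchanged(host: str, port: int, state_file: str) -> bool:
    """Fetch the server's certificate and compare it with the last run's.

    Returns True on the first run or when the certificate is unchanged;
    False when it has been renewed, i.e. Hyperbackup will need re-trusting.
    """
    current = cert_fingerprint(ssl.get_server_certificate((host, port)))
    try:
        with open(state_file) as f:
            previous = f.read().strip()
    except FileNotFoundError:
        previous = None
    with open(state_file, "w") as f:
        f.write(current)
    return previous in (None, current)

# Run from cron every day or so; "nas.example.com" is a placeholder:
#   if not cert_unchanged("nas.example.com", 5001, "/var/tmp/nas-cert.sha256"):
#       print("Certificate changed - re-trust it in Hyperbackup")
```

It won't stop the backup breaking, but it turns a silent failure into a prompt one, which is the difference between clicking a button and attempting a volume recovery.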
