15 May 2015


Posts relating to the category tag "authentication" are listed below.

20 August 2010

Avoiding Popular Passwords

A few weeks ago I mentioned two new research papers about the use of passwords on web sites. Another new paper, from Microsoft Research and Harvard University, discusses how to avoid, and protect web sites from, users selecting popular passwords.

Part of the first page from 'Popularity is Everything: A New Approach to Protecting Passwords from Statistical-Guessing Attacks'

The paper Popularity is Everything: A New Approach to Protecting Passwords from Statistical-Guessing Attacks describes online and offline threats, and defences against the use of common popular passwords.

Password implementation policies can be guided by legacy approaches and various standards, but as mentioned previously, economics plays a large part too. Following a much-publicised successful brute-force attack against Twitter accounts, the company increased its password requirements. Rather than forcing passwords to be more complex, it took the decision to prevent the use of 370 common passwords. Whilst that list is culturally biased, breaches at other sites have yielded similar data (e.g. here and here). But how does banning popular passwords help, and if the lists of common passwords are known, does this matter?
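
A banned-password check of this kind is trivial to implement. As an illustration (the tiny list below is mine, not Twitter's actual 370), comparing case-insensitively so trivial capitalisation doesn't defeat the ban:

```python
# Reject any candidate password that appears on a banned list of
# popular passwords, compared case-insensitively so that "LetMeIn"
# matches a banned "letmein". Illustrative subset only.
BANNED = {"123456", "password", "qwerty", "letmein", "abc123"}

def is_allowed(candidate: str, banned=BANNED) -> bool:
    """True if the candidate password is not on the banned list."""
    return candidate.lower() not in banned
```

A real deployment would load the full list from the site's own breach-informed data rather than hard-coding a handful of entries.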

Firstly I'll mention here a couple of typical online tools for determining password complexity:

  • Password meter providing an indication of complexity
  • Hammer of God providing an estimate of how long it would take to obtain the password using a brute force attack

Don't put your real passwords into these sites or any other checkers! But these types of tools do not take into account popularity (e.g. '123456') or common manipulations (e.g. is 'P@ssword' really that much more secure than 'password'?). If attackers try popular passwords first (i.e. a dictionary attack), the time to break into a user's account may be much shorter.

The research paper, which does include some mathematics, suggests that simple passwords should be allowed provided they do not become popular enough to be useful in statistical guessing attacks, and proposes methods for detecting such attacks.
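
The paper's central idea is to track password popularity with a count-min sketch, so the site never stores the popular passwords themselves, and to refuse any password whose estimated count passes a threshold. A minimal sketch of the idea (parameters here are illustrative; a real deployment needs a much larger table and a false-positive analysis):

```python
import hashlib

class CountMinSketch:
    """Approximate frequency counter: may overestimate, never underestimates."""
    def __init__(self, width=1000, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, row, item):
        # One independent hash per row, derived from SHA-256.
        h = hashlib.sha256(f"{row}:{item}".encode()).hexdigest()
        return int(h, 16) % self.width

    def add(self, item):
        for row in range(self.depth):
            self.table[row][self._index(row, item)] += 1

    def estimate(self, item):
        # The minimum across rows bounds the true count from above.
        return min(self.table[row][self._index(row, item)]
                   for row in range(self.depth))

def too_popular(sketch, password, threshold=3):
    """Refuse passwords whose estimated popularity reaches the threshold."""
    return sketch.estimate(password) >= threshold
```

Because only hashed counters are stored, an attacker who steals the sketch learns which buckets are busy, but not which passwords they correspond to.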

Good reading and inspiration for password-based authentication systems. I'm off to the station now, to get a train to Newcastle which was cancelled last night.

Posted on: 20 August 2010 at 07:00 hrs


30 July 2010

Economics of Website Users' Passwords

Two great papers on web site password security were published this month. We are swamped with passwords: every daily activity increasingly has an online equivalent that requires users to register to obtain some additional benefit. Every organisation, resource, activity and event encourages us to visit its own web site and sign up.

Poster for nightclub in Newcastle-upon-Tyne promoting the Digitalism DJs, with a link to their website on MySpace

Firstly, in Where Do [Password] Security Policies Come From?, Dinei Florêncio and Cormac Herley of Microsoft Research discuss the password policies of 75 different web sites, in an effort to correlate password strength requirements with other aspects such as the size of the site, the assets protected, the number of users and the frequency of attacks.

The authors' findings suggest that none of these are the key factors; in fact, some of the largest, most attacked sites with higher-value assets have the weakest password policies. The authors suggest stronger policies exist where organisations are more insulated from the consequences of poor usability, whereas online retailers and sites that rely on advertising revenues have to compete vigorously for users and traffic. The paper also discusses how strong passwords need to be, how this is affected by the attack methods being considered (e.g. online vs. offline brute force), and whether other security controls are implemented (e.g. account lock-out).

This idea of considering the whole password environment is taken further in The Password Thicket: Technical and Market Failures in Human Authentication on the Web by Joseph Bonneau and Sören Preibusch of the Cambridge University Computer Laboratory, presented at this year's Workshop on the Economics of Information Security (WEIS 2010). Their study examined the password implementations of 150 web sites, looking beyond complexity requirements at whether protective measures were applied consistently across each site's functionality (e.g. registration/enrolment, log-in/authentication, password change, password reset/recovery, log-out), encryption during transmission, storage of passwords in clear text, inclusion of passwords in emails, as well as protection from brute-force attacks.

The authors found that stricter security in one area was often undermined by weaknesses in another, suggesting that a lack of standards is harming security. The paper also discusses economic interpretations, such as how deploying passwords might be being used to justify collection of marketing data, and how password insecurity can be a negative externality.

Posted on: 30 July 2010 at 08:45 hrs


02 July 2010

Web Site Security Basics for SMEs

Sometimes when I'm out socially and people ask what I do, the conversation progresses to concerns about their own web site. They may have a hobby site, run a micro-business or be a manager or director of a small and medium-sized enterprise (SME)—there's all sorts of great entrepreneurial activity going on.

It is very common for SMEs not to have much time or budget for information security, and the available information can be poor or inappropriate (ISSA-UK, under the guidance of its Director of Research David Lacey, is trying to improve this). But what can SMEs do about their own web presence? Whatever the size of the business, it is very unusual these days not to have a web site.

Photograph of a waste skip at the side of St John Street in Clerkenwell, London, UK, with the company's website address written boldly across it

Last week I was asked "Is using <company> okay for taking online payments?" and then "what else should I be doing?". Remember we are discussing protection of the SME's own web site, not protecting its employees from using other sites. If I had no information about the business or any existing web security issues, this is what I recommend checking and doing before anything else:

  • Obtain regular backup copies of all data that changes (e.g. databases, logs, uploaded files) and store these securely somewhere other than the host servers. This may typically be daily, but the frequency should be selected based on how often data changes and how much data the SME might be prepared to lose in the event of total server failure.
    • check periodically that backup data can be read and restored
    • don't forget to securely delete data from old backups when they are no longer required
  • Use a network firewall in front of the web site to limit public (unauthenticated user) access to those ports necessary to access the web site. If other services are required remotely, use the firewall to limit from where (e.g. IP addresses) these can be used.
    • keep a record of the firewall configuration up-to-date
    • limit who can make changes to the firewall
  • Ensure the host servers are fully patched (e.g. operating system, services, applications and supporting code), check all providers for software updates regularly and allow time for installing these.
    • remove or disable all unnecessary services and other software
    • delete old, unused and backup files from the host servers
  • Identify all accounts (log-in credentials) that provide server access (not just normal web page access), such as those used for transferring files, accessing administrative interfaces (e.g. CMS admin, database and server management/configuration control panels) and using remote desktop. Change the passwords, keep a record of who has access, remove accounts that are no longer required, and enable logging for all access using these accounts.
    • restrict what each account can do as much as possible
    • add restrictions to the use of these accounts (e.g. limit access by IP address, require written approval for use, keep account disabled by default)
  • Check that every agreement with the third parties required to operate the web site is in the organisation's own name. These may include the registration of domain names, SSL certificates, hosting contracts, monitoring services, data feeds, affiliate marketing agreements and service providers such as for address look-up, credit checks and making online payments.
    • ensure the third parties have the organisation's official contact details, and not those of an employee or of the site's developers
    • make note of any renewal dates
  • Obtain a copy of everything required for the web site including scripts, static files, configuration settings, source code, account details and encryption keys. Keep this updated with changes as they are made.
    • verify who legally owns the source code, designs, database, photographs, etc.
    • check what other licences affect the web site (e.g. use of open source and proprietary software libraries, database use limitations).
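
The backup-restore check in the first item above can be partly scripted. As a sketch (the directory layout and function names are mine), compare checksums of a test restore against the originals:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum a file in chunks so large backups don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source_dir: Path, restored_dir: Path) -> list[str]:
    """Return relative paths that are missing or differ after a test restore."""
    problems = []
    for src in source_dir.rglob("*"):
        if src.is_file():
            rel = src.relative_to(source_dir)
            restored = restored_dir / rel
            if not restored.is_file() or sha256_of(src) != sha256_of(restored):
                problems.append(str(rel))
    return problems
```

An empty result from `verify_restore` is the evidence worth recording after each periodic restore test.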

Do what you can, when you can. Once those are done, then:

  • Verify that the web site and all its components (e.g. web widgets and other third party code/content) do not include common web application vulnerabilities that can be exploited by attackers (e.g. SQL injection, cross-site scripting).
  • Check what obligations the organisation is under to protect business and other people's data such as the Data Protection Act, guidance from regulators, trade organisation rules, agreements with customers and other contracts (e.g. PCI DSS via the acquiring bank).
    • impose security standards and obligations on suppliers and partner organisations
    • keep an eye open for changes to business processes that affect data
  • Document (even just some short notes) the steps to rebuild the web site somewhere else, and to transfer all the data and business processes to the new site.
    • include configuration details and information about third-party services required
    • think about what else will need to be done if the web site is unavailable (does it matter, if so what exactly is important?)
  • Provide the web site's users with information about how to protect themselves and their data.
    • point them to relevant help such as from GetSafeOnline, CardWatch and Think U Know
    • provide easy methods for them to contact the organisation if they think there is a security or privacy problem
  • Monitor web site usage behaviour (e.g. click-through rate, session duration, shopping cart abandonment rate, conversion rate), performance (e.g. uptime, response times) and reputation (e.g. malware, phishing, suspicious applications, malicious links) to gather trend data and identify unusual activity.
    • web server logs are a start, but customised logging is better
    • use reputable online tools (some of which are free) to help.
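
As a starting point for the monitoring item above, even a few lines of script over the web server logs can flag unusual activity. This sketch assumes the common combined log format, where the client IP address is the first space-separated field:

```python
from collections import Counter

def flag_busy_clients(log_lines, threshold=100):
    """Count requests per client IP (first field of a combined-format
    access log line) and return any IPs exceeding the threshold."""
    counts = Counter(line.split(" ", 1)[0]
                     for line in log_lines if line.strip())
    return {ip: n for ip, n in counts.items() if n > threshold}
```

A sensible threshold depends entirely on the site's normal traffic profile, so trend data should be gathered first.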

That's just the basics. So, what would be next for an SME? If the web site is a significant sales/engagement channel, if the organisation has multiple web sites, operates in a more regulated sector or one targeted particularly by criminals (e.g. gaming, betting and financial services), takes payments or does other electronic commerce, allows users to add their own content, or processes data for someone else, the above is just the start. Those SMEs probably need to be more proactive.

This helps to protect the SME's business information, but also helps to protect the web site users and their information. After all, the users are existing and potential customers, clients and citizens.

Oh, the best response I've had when explaining my work: "You're an anti-hacker then?". Well, I suppose so, but it's not quite how I'd describe it.

Any comments or suggestions?

Posted on: 02 July 2010 at 08:18 hrs


05 May 2010

Favourite Colour for Password Recovery?

The weaknesses of personal knowledge questions in password recovery schemes have been discussed previously, but a recent survey indicates you may also need to take into account the user's sex.

The survey may not have scientific rigour, but mentioning it does give me the chance to include this photograph of Alyson Shotz' sculpture Helix:

Close-up photograph of the sculpture Helix by Alyson Shotz, an aluminium and steel superstructure laminated with dichroic film, located on Euston Road, outside Sir John Soane's Holy Trinity Church opposite Great Portland Street in London during summer 2009, showing reflections of people, vehicles and the surrounding buildings in the multi-coloured facets

The survey shows that colour names are given differently by "girls and guys". I'm not sure too many girls will use "dusty teal", "blush pink" or "dusty lavender" in their "favourite colour" answer, but the listed colour names are certainly worth checking when verifying or demonstrating the strength of these personal knowledge question systems. Oh, as the survey points out, also include mis-spellings (e.g. of "fuchsia") in the testing lists of colour names. But "drop table statements" shouldn't normally be in your fuzzing list—those are for testing SQL injection.
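
To see just how small the answer space is, a candidate list of colour names (the list below is an illustrative subset of mine, including common misspellings) can be replayed against a "favourite colour" answer:

```python
# Illustrative candidate list: common colour names plus frequent
# misspellings (e.g. "fuschia" and "fushia" for "fuchsia").
COLOUR_GUESSES = [
    "blue", "red", "green", "purple", "pink", "black", "orange",
    "yellow", "teal", "turquoise", "fuchsia", "fuschia", "fushia",
    "dusty teal", "blush pink", "dusty lavender",
]

def guessable_in(answer: str, guesses=COLOUR_GUESSES):
    """Return the 1-based number of attempts needed to guess the
    recovery answer, or None if it is not on the candidate list."""
    a = answer.strip().lower()
    for i, guess in enumerate(guesses, start=1):
        if a == guess:
            return i
    return None

# A favourite-colour answer drawn from 16 candidates carries at most
# log2(16) = 4 bits of entropy: far weaker than any password policy.
```

Without rate limiting on the recovery mechanism, an attacker can simply walk such a list.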

So beware of weak password reset and recovery schemes. Sometimes they are less secure than the equivalent registration and authentication (logging in) processes.

Posted on: 05 May 2010 at 18:32 hrs


22 December 2009

Should The Whole Web Site Be SSL?

Britain has some snowy and cold weather at the moment causing difficulty for people getting to shops or going on holiday, and web retailers are likely to be doing brisk trade if they can still deliver before Christmas.

Photograph of the snow-covered landscape around Gatehouse, Northumberland yesterday 21 December 2009

E-commerce sites are often associated with HTTPS (a combination of HTTP with the SSL/TLS cryptographic protocol). There was a time when HTTPS was used only where absolutely necessary due to the additional encryption/decryption overhead it placed on a user's browser (client) and the web application (server). But what's the situation today?

N.B. The padlock symbol or green/blue coloured address bar (depending upon the type of certificate in use) indicating a "secure" web server does not mean your data is safe. It shows that the server's identity has been verified to a certain extent, and that data in transit between your web browser and the web server is probably safe from interception, provided the server is configured correctly, the certificate has not expired or been revoked, and the content you see really is on the site the address bar says it is. It also says nothing about how the organisation and its partners handle the data once it has been received by the web server—they might forward it by email, allow third parties to have access to the server, print the data and leave it unprotected, etc.

Almost all web sites have some aspects that should only be accessible over HTTPS. Any sort of data entry form is likely to include personal information and therefore HTTPS should be used to at least protect the confidentiality of the information in transit. User registration, authentication (log in) and any pages that contain confidential information would also be included. Previously, many search engines did not index HTTPS addresses, but since its use was mainly restricted to content protected by some type of authentication and authorisation, this was never much of a concern.

But nowadays, search engines are indexing HTTPS content and a few web sites are only available using HTTPS. Is this a configuration worth following? In a discussion Ivan Ristić described the additional benefits of HTTPS (HTTP over SSL):

... Even with web sites that do not contain sensitive content (no need for confidentiality), you'd still want SSL to provide authentication (are you seeing the correct web site?) and integrity (has anyone modified content in transit?)... Can you have too much SSL? I don't think so.


So, while there are benefits relating to authenticity and integrity in addition to confidentiality, and dangers in mixing HTTP and HTTPS on the same site due to badly designed authorisation and session management systems, what other issues are there?

Search engines

The most popular search engine robots no longer discriminate between HTTP and HTTPS content, so this is no longer a concern. I am not aware of any adverse effect on search engine optimisation (SEO), other than the effects of changing from HTTP to HTTPS or vice versa, which would have to be managed carefully with appropriate permanent redirects set up (also called 301 redirects, after the HTTP response status code 301 for "moved permanently").

Note that Google, and apparently Yahoo and Microsoft, support the "rel='canonical'" link element and state it can be used for indicating a preference for HTTP vs HTTPS, or vice versa, when pages are available by both. There is also a setting for this choice in Google webmaster tools if you are a site owner. But be careful with allowing both HTTP and HTTPS access to the same page, since this quite often is implemented in a way that adds vulnerabilities to user authentication and session management.

Resources on the server

The server is affected in two ways—the increased number of requests (see also resources on the client, below) and the overhead of building SSL connections and encrypting/decrypting content. Intermediate proxies should not cache the content, and therefore a greater number of requests is to be expected. The additional resources required to serve content using HTTPS are discussed extensively here and in a research paper; in short, there will be a performance hit, but whether this is a problem depends on your traffic profile, architecture, server utilisation and site design.

Server side processes

It is possible that any server-side indexing or reporting systems may not support HTTPS and they may need to be updated or configured to work with the different protocol. If you syndicate data to other systems via XML, RSS or web services, these processes will also need to be checked for compatibility.

Traffic management

Network devices that inspect and route internet traffic must be SSL-aware to be able to read and analyse the content. Most modern devices will be able to support this mode of operation.

Client device support

Some devices (e.g. mobile) may not support HTTPS, or HTTPS may not be allowed through firewalls but this is probably less of an issue now. Check if these are issues with your expected users and the devices your site supports.

Address familiarity

Most people will not recognise (or type in) HTTPS addresses and use the common shorthand of the host name (e.g. www.clerkendweller.com) or an alias (e.g. clerkendweller.com) rather than the protocol followed by the full host name. So this would require a redirect from the HTTP address to the HTTPS one, and for many web sites this will be acceptable. For sites of a more sensitive nature, this would have to be handled carefully to protect any session identifiers and still leaves the user potentially vulnerable to a man in the middle (MITM) attack. These are where the redirect is amended and the user taken to a malicious web site instead. If you can rely on users using only the SSL address, perhaps by bookmarking it, you are on safer territory.
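
If you do redirect from HTTP to HTTPS, it is worth verifying the redirect behaves as intended. This illustrative helper (the example URLs in the test are hypothetical) checks that a redirect response is a permanent redirect to the HTTPS equivalent of the same host and path, rather than somewhere else:

```python
from urllib.parse import urlsplit

def is_safe_https_redirect(request_url: str, status: int, location: str) -> bool:
    """Check that a redirect from an HTTP URL is a permanent (301/308)
    redirect to the HTTPS version of the same host and path."""
    if status not in (301, 308):
        return False
    req, dest = urlsplit(request_url), urlsplit(location)
    return (dest.scheme == "https"
            and dest.hostname == req.hostname
            and dest.path == req.path)
```

This only checks the server's own behaviour; it cannot, of course, protect a user whose traffic is being amended by a man in the middle before the redirect is ever seen.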

Resources on the client

Again, there will be a performance hit on the user's client device (e.g. the browser on a desktop computer). Much of the time this will not be a problem unless the device already lacks resources (e.g. a mobile device). In addition, due to the lack of intermediate caching, more requests will have to be made directly to the server, adding lag to download and build content.

Mixed content

Even if all the content from your own site is sent using HTTPS, you may have embedded content such as:

  • client-side web analytics
  • advertisements
  • news feeds
  • widgets
  • images, videos, scripts and other content hosted elsewhere.

These must all be provided using HTTPS too, otherwise the benefit of being HTTPS-only will be lost and users may see "mixed content" warnings. This can be a problem, as much third-party content is not available using HTTPS (SSL), notably including Google AdSense, Amazon Affiliates and YouTube. However, Google Analytics does support SSL.
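
A rough way to find candidates for mixed-content warnings is to scan a page's HTML for http:// references in src and href attributes. Note this is a sketch, not a definitive test: href matches may include ordinary hyperlinks, which are harmless, and a proper HTML parser would be more robust than a regular expression:

```python
import re

# Match http:// URLs appearing in src or href attributes; on an
# HTTPS page, loaded resources referenced this way would trigger a
# mixed-content warning in the browser.
INSECURE_REF = re.compile(
    r'''(?:src|href)\s*=\s*["'](http://[^"']+)''', re.IGNORECASE)

def insecure_references(html: str) -> list[str]:
    """Return all http:// URLs referenced from src/href attributes."""
    return INSECURE_REF.findall(html)
```

Running this over every template before release catches most accidental http:// embeds.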

Conclusions and further reading

An all-HTTPS web site provides additional security benefits, but user acceptance and server constraints need to be considered in the site's design and architecture decision making processes. The partial, or full, use of HTTPS (SSL) in a web site needs to be considered carefully during design and development to ensure weaknesses that could be exploited are not built in, and then verified by thorough testing. If you have a heavily consumer-focused web site or include third-party content, some of the choices may have to be on the side of ease of use rather than with the lowest security risk.

"Whole site SSL" should be a serious consideration for "green field" web sites, especially where user authentication is required for any part of the content and for sites where phishing is a major risk (e.g. gaming, web mail, banking). User knowledge and acceptance may be difficult until we see the likes of major banks or large consumer-orientated sites (Google Mail, Google Docs, Twitter, Facebook) use this configuration and and display a warning/educational message to people who go to the non HTTPS site, rather than a redirect.

Posted on: 22 December 2009 at 09:12 hrs


23 October 2009

Clocks go back this weekend

This weekend the clocks change as we revert from British Summer Time (BST) to Greenwich Mean Time (GMT) at 02:00 BST on Sunday 25 October 2009 and the clocks go back, giving an extra hour.

What does this mean for web site security? Does running 01:00 to 02:00 twice matter? Well, some brave web application owners will be disabling their systems, like this online bank:

Partial screen capture of a web page notification message saying 'Important information regarding Internet Banking - Please note that the Internet Banking service will be temporarily unavailable due to essential maintenance from 12am until 3am on Sunday 25th October 2009. We apologise for any inconvenience this may cause.'

And I don't think it's just being done as a finale to the current Energy Saving Week. Most people, quite rightly, won't be taking this rather severe step. Another millennium bug, anyone? The date/time should be treated rather like other untrusted user input. Most problems will probably fall into the "business logic" category, such as:

  • Failure of time-based logic where dates are being compared.
  • Assumptions of uniqueness in time-stamped output (e.g. by a single-threaded process).
  • Running tasks again leading to possible:
    • loss of data due to overwriting
    • duplication of exports or emails
    • creation of inaccuracies in management information.
  • Chronological ordering anomalies leading to other faults.

It's not just banks and other financial organisations that may have difficulties.

Partial screen capture of a web page notification message saying 'Whats New ... Website Downtine - The website will be unavailable on Sunday 25 October 2009 and for a short period of time on the evenings of Friday 6 November 2009 and Sunday 8 November 2009 for essential maintenance. Please accept our apologies for any inconvenience.'

The time change may expose some other vulnerabilities that only exist at changeover time and/or during the next overlap hour.

  • Circumvention of brute force attacks on user authentication mechanisms.
  • Increased risk due to extension of a session's validity where local time is recorded.
  • Failure in data validation routines for time-related comparisons.
  • Incubated vulnerabilities where a time-related aspect causes the attack to be possible.
  • Denial of service due to extension of account lock-out.
  • Using time as a loop counter.
  • Additional errors caused by any of the above leading to information leakage.

Recording the offset of local time to GMT/UTC and synchronisation should certainly be done, but may not resolve the time overlap issues. The effects on long-running "saga" requests might be especially difficult to determine. Time dependencies need to be specified and considered through the development lifecycle. Perhaps the bank is right after all?

Partial screen capture of a web page notification message saying 'Alcohol & Tobacco Warehousing Declarations (ATWD) ... Saturday 24 October 23:30 – Sunday 25 October 03:30 ... Due to essential maintenance customers will experience a delay in receiving their online acknowledgement to submissions made using our HMRC and commercial software between 23:30 on Saturday 24 October and 03:30 on Sunday 25 October. Your acknowledgement will be sent once the service is restored. Please do not attempt to resubmit your submission. We apologise for any inconvenience this may cause. '
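
To see why recording UTC (or local time plus its offset) matters, Python's standard zoneinfo module can represent both occurrences of 01:30 on the changeover night; they are a real hour apart even though the wall clock reads the same:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

LONDON = ZoneInfo("Europe/London")

# 01:30 local time on 25 October 2009 happens twice: first in BST
# (fold=0), then again in GMT (fold=1) after the clocks go back.
first = datetime(2009, 10, 25, 1, 30, tzinfo=LONDON)           # BST, UTC+1
second = datetime(2009, 10, 25, 1, 30, fold=1, tzinfo=LONDON)  # GMT, UTC+0

# The two readings differ by a full hour in UTC, so any logic that
# stores bare local time cannot tell them apart.
gap = second.astimezone(timezone.utc) - first.astimezone(timezone.utc)
```

Any timestamp comparison, session expiry or loop counter built on bare local time conflates these two instants.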

Posted on: 23 October 2009 at 08:41 hrs


09 October 2009

Website Password Requirements

Sometimes you read someone else's blog post and think "I wish I had written that".

Well, Jeremiah Grossman has blogged All about Website Password Policies which neatly sums up password policies for typical web sites and web applications.

While the process seems straightforward, organization should never take choosing passwords lightly as it will significantly affect the user experience, customer support volume, and the level of security/fraud.

The suggested policy is not for your online bank, but the sort of web sites most developers have to work on day-to-day. He also has a good brief explanation of how to store passwords in a hash digest form (but also read the associated comments).

I do think, however, that there is never a reason to store passwords in plain text—password recovery mechanisms can be built that do not need to send the actual password. I also agree with the comment that password truncation should be done with care, and that allowing users longer pass phrases (containing space characters) can be beneficial. Let users decide what they are happy with, above the minimum standards.
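
As a sketch of the hash-digest approach (the iteration count is illustrative and should be tuned; the explanation in Grossman's post and its comments covers the trade-offs), a salted, iterated hash with a constant-time comparison can be built from Python's standard library:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Return (salt, digest) for storage; never store the password itself."""
    if salt is None:
        salt = os.urandom(16)  # unique per-user salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, stored, iterations=200_000):
    """Recompute the digest and compare in constant time."""
    _, digest = hash_password(password, salt=salt, iterations=iterations)
    return hmac.compare_digest(digest, stored)
```

The salt and digest are what get stored against the account; a recovery mechanism then issues a reset token rather than ever sending the password back.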

My own password related posts are Is Password Complexity Too Complex to Implement? and Guessable Usernames and Passwords.

Posted on: 09 October 2009 at 10:15 hrs


29 September 2009

IP Address Restrictions and Exceptions

It's common for access to some web sites to be restricted to users from particular Internet Protocol (IP) addresses. This is usually in addition to some other identification and authentication method. But other IP addresses are often added to this "allow list" and these should not necessarily be trusted in the same way.

Photograph of a sign with an exclamation mark on a yellow triangle that reads 'Caution - Traffic management Trial - DO NOT MOVE' on a construction site boundary's wire barrier

In a typical scenario, a web site hosted on the internet that is used to administer another web application might be restricted to the company's own IP addresses. Then the developers say they need to check something on the live site, or another server needs to index the content, or someone wants to work from home for a while, or the site needs to be demonstrated at a client's location. All these additional IP addresses are added to the "allow list". The restrictions may be applied at a network firewall, a traffic management system, the web server, the application itself, intrusion detection systems or log analysis software, or in several of these at once. Such lists are difficult to manage, and in time there will be many IP addresses that no-one can explain unless each entry is carefully documented and subject to a fixed time limit, after which it is either confirmed again by an appropriate person or removed. At least these extra addresses are quite often hard for someone else to guess.
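
One way to keep such lists manageable is to record, for every entry, who approved it, why, and when it expires, and then review the list on a schedule. A minimal sketch (the field names are mine):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AllowEntry:
    ip: str            # the permitted source address
    reason: str        # e.g. "developer home access"
    approved_by: str   # the appropriate person who confirmed it
    expires: date      # every exception gets a fixed time limit

def review_allow_list(entries, today):
    """Split the allow list into still-valid entries and ones due
    for re-confirmation or removal."""
    current = [e for e in entries if e.expires >= today]
    expired = [e for e in entries if e.expires < today]
    return current, expired
```

Running the review periodically, and removing anything in the expired set that no-one re-approves, stops the list accumulating unexplained addresses.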

However, there is another area where IP addresses are added to "allow lists", and this is for remote monitoring and testing services. These might be checking uptime, response times, content changes, HTML validation or security testing. The service providers publish the IP addresses of the source systems so that companies can specifically allow access to their web sites. Since the number of these services is relatively small, it's not too difficult to find which one might give access to areas of a web site or web application that the public (and malicious people) should not be able to get to. The particular danger here is that the IP addresses might be excluded from monitoring and logging, and therefore even a diligent web site manager might not realise for example the uptime monitoring service is making unusual, or excessive, requests.

Although it is unlikely a malicious person is using this "trusted" address unless routing has been compromised as well, problems from what might seem to be a legitimate source can go undetected. The IP address may have been typed incorrectly or, worse, the restrictions/exceptions may not have been implemented correctly, allowing more addresses than intended to have the privileged access. And remember: being excluded from session logging is itself a form of privileged access.

Allow traffic through, but be very specific what is allowed and monitor what's going on. Review all the exceptions periodically. Be especially careful about anything that bypasses authentication (such as allowing a search engine to crawl restricted-access content) on an otherwise public site.

Posted on: 29 September 2009 at 10:18 hrs


18 August 2009

Phishing Protection or Phishing Enabler?

A friend forwarded me a wine offer from his bank. But the email from his bank included additional information meant to verify the authenticity of the sender.

The bank included the last part of his postcode in the body of the message, as well as his title and family name. If those were meant to be for verification purposes, why did the bank suggest he forward the email to someone else?

Partial screen capture of an email message showing the text inviting the recipient to forward the message to a friend with some words obscured 'Hello ************** Recommend ************** to a friend by forwarding this email to them and you could both be celebrating with 12 bottles of wine each.  Ask them to click on one of the links below and follow the instructions - simple! Find out more here.  So if your mates would be happy to receive this offer, forward this email and make a friend **************
************** P.S. We hope you enjoy your wine, but please drink it responsibly.'

Yes, the forwarded email has all the verification details embedded in it. No, it's not the username and password, but it's everything someone would need to construct a phishing email that would be very hard to distinguish from a real one.

Partial screen capture showing the start of the forwarded email which includes the verification postcode and account holder's name - if you're not sure what the postcode is, there's even a link to explain what the number and letters are

This has the feel of using a security measure as a marketing gimmick. Mixing marketing emails and those necessary for servicing a bank account is a difficult balance, but this is way off the mark.

Posted on: 18 August 2009 at 08:13 hrs


07 July 2009

Password Masking Update

Last week I highlighted Jakob Nielsen's advice on Password Masking, which I believe to be misguided.

Nielsen's Alertbox column generated a wide-ranging discussion—in particular there are some excellent points of view on the Web Security Mailing List, and a comment on this blog from Robert Campbell. I was surprised that "security guru" Bruce Schneier seemed to agree with Nielsen, and this has been widely reported, such as in Usability and Security Gurus Agree That Masked Passwords Should Go.

Bruce Schneier now admits he was "probably" wrong in The Pros and Cons of Password Masking. Good, let's hope that becomes as widely reported.

Posted on: 07 July 2009 at 12:34 hrs



Please read our terms of use and obtain professional advice before undertaking any actions based on the opinions, suggestions and generic guidance presented here. Your organisation's situation will be unique and all practices and controls need to be assessed with consideration of your own business context.

Terms of use https://www.clerkendweller.uk/page/terms
Privacy statement https://www.clerkendweller.uk/page/privacy
© 2008-2015 clerkendweller.uk