
Apache Configure CORS Headers for Whitelist Domains

In the current implementation of Cross-Origin Resource Sharing (CORS), the Access-Control-Allow-Origin header can only carry a single origin or a wildcard as its value. This is not optimal when you have multiple clients connecting to the same virtual server and simply want to allow a known list of client host domains.

Since only a single origin can be delivered back to the client in that header, Apache must read the incoming Origin header and match it against the whitelist of accepted domains. If a match is found, the origin is echoed back to the client as the value of Access-Control-Allow-Origin.

Use the following configuration snippet in the Apache virtual host ".conf" file or in the server ".htaccess" file. Ensure the mod_headers and mod_setenvif modules are enabled (the SetEnvIfNoCase directive is provided by mod_setenvif).

<IfModule mod_headers.c>
   SetEnvIfNoCase Origin "https?://(www\.)?(domain\.com|staging\.domain\.com)(:\d+)?$" ACAO=$0
   Header set Access-Control-Allow-Origin %{ACAO}e env=ACAO
</IfModule>

The regular expression https?://(www\.)?(domain\.com|staging\.domain\.com)(:\d+)?$ is matched against the value of the Origin header, which browsers send with cross-origin requests. The pattern matches both the http and https protocols, an optional www. subdomain, the host names of your whitelist entries and finally an optional port. Since the Origin header never contains a path, the pattern is anchored to the end of the value. This example will therefore enable:

* http://domain.com
* https://domain.com
* http://www.domain.com
* https://www.domain.com
* http://staging.domain.com
* https://staging.domain.com
* http://www.staging.domain.com
* https://www.staging.domain.com

If you send a request from http://staging.domain.com/app/, the response would include the header:

Access-Control-Allow-Origin: http://staging.domain.com

If you send another request from https://www.domain.com/client/, the response would include the header:

Access-Control-Allow-Origin: https://www.domain.com
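
As a quick sanity check, you can fire a cross-origin request from one of the whitelisted pages and watch whether the browser lets it through. The endpoint below (https://api.domain.com/data) is just a placeholder for the virtual host carrying the configuration above:

// Placeholder endpoint: replace with the virtual host that serves the CORS headers.
fetch("https://api.domain.com/data", { mode: "cors" })
  .then(function (response) {
    console.log("CORS request allowed, status: " + response.status);
  })
  .catch(function (error) {
    // The browser blocks the response if the Origin was not whitelisted.
    console.error("CORS request blocked or failed: " + error);
  });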

JavaScript Beacon API Explained

The Beacon API is a new, experimental API being introduced by browser vendors. It allows data to be sent to a server when a web page unloads, without blocking the main thread, in order to create a better user experience. Quoting the W3C Beacon specification:

"analytics and diagnostics code will typically make a synchronous XMLHttpRequest in an unload or beforeunload handler to submit the data. The synchronous XMLHttpRequest forces the User Agent to delay unloading the document, and makes the next navigation appear to be slower. There is nothing the next page can do to avoid this perception of poor page load performance."

This specification will prevent developers from having to use tricks and hacks to bend the rules of the browser, creating a better experience for everyone.
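
In practice the API is a single call, navigator.sendBeacon, which queues the data for delivery and returns immediately. A minimal sketch (the "/analytics" endpoint is just a placeholder):

// Minimal sketch: queue analytics data on unload without blocking navigation.
// "/analytics" is a placeholder for your collection endpoint.
window.addEventListener("unload", function () {
  var payload = JSON.stringify({ event: "page-unload", time: Date.now() });
  navigator.sendBeacon("/analytics", payload);
});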

Disable Telekom "Navigationshilfe" search pages

To disable the automatic "Navigationshilfe" search page that appears when you misspell a domain name, the feature must be deactivated on the Telekom website.

Visit the following link, https://kundencenter.telekom.de/kundencenter/kundendaten/navigationshilfe/ and log in with your Telekom account information.

Simply choose the "Navigationshilfe aus" (Navigationshilfe off) option and save using the "Speichern" (Save) button.

Telekom Navigationshilfe options

You must now restart the Telekom router for the changes to take effect. Either hold the restart button, or simply disconnect the power for a few seconds.

After the router restart, Navigationshilfe will no longer bother you.

Thanks for the programming challenge, Deutsche Telekom!

A guide on how to fix the Speedport W 723V "web 'n' walk" configuration error that occurs when trying to run the setup.

Recently the W 723V Speedport router that I rent from Deutsche Telekom automatically pulled a software update. This update put my router into an unresponsive state, meaning I had to reset the device and re-enter my login and configuration data through the internal configuration utility (a.k.a. https://speedport.ip/).

The Speedport provides a configuration page where I would normally enter my Deutsche Telekom details and the setup would run. After this update however, the setup tells me that my "Web 'n' Walk Stick" has not been properly configured.

I do not own a Web 'n' Walk stick, so I saw this as an obvious bug in the configuration web page. The page was submitting the wrong configuration form details, as both forms are in the same page, just shown/hidden via JavaScript.

When I called Deutsche Telekom they were clueless, had no help to give and promised to call back when an engineer was available. So I took the task upon myself to inspect the router software and see if I could fix the problem. The game was afoot, a programming challenge lay ahead. After around an hour absorbed in convoluted code, resisting the temptation to let Deutsche Telekom get away with this, I narrowed in on the problem; what follows is a description of the solution.

Google Analytics bug: Missing events with numeric labels

I have recently discovered a small bug in Google Analytics that prevents an event from being collected in the reporting if the event "label" is a number.

I encountered this issue while using my Google Analytics Wrapper to track custom events. The third argument, the label, was the funnel step the visitor was currently on: a plain integer (1, 2, 3...). After some waiting time I realised that Google Analytics was not accepting these events.

After testing I simply added a "step" prefix to the value, making it a string ("step-1") and the events immediately began recording as expected.
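
For reference, here is how the workaround looks with the standard analytics.js event call rather than my wrapper (the category and action names below are just placeholders):

// Placeholder category/action names, using the standard analytics.js syntax.
ga("send", "event", "Checkout", "Funnel", 3);        // numeric label: events were not collected
ga("send", "event", "Checkout", "Funnel", "step-3"); // string label: events recorded as expected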

This may be a bug, but it may in fact be deliberate: tracking individually identifiable users is against the terms of service, and Google may assume that a plain integer label is a user ID or similar.

If anyone else has run into this issue and has any further insight, it would be great to hear.

The most efficient CSS reset pattern

I've recently been investigating web page performance and focusing on CSS efficiency to see where performance gains can be made. After using the Chrome CSS profiler, I came to the conclusion that my "reset" CSS file was taking a long time to parse and match selectors, slowing down the browser paint process.

Reset CSS files are used to normalise default DOM element styles, so that your custom styles will work effortlessly across all browsers.

Naturally then, the reset CSS must target many elements in order to change their style. However, how can this be done in the most efficient manner?

The useful website CSSReset.com provides five popular reset methods for you to use.

For my experiment I implemented each of the provided snippets in turn into my web page. I then ran through a "script" of accessing certain parts of the website while the Chrome CSS profiler was running and measured the results. I performed the script multiple times with each snippet to ensure valid results.

In short, my findings are: normalize.css has the best efficiency, by quite a long way, with YUI coming in second place.

The reason normalize.css is so much more efficient is that, unlike the other scripts, it does not select all DOM elements in a single comma-separated declaration, but breaks the declarations up into smaller segments.

Because my project does not use many of these DOM elements, the browser can skip matching many of those declarations, making overall performance better.
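
To illustrate the structural difference (these selectors are simplified and not the exact contents of either stylesheet), a classic reset groups everything into one long comma-separated rule that must be tested against every element, while normalize.css declares smaller, targeted rules that can be skipped when the page does not use those elements:

/* Simplified illustration, not the exact contents of either stylesheet. */

/* Classic reset style: one long comma-separated selector. */
html, body, div, span, h1, h2, h3, p, a, img, ul, li, form, table {
  margin: 0;
  padding: 0;
  border: 0;
}

/* normalize.css style: smaller, targeted rules. */
b, strong {
  font-weight: bold;
}
sub, sup {
  font-size: 75%;
  line-height: 0;
}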

This is interesting as normalize.css is the least used code snippet on the website, at the current time of writing.

I'd be interested to follow up on this and hear any arguments for or against using a reset CSS in this manner.

Improve page performance with optimal CSS

There are certain CSS best practices that should be followed in order to increase CSS performance in the browser. These will help the browser to perform less DOM parsing and reduce the time of repaint events.

1. Browser reflow and repaint

If you are unfamiliar with the concepts of reflow and repaint: reflow is the browser recalculating the layout (size and position) of elements, while repaint is redrawing the affected pixels to the screen. Here are some articles that will get you up to speed:

2. Remove unused styles

Simply having styles that aren't used causes the browser to search the DOM for matches needlessly, which is wasted processor time. Additionally, the unused rules add page weight that is pointlessly sent over the wire to the client.

It may not seem like much, but removing dead CSS code reduces the work the browser has to do and additionally improves the project's stability and maintainability.
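
As a rough way to spot candidates for removal (a sketch only: it ignores pseudo-classes, media queries and styles used on other pages), you can walk the document's stylesheets in the console and log rules whose selectors match nothing on the current page:

// Rough sketch: list selectors that match nothing on the current page.
// Expect false positives (pseudo-classes, rules used on other pages, etc.).
Array.prototype.slice.call(document.styleSheets).forEach(function (sheet) {
  var rules;
  try {
    rules = sheet.cssRules || [];
  } catch (e) {
    return; // cross-origin stylesheets cannot be inspected
  }
  Array.prototype.slice.call(rules).forEach(function (rule) {
    var selector = rule.selectorText;
    if (!selector) { return; } // skip @media, @font-face, etc.
    try {
      if (document.querySelectorAll(selector).length === 0) {
        console.log("Possibly unused: " + selector);
      }
    } catch (e) {
      // vendor-prefixed selectors cannot be queried
    }
  });
});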

Check size of localStorage usage in Chrome

localStorage is a great tool for JavaScript developers to persist information across multiple sessions. Due to its simple API, broad browser support and higher reliability than cookies, it is easy to end up overusing localStorage.

Since localStorage is limited to roughly 5 MB per origin in most browsers, the following snippet will help inspect which localStorage keys are being used and how much space is allocated to each.

To give credit, I have taken the method here and added a total counter.

var total = 0;
for (var key in localStorage) {
  // Skip inherited properties such as "length" and the storage methods.
  if (!localStorage.hasOwnProperty(key)) {
    continue;
  }
  // Strings are stored as UTF-16, i.e. roughly 2 bytes per character.
  var amount = (localStorage[key].length * 2) / 1024 / 1024;
  total += amount;
  console.log(key + " = " + amount.toFixed(2) + " MB");
}
console.log("Total: " + total.toFixed(2) + " MB");

Simply run this in the Chrome console or create a bookmarklet from the following code (right-click the bookmarks bar > Add Page > paste the code as the "URL"):

javascript:var total = 0;for(var key in localStorage){if(!localStorage.hasOwnProperty(key)){continue;}var amount = (localStorage[key].length * 2) / 1024 / 1024;total += amount;console.log(key + " = " + amount.toFixed(2) + " MB");}console.log("Total: " + total.toFixed(2) + " MB");

Which will give a similar output:

Matches = 0.40 MB
Matches_timestamp = 0.00 MB
Teams = 0.19 MB
Teams_timestamp = 0.00 MB
Total: 0.59 MB
