Front-End / JavaScript Developer. Mozillian. Passionate advocate of the Open Web. Fond of Japan...

Old Shinjuku pictures


Shinjuku 新宿 is my favorite area in Tokyo. I’ve been taking pictures of Shinjuku since 2004. During these 13 years I’ve watched this neighborhood transform, and I feel a kind of nostalgia when looking at my old Shinjuku pictures.

Shinjuku has been a nerve center of activity in Japan since the beginning of the Edo era, when a “juku” 宿, a place to stop and rest, was established on the side of Koshu Kaido, one of the five main commerce routes of Edo Japan. Nowadays, Koshu Kaido is the avenue at the south exit of Shinjuku station, a station used by more than 3 million people every day.

Shinjuku was totally destroyed in the war and rebuilt from scratch afterwards. These are pictures of old Shinjuku, from the 50s to the 70s. I love that the general structure of the station and the surrounding streets hasn’t changed much.


This is Shinjuku in the 70s. On the left side you can see a building with the Sakuraya さくらや sign; it was an electronics and home appliances shop.


I took this picture in 2004; it shows the same Sakuraya さくらや building. Sakuraya is no longer there; now it is a Bic Camera.


Shinjuku west exit area in the 70s.


Shinjuku west exit area now.


Naito Shinjuku 内藤新宿 (the second dot from the right on the map) was the first stop after Nihonbashi on the Koshu Kaido route during the Edo period. Naito was the name of an important daimyo who lived where Shinjuku Gyoen park is now located.


Facebook: Still Literally the Worst.

jwz
Facebook Hosts Trump Inauguration Party With Fake News-Pushing Website The Daily Caller:

Facebook may now profess to combat fake news, but that didn't keep the company from hosting a Trump inauguration soirée in DC along with the Daily Caller, a far-right news website whose frequent misinformation and lies keep fact-checking website Snopes rather busy.

I know! Who cares about being in cahoots with a publication that celebrated Obama commuting the sentence of Chelsea Manning this week with the headline "Genetic Male Bradley Manning Pulls Off The Klinger Gambit," right? Not me!

Not to be left out of the action, oil company BP was also a host. It was, according to the publication, "A veritable who's who of beltway bigwigs" at the Hay-Adams Hotel in DC, all talking Trump.

Mark Zuckerberg sues over 100 Hawaiians to force them to sell him their ancestral land:

In 2015, Mark Zuckerberg (who insists that privacy is dead) bought 100 acres of land around his vacation home in Hawaii to ensure that no one could get close enough to spy on him.

The Zuckerberg estate on Kauai North Shore engulfs several smaller pieces of land deeded in the 1800s -- kuleana lands that were granted to native Hawaiians. The owners of this land are entitled to easements through Zuckerberg's property, so they can reach their own.

Zuckerberg has filed "quiet title" lawsuits to force the owners of more than 100 of these parcels to sell to him. His lawyer says it's the easiest way to figure out who has title to these family lands so he can make them an offer. Hey, when I want to find out who someone is, I always sue 'em.

Zuckerberg has a staff of twelve just to delete abusive comments on his Facebook page:

Typically, a handful of Facebook employees manage communications just for him, helping write his posts and speeches, while an additional dozen or so delete harassing comments and spam on his page, say two people familiar with the matter. Facebook also has professional photographers snap Zuckerberg, say, taking a run in Beijing or reading to his daughter.

In an internal Facebook post, Zuckerberg defends his company's ongoing association with Peter Thiel -- Facebook investor/board member and major donor to white-supremacist / pro-rape candidate Donald Trump.

This is your irregularly-scheduled reminder:

If you work for Facebook, quit. It is morally indefensible for you to use your skills to make that company more powerful. By working there, you are making the world an objectively worse place. I'm sure you can find a job working for a company that you don't have to apologize for all the time.

You can do it. I believe in you.

Previously, previously, previously, previously, previously.

1 public comment
smadin (Boston), 2649 days ago:
If you work for Facebook, quit.

Caching best practices & max-age gotchas


Getting caching right yields huge performance benefits, saves bandwidth, and reduces server costs, but many sites half-arse their caching, creating race conditions resulting in interdependent resources getting out of sync.

The vast majority of best-practice caching falls into one of two patterns:

Pattern 1: Immutable content + long max-age

Cache-Control: max-age=31536000
  • The content at this URL never changes, therefore…
  • The browser/CDN can cache this resource for a year without a problem
  • Cached content younger than max-age seconds can be used without consulting the server

Page: Hey, I need "/script-v1.js", "/styles-v1.css" and "/cats-v1.jpg" 10:24

Cache: I got nuthin', how about you Server? 10:24

Server: Sure, here you go. Btw Cache: these are fine to use for like, a year. 10:25

Cache: Thanks! 10:25

Page: Cheers! 10:25

The next day

Page: Hey, I need "/script-v2.js", "/styles-v2.css" and "/cats-v1.jpg" 08:14

Cache: I've got the cats one, here you go. I don't have the others. Server? 08:14

Server: Sure, here's the new CSS & JS. Btw Cache: these are also fine to use for a year. 08:15

Cache: Brilliant! 08:15

Page: Thanks! 08:15

Later

Cache: Hm, I haven't used "/script-v1.js" & "/styles-v1.css" for a while. I'll just delete them. 12:32

In this pattern, you never change content at a particular URL, you change the URL:

<script src="/script-f93bca2c.js"></script>
<link rel="stylesheet" href="/styles-a837cb1e.css">
<img src="/cats-0e9a2ef4.jpg" alt="…">

Each URL contains something that changes along with its content. It could be a version number, the modified date, or a hash of the content - which is what I do on this blog.

Most server-side frameworks come with tools to make this easy (I use Django's ManifestStaticFilesStorage), and there are smaller Node.js libraries that achieve the same thing, such as gulp-rev.
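If you're not using a framework, content-hash revving only takes a few lines. Here's a minimal Node.js sketch (the 8-character MD5 prefix and the filenames are illustrative choices, not what any particular tool does):

const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

// Copy a file to a name containing a short hash of its contents,
// e.g. styles.css -> styles-a837cb1e.css
function revFile(file) {
  const contents = fs.readFileSync(file);
  const hash = crypto.createHash('md5')
    .update(contents).digest('hex').slice(0, 8);
  const { dir, name, ext } = path.parse(file);
  const revvedName = path.join(dir, `${name}-${hash}${ext}`);
  fs.writeFileSync(revvedName, contents);
  return revvedName;
}

console.log(revFile('styles.css'));

Your build would then rewrite references in the HTML to point at the revved names.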

However, this pattern doesn't work for things like articles & blog posts. Their URLs cannot be versioned and their content must be able to change. Seriously, given the basic spelling and grammar mistakes I make, I need to be able to update content quickly and frequently.

Pattern 2: Mutable content, always server-revalidated

Cache-Control: no-cache
  • The content at this URL may change, therefore…
  • Any locally cached version isn't trusted without the server's say-so

Page: Hey, I need the content for "/about/" and "/sw.js" 11:32

Cache: I can't help you. Server? 11:32

Server: Yep, I've got those, here you go. Cache: you can hold onto these, but don't use them without asking. 11:33

Cache: Sure thing! 11:33

Page: Thanks! 11:33

The next day

Page: Hey, I need content for "/about/" and "/sw.js" again 09:46

Cache: One moment. Server: am I good to use my copies of these? My copy of "/about/" was last updated on Monday, and "/sw.js" was last updated yesterday. 09:46

Server: "/sw.js" hasn't changed since then… 09:47

Cache: Cool. Page: here's "/sw.js". 09:47

Server: …but "/about/" has, here's the new version. Cache: like before, you can hold onto this, but don't use it without asking. 09:47

Cache: Got it! 09:47

Page: Excellent! 09:47

Note: no-cache doesn't mean "don't cache", it means it must check (or "revalidate" as it calls it) with the server before using the cached resource. no-store tells the browser not to cache it at all. Also must-revalidate doesn't mean "must revalidate", it means the local resource can be used if it's younger than the provided max-age, otherwise it must revalidate. Yeah. I know.
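For reference, the three directives side by side:

Cache-Control: no-store
  • don't store the response at all
Cache-Control: no-cache
  • store it, but revalidate with the server before every use
Cache-Control: must-revalidate, max-age=600
  • use it freely while it's younger than 10 minutes, otherwise revalidate first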

In this pattern you can add an ETag (a version ID of your choosing) or Last-Modified date header to the response. Next time the client fetches the resource, it echoes the value for the content it already has via If-None-Match and If-Modified-Since respectively, allowing the server to say "Just use what you've already got, it's up to date", or as it spells it, "HTTP 304".
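As a sketch, a revalidation round trip might look like this (the ETag value is illustrative):

First response:

HTTP/1.1 200 OK
Cache-Control: no-cache
ETag: "v123"

Next time the page needs the resource, the cache asks:

GET /about/ HTTP/1.1
If-None-Match: "v123"

…and if the content hasn't changed, the server replies with an empty-bodied:

HTTP/1.1 304 Not Modified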

If sending ETag/Last-Modified isn't possible, the server always sends the full content.

This pattern always involves a network fetch, so it isn't as good as pattern 1 which can bypass the network entirely.

It's not uncommon to be put off by the infrastructure needed for pattern 1, but be similarly put off by the network request pattern 2 requires, and instead go for something in the middle: a smallish max-age and mutable content. This is a baaaad compromise.

max-age on mutable content is often the wrong choice

…and unfortunately it isn't uncommon; for instance, it happens on GitHub Pages.

Imagine:

  • /article/
  • /styles.css
  • /script.js

…all served with:

Cache-Control: must-revalidate, max-age=600
  • Content at the URLs changes
  • If the browser has a cached version less than 10 minutes old, use it without consulting the server
  • Otherwise make a network fetch, using If-Modified-Since or If-None-Match if available

Page: Hey, I need "/article/", "/script.js" and "/styles.css" 10:21

Cache: Nothing here, Server? 10:21

Server: No problem, here they are. Btw Cache: these are fine to use for the next 10 minutes. 10:22

Cache: Gotcha! 10:22

Page: Thanks! 10:22

6 minutes later

Page: Hey, I need "/article/", "/script.js" and "/styles.css" again 10:28

Cache: Omg, I'm really sorry, I lost the "/styles.css", but I've got the others, here you go. Server: can you give me "/styles.css"? 10:28

Server: Sure thing, it's actually changed since you last asked for it. It's also fine to use for the next 10 minutes. 10:29

Cache: No problem. 10:29

Page: Thanks! Wait! Everything's broken!! What's going on? 10:29

This pattern can appear to work in testing, but break stuff in the real world, and it's really difficult to track down. In the example above, the server actually had updated HTML, CSS & JS, but the page ended up with the old HTML & JS from the cache, and the updated CSS from the server. The version mismatch broke things.

Often, when we make significant changes to HTML, we're likely to also change the CSS to reflect the new structure, and update the JS to cater for changes to the style and content. These resources are interdependent, but the caching headers can't express that. Users may end up with the new version of one/two of the resources, but the old version of the other(s).

max-age is relative to the response time, so if all the above resources are requested as part of the same navigation they'll be set to expire at roughly the same time, but there's still the small possibility of a race there. If you have some pages that don't include the JS, or include different CSS, your expiry dates can get out of sync. And worse, the browser drops things from the cache all the time, and it doesn't know that the HTML, CSS, & JS are interdependent, so it'll happily drop one but not the others. Multiply all this together and it becomes not-unlikely that you can end up with mismatched versions of these resources.

For the user, this can result in broken layout and/or functionality, from subtle glitches to entirely unusable content.

Thankfully, there's an escape hatch for the user…

A refresh sometimes fixes it

If the page is loaded as part of a refresh, browsers will always revalidate with the server, ignoring max-age. So if the user is experiencing something broken because of max-age, hitting refresh should fix everything. Of course, forcing the user to do this reduces trust, as it gives the perception that your site is temperamental.

A service worker can extend the life of these bugs

Say you have the following service worker:

const version = '2';

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(`static-${version}`)
      .then(cache => cache.addAll([
        '/styles.css',
        '/script.js'
      ]))
  );
});

self.addEventListener('activate', event => {
  // …delete old caches…
});

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request)
      .then(response => response || fetch(event.request))
  );
});

This service worker…

  • Caches the script and styles up front
  • Serves from the cache if there's a match, otherwise goes to the network

If we change our CSS/JS we bump the version to make the service worker byte-different, which triggers an update. However, since addAll fetches through the HTTP cache (like almost all fetches do), we could run into the max-age race condition and cache incompatible versions of the CSS & JS.

Once they're cached, that's it, we'll be serving incompatible CSS & JS until we next update the service worker - and that's assuming we don't run into another race condition in the next update.

You could bypass the cache in the service worker:

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(`static-${version}`)
      .then(cache => cache.addAll([
        new Request('/styles.css', { cache: 'no-cache' }),
        new Request('/script.js', { cache: 'no-cache' })
      ]))
  );
});

Unfortunately the cache options aren't yet supported in Chrome/Opera and only recently landed in Firefox Nightly, but you can sort-of do it yourself:

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(`static-${version}`)
      .then(cache => Promise.all(
        [
          '/styles.css',
          '/script.js'
        ].map(url => {
          // cache-bust using a random query string
          return fetch(`${url}?${Math.random()}`).then(response => {
            // fail on 404, 500 etc
            if (!response.ok) throw Error('Not ok');
            return cache.put(url, response);
          })
        })
      ))
  );
});

In the above I'm cache-busting with a random number, but you could go one step further and use a build-step to add a hash of the content (similar to what sw-precache does). This is kinda reimplementing pattern 1 in JavaScript, but only for the benefit of service worker users rather than all browsers & your CDN.
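For instance, if your build step wrote out a map of URL to content hash, the install handler could bust with that instead of Math.random(). A rough sketch, where the hashes object stands in for hypothetical build output:

const version = '2';

// hypothetical build output: URL -> hash of the current content
const hashes = {
  '/styles.css': 'a837cb1e',
  '/script.js': 'f93bca2c'
};

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(`static-${version}`)
      .then(cache => Promise.all(
        Object.keys(hashes).map(url => {
          // bust with the content hash: unchanged content can still be
          // served from the HTTP cache, changed content can't
          return fetch(`${url}?${hashes[url]}`).then(response => {
            if (!response.ok) throw Error('Not ok');
            return cache.put(url, response);
          });
        })
      ))
  );
});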

The service worker & the HTTP cache play well together, don't make them fight!

As you can see, you can hack around poor caching in your service worker, but you're way better off fixing the root of the problem. Getting your caching right makes things easier in service worker land, but also benefits browsers that don't support service worker (Safari, IE/Edge), and lets you get the most out of your CDN.

Correct caching headers mean you can massively streamline service worker updates too:

const version = '23';

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(`static-${version}`)
      .then(cache => cache.addAll([
        '/',
        '/script-f93bca2c.js',
        '/styles-a837cb1e.css',
        '/cats-0e9a2ef4.jpg'
      ]))
  );
});

Here I'd cache the root page using pattern 2 (server revalidation), and the rest of the resources using pattern 1 (immutable content). Each service worker update will trigger a request for the root page, but the rest of the resources will only be downloaded if their URL has changed. This is great because it saves bandwidth and improves performance whether you're updating from the previous version, or 10 versions ago.

This is a huge advantage over native, where even a minor change means downloading the whole binary, or involves complex binary diffing. Here we can update a large web app with a relatively small download.

Service workers work best as an enhancement rather than a workaround, so instead of fighting the cache, work with it!

Used carefully, max-age & mutable content can be beneficial

max-age on mutable content is often the wrong choice, but not always. For instance this page has a max-age of three minutes. Race conditions aren't an issue here because this page doesn't have any dependencies that follow the same caching pattern (my CSS, JS & image URLs follow pattern 1 - immutable content), and nothing dependent on this page follows the same pattern.
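For this page, that presumably means the HTML is served with something along the lines of:

Cache-Control: max-age=180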

This pattern means, if I'm lucky enough to write a popular article, my CDN (Cloudflare) can take the heat off my server, as long as I can live with it taking up to three minutes for article updates to be seen by users, which I am.

This pattern shouldn't be used lightly. If I added a new section to one article and linked to it in another article, I've created a dependency that could race. The user could click the link and be taken to a copy of the article without the referenced section. If I wanted to avoid this, I'd update the first article, flush Cloudflare's cached copy using their UI, wait three minutes, then add the link to it in the other article. Yeah… you have to be pretty careful with this pattern.

Used correctly, caching is a massive performance enhancement and bandwidth saver. Favour immutable content for any URL that can easily change, otherwise play it safe with server revalidation. Only mix max-age and mutable content if you're feeling brave, and you're sure your content has no dependencies or dependents that could get out of sync.


Reddit’s Warrant Canary Just Died


Cory Doctorow, writing at BoingBoing:

In early 2015, Reddit published a transparency report that contained a heading for National Security Requests, noting, “As of January 29, 2015, reddit has never received a National Security Letter, an order under the Foreign Intelligence Surveillance Act, or any other classified request for user information.”

Five hours ago, Reddit published its 2015 edition, which contains no mention of classified requests for user information.

“Warrant canaries” are a response to the practice by governments of serving warrants on service providers that include gag orders forbidding the service from disclosing the warrant’s existence.

The idea of warrant canaries is ingenious — but when one works, it’s both terrifying and sad.

1 public comment
aaronwe (Denver), 2944 days ago:
Props to Reddit for putting the canary out there.
stefanetal, 2944 days ago (in reply):
I'm wondering if the Feds can prevent the people who put out transparency reports from knowing...can the Feds make a small part of the firm legal department directly implement the National Security letter via the appropriate IT system, without telling anybody else? Easily imaginable, especially if canaries become an issue.

Hebocon – The Crappy Robots Competition


I was not aware of the existence of Hebocon, but just today I found the video shown below. Hebocon is a robot competition in which, unlike the RoboCup, the point is not to have the most sophisticated robot: what matters is originality and, above all, the crappiness of the robots. In fact, “hebo” ヘボ in the name “Hebocon” means “crappy”.

The rules are so simple that even kids younger than 10 years old participate.



Optimising SVG images


SVG is a vector image format based on XML. It has great advantages, most notably that it is lightweight. Since SVG is a text format, it can be viewed and modified using a simple text editor, and applying GZIP compression produces excellent results.

It’s critical for a website to provide assets that are as lightweight as possible, especially on mobile where bandwidth can be very limited. You want to optimise your SVG files to have your app load and display as quickly as possible.

This article will show how to use dedicated tools to optimise SVG images. You will also learn how the markup works so you can go the extra mile to produce the lightest possible images.

Introducing svgo

Optimising SVG is very similar to minifying CSS or other text-based formats such as JavaScript or HTML. It is mainly about removing useless whitespace and redundant characters.

The tool I recommend to reduce the size of SVG images is svgo. It is written for Node.js. To install it:

$ npm install -g svgo

In its basic form, you’ll use a command line like this:

$ svgo --input img/graph.svg --output img/optimised-graph.svg

Please make sure to specify an --output parameter if you want to keep the original image. Otherwise svgo will replace it with the optimised version.

svgo will apply several changes to the original file—stripping out useless comments, tags, and attributes, reducing the precision of numbers in path definitions, or sorting attributes for better GZIP compression.
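As a hypothetical example (not svgo's exact output), a hand-written file like:

<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <!-- a comment the renderer doesn't need -->
  <path fill="#ff0000" d="M 10.000000,10.000000 L 90.000000,10.000000 L 90.000000,90.000000 Z"/>
</svg>

might come out as:

<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100"><path fill="red" d="M10 10H90V90z"/></svg>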

This works with no surprises for simple images. However, in more complex cases, the image manipulation can result in a garbled file.

svgo plugins

svgo is very modular thanks to a plugin-based architecture.

When optimising complex images, I’ve noticed that the main issues are caused by two svgo plugins:

  • convertPathData
  • mergePaths

Deactivating these will ensure you get a correct result in most cases:

$ svgo --disable=convertPathData --disable=mergePaths -i img/a.svg

convertPathData will convert the path data using relative and shorthand notations. Unfortunately, some environments won’t fully recognise this syntax and you’ll get something like:

Screenshot of Gnome Image Viewer displaying an original SVG image (left) and a version optimised via svgo (right)




Note that the optimised image will display correctly in all browsers, so you may still want to use this plugin if you only target browsers.

The other plugin that can cause you trouble—mergePaths—will merge together shapes of the same style to reduce the number of <path> tags in the source. However, this might create issues if two paths overlap.

Merge paths issue

In the image on the right, note the rendering differences around the character’s neck and hand, as well as in the Twitter logo. The outline view shows the 3 overlapping paths that make up the character’s head.

My suggestion is to first try svgo with all plugins activated, then if anything is wrong, deactivate the two mentioned above.

If the result is still very different from your original image, then you’ll have to deactivate the plugins one by one to detect the one which causes the issue. Here is a list of svgo plugins.

Optimising even further

svgo is a great tool, but in some specific cases, you’ll want to compress your SVG images even further. To do so, you have to dig into the file format and do some manual optimisations.

In these cases, my favourite tool is Inkscape: it is free, open source and available on most platforms.

If you want to use the mergePaths plugin of svgo, you must combine overlapping paths yourself. Here’s how to do it:

Open your image in Inkscape and identify the paths with the same style (fill and stroke). Select them all (hold Shift for multiple selection). Click on the Path menu and select Union. You’re done—all three paths have been merged into a single one.

Merge paths technique

The 3 different paths that create the character’s head are merged, as shown by the outline view on the right.



Repeat this operation for all paths of the same style that are overlapping and then you’re ready to use svgo again, keeping the mergePaths plugin.

There are all sorts of different optimisations you can apply manually:

  • Convert strokes to paths so they can be merged with paths of similar style.
  • Cut paths manually to avoid using clip-path.
  • Exclude an underneath path from an overlapping path and merge it with a similar path to avoid layer issues. (In the image above, see the character’s hair—the side hair path is under his head, but the top hair is above it—so you can’t merge the 3 hair paths as is.)

Final considerations

These manual optimisations can take a lot of time for meagre results, so think twice before starting!

A good rule of thumb when optimising SVG images is to make sure the final file has only one path per style (same fill and stroke) and uses no <g> tags to group paths into objects.

In Firefox OS, we use an icon font, gaia-icons, generated from SVG glyphs. I noticed that optimising them resulted in a significantly lighter font file, with no visual differences.

Whether you use SVG to embed images in an app or to create a font file, always remember to optimise. It will make your users happier!
