Sunday, 5 April 2020

Standards for N95 Respirators and Surgical Masks

I'm not a medical professional, but as a mother of a young baby and daughter of parents who are over 60 years old, I ended up doing so much reading up on masks/respirators that I thought I may as well collate my thoughts in a post like this, which might help others understand the standards for respirators and masks, and the difference between disposable dust masks, surgical masks, and N95 respirators. [Note: this post was many days in the making because I've been caring for a small and very wriggly baby, so do note that I started writing this post before PM Lee announced the 'circuit-breaker' in Singapore. I expect that guidance on masks may change over time...]

It is rather long, so here is a summary so you can skip to the specific sections quickly...

Pitta Masks and PM2.5 pollution masks do not filter for virus particles
The difference between Masks and Respirators
N95 Respirators Standards - US Standards (NIOSH) & EU Standards (CE)
Surgical Mask Standards (EN 14683+AC Medical face masks) vs Non-medical Filtering half masks Standards (EN149:2001+A1:2009)

Pitta Masks and PM2.5 pollution masks do not filter for virus particles

Recently I made the decision to withdraw baby from infantcare (having foreseen that there would be a long period where I would be asked to work from home - that prediction has now been confirmed as reality), and as a result my parents have been coming to my house to care for baby whilst I work from home. Since my parents were more exposed than me because of their transit from their house to ours, my mother expressed worry about taking Grab. So I decided to buy some masks, first and foremost for my 60+ year old parents who are in the most vulnerable category.

Two months ago, my very crafty and industrious mother tried to sew some reusable masks out of cotton - but I just was not convinced that this was safe. Even from my layman perspective, how could cloth filter the microscopic particles of a virus? So I went online and bought a packet of the first aesthetically pleasing mask I saw - the Japanese Pitta mask. But since then, I have read the fine print: Pitta masks cannot filter tiny virus particles. Online you can find people who have tested the Pitta masks and found that they "captured an astounding 0% of 0.3-micron particles".
To be fair, the Pitta mask manufacturer does not advertise it for virus protection. It is simply that online sellers are listing it unscrupulously, without mentioning that PM2.5 filtration is not enough.


The difference between Masks and Respirators

Having ruled out Pitta masks, I started trying to understand whether I should look for surgical masks or respirators.

In a 3M document I found online by their "3M Subject Matter Expert – Asia Pacific Region", they state the different purposes of masks vs respirators. It also says that "for any airborne particulate contamination such as an outbreak of PM2.5 / PM 10/ Severe Acute Respiratory Syndrome (SARS), Avian Flu, Ebola Virus etc. only respirators and not Masks, should be used to safeguard oneself from getting any kind of respiratory diseases."

Respirators = designed to protect you from the environment
Masks = designed to protect the environment from you

N95 Respirators Standards - US Standards (NIOSH) & EU Standards (CE)

WHAT DOES A LESS THAN 0.1 MICRON PARTICLE LOOK LIKE? Does a mask really filter a less than 0.1 micron particle? Sadly, figuring out which mask can really filter such a tiny particle is impossible for us to judge on our own; it comes down entirely to lab testing - so you could say it comes down to the various 'standards' that a batch of masks has to meet when tested in a lab. So this led me to find out: who tests the masks to make sure they are up to the standard they claim to be?

US Standard:
N95 Respirator [Filters at least 95% of airborne particles]
N99 Respirator [Filters at least 99% of airborne particles]
N100 Respirator [Filters at least 99.97% of airborne particles]

European Standard:
FFP1 Respirator [Filters at least 80% of airborne particles]
FFP2 Respirator [Filters at least 94% of airborne particles]
FFP3 Respirator [Filters at least 99% of airborne particles]
Check a CE cert by looking at the cert's issuing body. Unfortunately, there isn't a centralised database you can search. For more, read here at CE-Check Support. But I have found that some of the certifying bodies have sites where you can enter the cert number to verify it.

Besides those standards, there are the Chinese KN95 and Korean KF94, which are supposed to be equivalent in standard to N95. How do we verify them? Is there any way to verify them? I don't really know. ¯\_(ツ)_/¯

This is where it got murky... So I wanted to buy some genuine N95 respirators. I dialed in "N95" on Qoo10, Lazada, Shopee and ezbuy. Here's an example of the first respirator with dodgy papers, which I found on Qoo10 from the seller "Collectible Haven":

I took it one step further: I took the number they posted and checked it with the standards body for NIOSH. THE CERT NUMBER THEY PROVIDED WAS NOT FOR THEIR COMPANY OR THE PRODUCT! It did have the same product number, 1200F, but I do feel deceived.

There were several other similar examples online but I won't go through them. You can do the sleuthing yourself if you are concerned. But here is an example where I did buy the mask, and it checked out correctly.

Here's an original box of 3M 8210 masks, which is on the company's list of N95 masks, along with a diagram of how to tell if the mask is an actual N95 mask.

Source: How to check the NIOSH marking on the mask (ie: the N95 standard)

After discovering that several masks sold online came with fake "documentation", I decided to see what really constitutes a surgical mask. The advice has been that surgical masks are what you need to wear, but there were many cases reported in the media where people bought expensive 'surgical masks' which were very thin or suspicious. The Health Sciences Authority regulates the importation of surgical masks, and their website (currently down) had stated that there is a difference between standard face masks (paper or cloth) and surgical masks (a medical product, with bacterial filtration efficiency above 95%*).

Surgical Mask Standards (EN 14683+AC Medical face masks) vs Non-medical Filtering half masks (EN149:2001+A1:2009)

Some days back, I preemptively placed an order for what I thought were surgical masks in case we needed some.

These were the masks I ordered on Shopee, from a seller who posted a screenshot of the cert

These masks claimed a CE mark, which I went back to the original agency to cross-verify. Really strange to go to the effort of checking, but I did, and on the website (a weird marketing-oriented platform which helps foreign/Chinese companies bring products to European markets through CE and other certifications), if you dial in the code from the cert, the certification truly comes up as authentic.

But at the time I didn't check what device they were certifying. What is "EN149:2001+A1:2009"? If you search for the original definition of the standard, it is for Respiratory protective devices: Filtering half masks to protect against particles. In plain terms, that is a dust mask, not a surgical mask. For a surgical mask, the European standard is "EN 14683+AC" - Medical face masks.

Source: on Shopee

Again, this wasn't false advertising on this seller's part on Shopee. They did not say it was a surgical mask / medical product. I saw the picture of the mask and I personally assumed it was a surgical mask but in fact it is only a dust mask. A high quality, authentically tested and CE marked dust mask. Fair enough to them.

What is murkier is that there are actually many sellers who DO sell their product as a SURGICAL MASK with a CE mark, but the certs they posted show the product is just a normal disposable face mask, or some just posted nonsense. Here are 3 different examples (but there are countless more examples of this online):

Eg: Weilan777 on qoo10 - "surgical mask" with disposable face mask cert

Eg: OurFirstStore on lazada - "surgical mask" with cert for their machinery not the mask itself

Eg: Boslun on lazada - "surgical mask" with cert for different product

Did the sellers just assume that people would glaze over the certs and assume it was all good? You could say, "who cares about standards? just get the masks to the people quickly!" But if you are buying this to protect a loved one, you don't want to feel like you've been deceived into buying it, especially when the masks are being sold at increasingly cut-throat prices.

(There is also the question in my mind: is it ethical for me to be buying masks when there is a shortage in other countries? I acknowledge the privilege that I have to be in a comfortable financial position to purchase masks/respirators in Singapore where it is very readily sold on many consumer platforms - and also to be in a position to help others. What about people who can't afford masks, or who don't have a good home environment to spend the lockdown in? Since I am not in a position to volunteer, I looked into the organisations who are helping those who would be in need during covid-19. If you can, do consider giving to these groups:

AWARE - Vulnerable Women's Fund: The COVID-19 pandemic underscores the already stark inequalities experienced by women in areas such as unemployment, housing, caregiving and domestic violence. This March, AWARE received 619 calls - our highest-ever number of monthly calls - with many callers dealing with emotional and psychological distress, violence and abuse. We need your help to ensure that we can continue to provide our services during this period, to aid these women in crisis.

Migrant Workers' Assistance Fund: The assistance fund aims to provide emergency humanitarian assistance to distressed migrant workers. The assistance offered ranges from emergency shelter, daily essentials and basic sustenance needs to employment-related issues such as salary arrears. In the event where migrant workers tragically lose their lives, the MWAF may also provide their next-of-kin with assistance, monetary or in kind. The funds collected from previous fund-raising activities have benefitted many distressed workers and helped them return to their home countries. Donations allow them to continue to provide emergency humanitarian assistance and to reach out to more migrant workers, who contribute to our economy.)

Friday, 6 March 2020

Quick things to do in Blender: Video Editing, Bake Sound to F-curves, VR/3D Storyboarding, Compositing 3D model into photo, and Motion Tracking

I've used (and taught) Blender quite intensively for a couple of years now, but I haven't really mined all its possibilities yet, and even today, when I watch different staff and students work in it, I still pick up new things from time to time. My selection criterion for these features is that YES, you could conceivably do all of these even with just 5 minutes to spare, perched on the edge of the bed with your laptop and mouse, half-expecting baby to wake up at any moment...

Things you can do rather quickly in Blender:
Simple Video Editing
Bake Sound to F-curves
Simple Compositing of 3D model into photo
3D/360 Illustration draft with Grease Pencil
Motion Tracking to composite a 3D model into a Video

1. Edit a simple video with cuts, cropping, overlays, audio, etc in very little time

Earlier in March I attended a training, but because I'm a completer-finisher sort of person, I ended up doing the training material in two programs simultaneously just to see if Blender could do everything Premiere could do. It turns out that YES, you can do video editing in Blender, and it is even faster and simpler than in Premiere! The interface actually looks very similar to Premiere's, and if you go into the Video Editing view there is really no excuse for having any UI-related issues because the interface is just so easy now!

Features you might need - text overlay, image overlay, offset, cropping, transitions, fade in/fade out, panning audio, compositing, motion tracking - are all possible in Blender! I think I might use this for my next video edit.

2. Bake Sound to F-curves

F-curve refers to the interpolation curve between two keyframes of an animated property. Interpolation has modes (linear, constant, bezier) and easing equations (linear, quadratic, cubic, exponential, etc), and stuff like that. But the fun feature in Blender is the ability to bake a sound to an F-curve - to turn the sound wave into the F-curve - such that your animation pulses along with the audio.
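Under the hood, the idea is simple: the audio gets reduced to one amplitude value per animation frame, and those values become the keyframes. Here is a rough Python sketch of that reduction - not Blender's actual implementation, just the concept, with a synthetic test tone standing in for a real audio file:

```python
import math

def amplitude_envelope(samples, sample_rate=44100, fps=24):
    """Reduce raw audio samples to one RMS amplitude per animation frame.
    Conceptually this is what 'Bake Sound to F-Curves' produces: one
    keyframe value per frame, following the loudness of the audio."""
    per_frame = sample_rate // fps
    envelope = []
    for start in range(0, len(samples), per_frame):
        chunk = samples[start:start + per_frame]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        envelope.append(rms)
    return envelope

# A 1-second 440 Hz test tone with a linear fade-out:
rate = 44100
tone = [math.sin(2 * math.pi * 440 * n / rate) * (1 - n / rate) for n in range(rate)]
keys = amplitude_envelope(tone, rate, fps=24)
```

Each value in `keys` would drive one keyframe, so a scale or brightness bound to it pulses along with the loudness of the sound.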

3. Do a sketch/storyboard for VR/360 or 3D illustration with the grease pencil

Personally I don't use this quite enough, but the grease pencil is super handy for making rough sketches or even a storyboard before you do an illustration work. For example, I saw an interesting video in which someone used the grease pencil to good effect to do a storyboard for a 360 work here:

You create an empty Grease Pencil object (Shift-A) and then go into "Draw" mode. You can only draw on the flat image plane that you are facing, but after you draw, you can move, rotate, and scale the grease pencil drawing at will and move it all around the scene. Many possibilities!

4. Composite a 3D model into a Photo (the simple way)

Somehow this keeps getting easier and easier. You can set the world background image's texture coordinate vector to "Window", and when you look at the Render viewport your object is now in the world, with all the colour and light coming from the background image itself. Works if you only have 5 minutes before the baby wakes up and you want something super simple. :-D

5. Motion Tracking to composite a 3D model into a Video

I decided to sit down and spend a few minutes trying out camera tracking, which I've always known was a feature. Can you do it in a few minutes? Well, yes, in the sense that Blender can do most of the legwork for you with a camera solve, but you'll need to spend some quality time editing the tracks for best effect (especially to get the rotation correct). Above is an example of a terrible solve... but it kinda works!

Sunday, 23 February 2020

The Kappa's Izakaya: 360° Illustration Process

Recently I worked on a 360° illustration of an Izakaya in Daryl Qilin Yam's Kappa Quartet and I was asked if I could share a bit more about the process of doing such an illustration.

Artistic disclaimer: It just so happened that I watched a lot of Midnight Diner at the time when I was doing this illustration, so those spaces were definitely in my mind's eye. There was also the show Samurai Gourmet which was a bit tiresome to watch, but had a few good shots of a traditional izakaya too. Alas, although I have visited Tokyo several times before, at this point I haven't really been to a bar or izakaya in some years now...

From "Midnight Diner: Tokyo Stories"

From "Samurai Gourmet"

Something I realised from these portraits of izakayas is that when in doubt about how to fill the bar space, you can put stacks of tiny crockery or cover it up with a cupboard!

I even made a little crackleware... not that the detail is visible in the final render

Another disclaimer: where 3D modelling is concerned, I mainly design spaces and architectural/interior renders, and I'm not a character designer! This will probably be apparent to people who actually do proper animation / character design, because here I chose to render all the other people in the scene in this weird white low-poly form. Personally I thought it a good fix for my weakness, and it kinda worked for this 'horror' piece...

Initially I thought that I would actually try to do the entire drawing by hand, because I have enjoyed doing similar illustrations entirely by hand in the past - especially with lens distortion like this:

2 illustrations from the set of 4 that I was commissioned to do for the Commuting Reader

I usually work out a lot of the alignment for this kind of illustration by making a 3D model and rendering it with a fisheye or panoramic lens. After arranging white blocks in the space and rendering it out, I just use the lines in the render as perspective reference for my drawing.

Example: this plain equirectangular render with no materials...

And for all other details that you need to fill in by hand, you can rely on an equirectangular grid (here is a link to an equirectangular grid by David Swart that you can use as a template) and think of it as a 4 point perspective drawing as so:

Here's a 4 hour sketch I made using the grid for the fun of it in 2018...
(Back when I had a lot of free time huh)

The problem right now is that feeding and caring for Beano has made it extraordinarily difficult for me to use the tablet or Cintiq. If left to her own devices, she wants to pull on all the Type-C cables, gnaw on my stylus, and slap my Cintiq screen! Attempts to set up my workstation in the bedroom so I can use the Cintiq when she's asleep have failed on baby-safety grounds. In fact I've more or less resigned myself to the fact that spending time with the tablet is impossible now - WHO WANTS TO BUY A MEDIUM WACOM AND/OR A CINTIQ PRO IN EXCELLENT CONDITION??? - and I've had to streamline my time spent designing, thinking of the fastest way to get the visual output. Hours spent doing digital painting like in the old days? Not happening anymore. A Blender render is all I can muster now, which is great because whilst I feed and entertain Beano, I can easily set a render going so that I feel as if my illustration is partly doing itself whilst I'm busy...

I also use a renderfarm to speed things up a bit, and I usually do a smaller-resolution render to check that things are alright before doing the full size. At 50% of the resolution I wanted, it cost about 40-60 US cents (0.85 SGD) for each one. For the final render at 100% resolution and twice the samples, it cost about 4 USD (5.60 SGD).

I don't know how most people do the next step, but I usually go through a process of annotating my renders and then ticking them off in Monosnap as I do the edits:

Finally we end up with the base render, onto which I can add faces and other details in Photoshop. I do find that adding a bit of noise (0.5%-2%) also helps make it more 'painterly', because when the render is too sharp it becomes a bit disconcerting and unreal. I also drop the file periodically into this equirectangular viewer to see if the work is shaping up correctly - common issues include (1) things in the image that seemed further away suddenly appearing extremely close to the camera, or (2) items being blocked when you render the specific view - so some time needs to be spent finetuning the arrangement.
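For what it's worth, the grain step is nothing fancy: conceptually it is just a small random offset per pixel, clamped back into range. A toy Python sketch of the idea (the function and values here are illustrative, not Photoshop's actual noise filter):

```python
import random

def add_grain(pixels, amount=0.01, seed=1):
    """Offset each 0-255 pixel value by a small random amount (here 1% of
    full range by default), clamped back into range -- the same idea as
    adding 0.5%-2% noise to take the clinical edge off a too-sharp render."""
    rng = random.Random(seed)  # seeded for repeatability
    span = 255 * amount
    return [min(255, max(0, round(p + rng.uniform(-span, span)))) for p in pixels]

row = [0, 64, 128, 192, 255]
grainy = add_grain(row, amount=0.02)
```

At 2%, each pixel shifts by at most ~5 levels either way, which is enough to break up flat gradients without being visible as speckle.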

Render Breakdown

This was another work made possible by the Dingparents who came down to take care of Beano on the weekends so I could continue my artistic pursuits! I am grateful to have the time to continue to make my work.

Come see the final image at Textures: A Weekend of Words, at Sorta Scary Singapore Stories by Tusitala.

13 - 22 Mar
10am – 10pm
The Arts House

Textures: A Weekend of Words celebrates Singapore literature and its diverse community. No longer a solitary experience, reading becomes a shared adventure through performances, installations, and workshops that will take you on a trip through the literary worlds of local authors.

The third edition of the festival takes on the theme “These Storied Walls”. Inspired by The Arts House’s many identities as a Scotsman’s planned estate, our nation’s first parliament, and now Singapore’s literary arts centre, the walls of The Arts House have been etched with the stories of those who have walked these halls.

This year’s programming features more installations and participatory activities that invite you to go a step further — move a bit closer and look a little longer. As you discover undiscovered narratives of your own, join those who have come before and weave your story into the tapestry of The Arts House.

Textures is co-commissioned by The Arts House and #BuySingLit, and supported by National Arts Council

More about Sorta Scary Singapore Stories

Saturday, 22 February 2020

Paintpusher: Computer-aided Oil Painting (SUPER–TRAJECTORY: Life in Motion, ArtScience Galleries, 20 February to 8 March 2020)

Behold! This is a painting made by me and a little XY plotter which pushes the paint around. (I originally gave it a title with the word "sketches" in it, because I liked how it goes from a pencil sketch, to a Processing sketch, to this plotter's wonky sketch that pushes paint around on the canvas... but thinking about it now, I should rename the work to "Paintpusher", because it is not really painting; it is really just pushing the oil paint around on the canvas...)

Every once in a while I get gripped by a desire to teach myself how to paint hyperrealistically or photorealistically - just for the hell of it, and to be able to say I've mastered it...? - but then I realise it would take me years of muddling along in the good old-fashioned, humans-doing-oil-paintings-by-hand sort of way. Additionally, my own approach to understanding and making visual work has always been via the digital, so instead of mucking around helplessly in oils, I thought I would try a little "computer-aided oil painting"...

Doing 'precision' painting of any sort is messy and potentially very time-consuming, and now with a Bean to feed and care for (practically a 24hr job), carving out time to make art has been much more challenging (in addition to my teaching day job). Whilst spending long hours breastfeeding Beano, I had quite a lot of time to plot and scheme up things, but I only had rigidly fixed windows of time where I could personally execute the program (ie: when my parents were available to take care of Beano on the weekends). In theory, I thought that by devising a process for making the 'precise' paintings (and sticking to the process!) it would help me control the amount of time I was spending on "Debbiework"... although the prep work takes the longest in that case. This painting experiment would not have been possible without my parents coming down over a few weekends to help care for Beano whilst I made a big painterly mess.

The Mini Line-Drawing Machine

Line-Us Concept Image

Some time back there was a Kickstarter for a little drawing machine called the Line-Us, which their promotional material rather pointedly emphasised as being "NOT A TOY". Well then, what is it exactly? I guess it is a small USB-powered plotter into which you can insert a pen and have it trace out an SVG file (you can also muck around by hand in their app and see how it partially messes up your drawing). There was also a concept gif they released, imagining it doing watercolours.

Line-us plotting some random SVG that I made in Illustrator

The app that comes bundled with the Line-Us allows you to draw on your mobile screen to control it. It allows you to take a photo, put it in the background, and then trace over the image yourself - which ultimately produces something not dissimilar to what you might draw with your non-dominant hand.

I've got to say that drawing on my phone to control the Line-Us's pen doesn't really seem like the point of having a device like this. I mean, it makes for hilarious results from this NON-TOY, but it makes more sense as an SVG plotter, which I'm surprised isn't the function of its main app. Maybe they don't want to get your hopes up of it being able to plot perfect squares and perfect circles... BECAUSE IT DOESN'T. I used this script contributed by another user (set the IP and it will connect to the Line-Us when in red mode)

The joy of the plotter is really in its "shonky-ness". It gets more and more askew the further you go along both axes. It wobbles and trembles, and if your pen is tilted at an angle, the distortion from the tilt becomes more and more pronounced at the extremes of the drawing board. One of the prominent apps touted for this "NOT-A-TOY" is a game where it draws something (somewhat badly) and you have to guess what the Line-Us is trying to draw...

Painting Process

Initial Sketches

I started with some sketches of possible approaches. I had lofty dreams of doing a landscape painting at first, but in reality I don't have that much control (or rather, it feels like you're in a constant state of almost losing control of the pen), and I found that with this kind of work, less is more. The more you push paint around, the more it looks like an indistinct mushy grey. Like if you smeared your face over a palette.

Line-us Manual Control - painted too long until paint became muddy

This is the mess it makes when you "overpush" the paint (this output was discarded). Using manual control in the app meant that it was no different from me being an exceptionally incompetent painter. The process needed to be rigorously followed for this experiment to be meaningful, and I knew by this point that I wanted to make iterative paintings...

Processing Sketch

Referring to some of my pencil sketches, I wrote a Processing sketch to produce the drawing. I had more intentional and complex sketches at first, but as you can see, I ended up with something exceedingly basic: a super basic bezier. To be honest, anything more complicated just didn't make a good painting.

In Processing, you can use beginRecord() to echo the drawing process to an SVG or PDF file. It generates an SVG file consisting of the lines I drew with the code...

SVG Generated in Processing

And the SVG file is also readable by the plotter as a series of coordinate lines which, when joined up, make the drawing.
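To make that concrete, here is a small sketch of how a cubic bezier gets sampled down into the flat list of coordinates a plotter like this consumes. (My sketch was in Processing; this is a Python illustration with made-up control points, not my actual code.)

```python
def cubic_bezier_points(p0, p1, p2, p3, steps=20):
    """Sample a cubic bezier curve into a list of (x, y) coordinates --
    the point-to-point path a pen plotter actually executes."""
    points = []
    for i in range(steps + 1):
        t = i / steps
        u = 1 - t
        # Standard cubic bezier: B(t) = u³p0 + 3u²t·p1 + 3ut²·p2 + t³p3
        x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
        y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
        points.append((round(x, 2), round(y, 2)))
    return points

# Illustrative control points, roughly the arc of a single brushstroke:
path = cubic_bezier_points((0, 0), (50, 120), (150, -20), (200, 100))
```

The curve itself is smooth, but by the time it reaches the pen it is just this short list of straight segments - which is partly why the plotter's repeats land on exactly the same points every time.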

The plotter outputs look a bit wonky, but the wonkiness is consistent. If you made the Line-Us repeat the SVG, it would always trace over the same points, over and over again. So... it is very precisely inaccurate.

After testing out all the outputs, I prepared the canvas by using a palette knife to lay down a base colour that the plotter would paint over. I also experimented with using masking tape to mask out the area where I would be painting - I think the framing was crucial to the work looking as it does; without the framing, it just felt like a big messy paint blob; similarly without repetition one may not realise this is an iterative work or a work produced by a machine repeating an action over and over again.

After generating these tiny paintings, I decided to digitise and blow up a set of 4 of them. I was originally only going to blow up one, but the output was better than I expected, almost resembling the fronds of a palm, with an organic form.

Initially I was going to get a normal photographic paper print, but at the printers I happened to see how well the metallic prints brought out the colour, giving it more three-dimensionality. So... I decided to try doing my print on metallic, and I love it!

The work is currently at ArtScience Museum, Level 4, until 8 March 2020. There wasn't an opening event due to coronavirus cancellations. But come and see it when you can, and let me know what you think. And as for next steps, I think I will build a bigger XY plotter!...

20 February to 8 March | 10am–7pm
ArtScience Galleries, Level 4
Free admission

SUPER–TRAJECTORY: Life in Motion is a presentation of new media artworks from Taiwan and Singapore that reflects on the human experience in an era of instantaneity, transformation and conflict, where speed is the new scale.

Through a programme of installations and screenings, artists investigate the artistic and cultural consequences of new technologies, reflecting on what it means to be making art in an accelerating, media-influenced world.
The artists, in different ways, explore a digital world that generates itself and our longing for material qualities and tactile connections in our lives. We see Chih-Ming Fan, Ong Kian Peng and Syntfarm employ computational algorithms as interventions to the present moment as we are confronted with new realities; while Debbie Ding, Charles Lim and Weixin Quek Chong engage with the intimacy and agency of touch in an exploration of materiality and physicality in our relationships with technologies. In the works of Cecile Chagnaud, Mangkhut, Hsin-Jen Wang and Tsan-Cheng Wu, we encounter a delicate exchange with the artists’ worlds as they consider the notion of home and memory by mapping their personal experiences against the unprecedented impact of urbanisation.

Between today’s postdigital condition and the complex yet banal realities of contemporary life, this group of works poses the question: What are the humanistic values and principles in an increasingly formatted world?
SUPER–TRAJECTORY: Life in Motion at ArtScience Museum is a collaboration with INTERーMISSION (Urich Lau and Teow Yue Han), Tamtam ART Taiwan (Vicky Yun-Ting Hung, Wei-Ming Ho and Lois Wen-Chi Wang) and 臺南市美術館 Tainan Art Museum.

Exhibiting artists include Cecile Chagnaud, Debbie Ding, Chih-Ming Fan, Charles Lim, Mangkhut (Jeremy Sharma), Ong Kian Peng, Weixin Quek Chong, Syntfarm (Andreas Schlegel and Vladimir Todorovic), Hsin-Jen Wang and Tsan-Cheng Wu.

The first iteration of SUPER–TRAJECTORY, Media/Life Out of Balance (6 October 2019 to 3 March 2020), was presented at Tainan Art Museum, setting out this cross-regional platform for contemporary and experimental media art and exchange in discourses on technology in art.

More Info on Facebook