This time I will present some mini research on how smoking induces filter color changes. The experiment was as follows: I photographed the cigarette filter with a camera after each inhalation.
Later these filter pictures were processed with a C++ program to determine the average normalized opacity of the filter and how it changes with the inhalation number. You can download this C++ program, which analyzes this effect, from here. The zip includes the C++ project itself and PGM pictures of the cigarette filter shot after each inhalation. The C++ project is simple - it just has the following parts:

- pgmImage class, which has loading and saving methods for the PGM image type.

- cigaretteFilterAnalyzer class, which calculates the normalized average filter opacity in each image.

- cigaretteFilterEffects.cpp, the program itself, which outputs the experiment data on stdout.
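The opacity computation itself is easy to sketch. Here is a small Python sketch (not the original C++ code; `average_opacity` is an illustrative name), assuming opacity is defined as mean darkness relative to the maximum pixel value:

```python
def average_opacity(pixels, maxval=255):
    """Normalized average opacity of a grayscale image:
    0.0 = fully white (clean) filter, 1.0 = fully black."""
    # darkness of a pixel = maxval - pixel value; average it and normalize
    return sum(maxval - p for p in pixels) / (maxval * len(pixels))

# A pristine white filter region vs. a partly darkened one:
print(average_opacity([255, 255, 255]))  # -> 0.0
print(average_opacity([0, 64, 128]))
```

With PGM data, `pixels` would be the flattened pixel values and `maxval` the maxval field from the file header.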

The basic effect, as one would of course expect, is that with each inhalation the filter gets darker and darker. I compiled the 13 pictures into one to show this effect:

The number below each filter image represents the inhalation number.
In addition to the visual appeal of this effect, I made a plot from the experiment data processed with the C++ program. Here you can see how the filter opacity changes with the inhalation number:

In this graph I added a linear fit of the data. The linear fit has an R^2 of 0.95, so it seems that a linear model describes the opacity changes pretty well.
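For reference, the least-squares line and its R^2 can be computed in a few lines of Python (the data points below are made up for illustration; the real measurements come from the C++ program's output):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = slope*x + intercept, plus R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

# Illustrative inhalation-number vs. opacity pairs (not the real data):
inhalations = [1, 2, 3, 4, 5]
opacity = [0.11, 0.19, 0.32, 0.41, 0.52]
slope, intercept, r2 = linear_fit(inhalations, opacity)
print("slope %.3f intercept %.3f R^2 %.3f" % (slope, intercept, r2))
```

Once fitted, the line can also be inverted: an opacity reading maps back to an inhalation estimate as (opacity - intercept) / slope.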
What is interesting is that, given the relative filter opacity, you can deduce how many inhalations were taken. I bet there are more interesting ideas which expand this research even further - for example, this effect should depend on filter quality. So some research could be done on how filter quality affects the opacity effect, which could be interesting to cigarette manufacturers. But that is just a guess. This effect should also depend on the filter structure, size, cigarette type, materials used, inhalation duration, etc. For example, the cigarettes used for this experiment had a menthol capsule, so an additional effect arises which relates to how smoke propagates through the menthol capsule.
Also, below is an interesting picture of how the cigarette filter looks before any smoking:

The filter picture was converted to gray-scale, and histogram equalization was performed on the image to amplify the color differences between pixels. Some color changes are too small for the eye to see, but after histogram equalization it is easy to see the smaller differences between pixels. After performing this we get a nice picture of the filter porosity :-)
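Histogram equalization is a standard operation; a minimal grayscale version can be sketched in Python like this (a generic textbook implementation, not the exact tool used for these pictures):

```python
def equalize(pixels, maxval=255):
    """Classic histogram equalization for a flat list of grayscale pixels:
    remap each level through the normalized cumulative histogram, which
    stretches small intensity differences apart."""
    n = len(pixels)
    hist = [0] * (maxval + 1)
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:  # constant image: nothing to stretch
        return list(pixels)
    return [round((cdf[p] - cdf_min) * maxval / (n - cdf_min)) for p in pixels]

# Four nearly identical grays get spread over the full range:
print(equalize([100, 101, 102, 103]))  # -> [0, 85, 170, 255]
```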
As I've said, these cigarettes had a menthol capsule. So below is also one image, taken after 12 inhalations, converted to gray-scale and with histogram equalization performed:

Some lines were added to indicate the menthol capsule. It can be clearly seen that the capsule is pinched in the middle - that is the place where it was crushed with the fingers before smoking. Otherwise you will not get the menthol taste :-) At first I didn't understand why such a shape appears in almost all pictures. I thought we get a concentric darker zone in the middle of the filter just because the smoke somehow propagates better through the center of the filter. But for this to be true there would have to be some randomness between the filter pictures, because you can't guarantee that in each inhalation the smoke will propagate in the same way. On the contrary, this shape was too clear and too similar across all the images. Only a very deterministic situation can induce the same shape in every picture. The best explanation was the menthol capsule. Maybe there are more explanations - I don't know :-)
This is it. People who smoke can do a similar experiment themselves. But I bet it is better not to smoke at all, because it is not just the opacity of the filter that changes - there are more serious effects on health too. This opacity effect indirectly shows that as well, because the reason the opacity grows with each inhalation is that the filter becomes more polluted with each inhalation.
So you better stop smoking :-)
Regards,
Agnius

# Coding experiments

## Wednesday, July 30, 2014

## Wednesday, January 1, 2014

### ASP.NET web site for generating test data

Have you ever wanted an ASP.NET site (and its source code) with functionality like that of generatedata.com (which aims at generating test data) ?
If so - just check my newest creation at codeplex. The license is MIT, so you can use it as you wish.
Have fun generating test data !
Best wishes,
Agnius

## Sunday, September 22, 2013

### Building logic gates from dominos

Do you feel bored ? Try to build a computer from dominos :-). It's at least theoretically possible, but of course practically unfeasible. What is practically feasible is to build some primitive logic circuits from dominos. It is challenging and fun, because a domino system is highly unstable, so it is hard to construct 100% deterministic logic functions from it. We will try to build an AND logic gate here. According to Wikipedia, it's easy to build OR and XOR gates like this:

So an AND gate can be composed from these two gate types, as the wiki states, by the formula:

A AND B = (A OR B) XOR (A XOR B)

But here we have two problems:

- We need many domino pieces, because according to the formula we need 2 XOR gates and 1 OR gate. Besides, the XOR gate is relatively big.
- Also, as the wiki states, the XOR gate is extremely dependent on timing. Because we have 2 XOR gates here, the dependence on timing increases too.
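The identity itself can be sanity-checked with a quick truth-table sweep in Python before laying out any dominos:

```python
# Verify the gate identity A AND B = (A OR B) XOR (A XOR B)
# over all four input combinations; XOR is written as inequality.
for A in (False, True):
    for B in (False, True):
        composed = (A or B) != (A != B)
        assert composed == (A and B), (A, B)
print("identity holds for all inputs")
```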

Well, my scheme has its own instability, which depends on the distance between dominos. But in my opinion a distance can be adjusted more easily than the timing between two events without any external devices ;-) So this is how my domino AND logic gate performs in a live environment:

And here is an example schema of how a NOT gate could be built:

Have fun with domino computing ! :-)

## Wednesday, September 26, 2012

### DNA sequence visualization

While reading this very interesting article about the history of the human genome, I stumbled upon the fact that we have a portion of our DNA that is 3 billion years old !! That's a pretty damn big number of years for a DNA sequence to survive through trillions of our ancestors. That got my attention, and I wanted to see how this 3-billion-year-old sequence looks in visualized form. But after a short Google search I didn't find a simple DNA sequence visualizer. So I decided to code a simple DNA sequence visualizer for the browser with the help of Javascript. So here it is:

(by default this 3 billion years old DNA sequence shared among all living creatures is set, but you can paste your own sequence as well - hit the button to see it).
So better check whether your friend has this 3 billion year old sequence - otherwise you may be talking with a Cylon =)
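The color-coding idea behind the visualizer can be sketched in a few lines (Python here rather than the original Javascript, and the palette is an illustrative assumption, not the one the tool actually uses):

```python
# Hypothetical palette: one display color per nucleobase.
BASE_COLORS = {"A": "green", "C": "blue", "G": "black", "T": "red"}

def sequence_to_colors(seq):
    """Turn a DNA string into the list of colors to draw, one per base."""
    seq = seq.upper()
    bad = set(seq) - set(BASE_COLORS)
    if bad:
        raise ValueError("not a DNA sequence, unexpected symbols: %s" % sorted(bad))
    return [BASE_COLORS[b] for b in seq]

print(sequence_to_colors("gattaca"))
```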

Each nucleobase in a different color

Labels:
DNA sequence visualization

## Thursday, October 6, 2011

### iPhone game "Pong Hau K'i" source code

Have you ever dreamed of writing an iPhone board game and wondered where to start ? Or maybe you wanted some simplistic iPhone game source code to look at and learn from ? Now you have a good opportunity to do just that. I've decided to publish the source code & assets of my iPhone board game Pong Hau K'i. Use it for any purpose you wish - be it personal, educational or commercial use ...

HTH !


Labels:
game development,
iPhone games,
Lang_Objective-C

## Friday, May 27, 2011

### Ellipse detection in image by using Hough transform

How can we detect ellipses in images ? One way is to use the Hough transform. I will use a variant of the Hough transform algorithm created by Yonghong Xie and Qiang Ji. The algorithm's pseudo-code:

1. Store all edge pixels in a one-dimensional array.
2. Clear the accumulator array.
3. For each pixel (x1, y1), carry out the following steps from (4) to (14).
4. For each other pixel (x2, y2), if the distance between (x1, y1) and (x2, y2) is greater than the required least distance for a pair of pixels to be considered, then carry out the following steps from (5) to (14).
5. From the pair of pixels (x1, y1) and (x2, y2), calculate the center, orientation and length of the major axis of the assumed ellipse.
6. For each third pixel (x, y), if the distance between (x, y) and (x0, y0) is greater than the required least distance for a pair of pixels to be considered, then carry out the following steps from (7) to (9).
7. Calculate the length of the minor axis.
8. Increment the accumulator for this length of minor axis by 1.
9. Loop until all pixels are computed for this pair of pixels.
10. Find the maximum element in the accumulator array. The related length is the possible length of the minor axis of the assumed ellipse. If the vote is greater than the required least number for the assumed ellipse, one ellipse is detected.
11. Output the ellipse parameters.
12. Remove the pixels on the detected ellipse from the edge pixel array.
13. Clear the accumulator array.
14. Loop until all pairs of pixels are computed.

Proof-of-concept algorithm implementation in Python:

```python
import sys
from PIL import Image, ImageFilter, ImageDraw
from math import *

# some global constants
EL_COVERAGE_RATIO = 0.9
EL_VERIFICATION_DISTANCE = 1.
EL_PATH_POINTS = 51
MIN_MINOR_FREQUENCY = 30
MIN_HALF_MAJOR = 8
MIN_HALF_MINOR = 6

def distance(p1, p2):
    x1, y1 = p1
    x2, y2 = p2
    return sqrt((x1 - x2)**2 + (y1 - y2)**2)

def nonnegative(v):
    return v if v >= 0 else 0

def parametricEllipse(center, a, b, angle):
    xc, yc = center
    elx = lambda t: xc + a * cos(t) * cos(angle) - b * sin(t) * sin(angle)
    ely = lambda t: yc + a * cos(t) * sin(angle) + b * sin(t) * cos(angle)
    return [(int(elx(2. * pi * x / float(EL_PATH_POINTS - 1))),
             int(ely(2. * pi * x / float(EL_PATH_POINTS - 1))))
            for x in range(EL_PATH_POINTS)]

assert len(sys.argv) == 3, "missing input and/or output file !"
im = Image.open(sys.argv[1])
width, height = im.size
io = Image.new('RGB', (width, height), (255, 255, 255))
draw = ImageDraw.Draw(io)
# converting image to grayscale
im = im.convert('L')
# detecting edge pixels
im = im.filter(ImageFilter.FIND_EDGES)
# converting to binary image
im = im.convert('1')
pixels = im.load()
pxy = []
# extracting binary pixels coordinates
for x in range(width):
    for y in range(height):
        if pixels[x, y] == 255:
            pxy.append((x, y))

# applying Hough transform for ellipses detection.
# algorithm is taken from this paper:
# http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1.8792&rep=rep1&type=pdf
cIx = -1
colors = [(255, 0, 0), (0, 200, 0), (0, 0, 255)]
for x1, y1 in pxy:
    for x2, y2 in pxy:
        bbins = {}
        dist = distance((x1, y1), (x2, y2))
        if dist >= 2 * MIN_HALF_MAJOR:
            cent = ((x1 + x2) / 2., (y1 + y2) / 2.)
            a = dist / 2.  # semi-length of major axis
            alfa = atan2((y2 - y1), (x2 - x1))
            for rx, ry in pxy:
                d = distance((rx, ry), cent)
                if d >= MIN_HALF_MINOR:
                    f = distance((rx, ry), (x2, y2))
                    cost = (a**2. + d**2. - f**2.) / (0.00001 + 2. * a * d)
                    # semi-length of minor axis
                    b = sqrt(nonnegative((a**2. * d**2. * (1. - cost**2.)) /
                                         (0.00001 + a**2. - d**2. * cost**2.)))
                    b = int(b)
                    if b in bbins:
                        bbins[b] += 1
                    elif b > 0:
                        bbins[b] = 1
            if not bbins:  # no third pixel voted for this pair
                continue
            bbins_rev = dict((v, k) for k, v in bbins.items())
            max_freq = max(bbins_rev.keys())
            bmax = bbins_rev[max_freq]
            # Did we find a probable ellipse ?
            if max_freq >= MIN_MINOR_FREQUENCY and alfa >= 0.0 and bmax >= MIN_HALF_MINOR:
                elData = parametricEllipse(cent, a, bmax, alfa)
                supported = []
                supportRatio = 0.0
                # counting how many pixels lie on the ellipse path
                for i in range(EL_PATH_POINTS):
                    elx, ely = elData[i]
                    added = False
                    for x, y in pxy:
                        if distance((elx, ely), (x, y)) <= EL_VERIFICATION_DISTANCE:
                            supported.append((x, y))
                            if not added:
                                supportRatio += 1. / float(EL_PATH_POINTS)
                                added = True
                supported = list(set(supported))
                # if the number of pixels on the ellipse path is big enough
                if supportRatio >= EL_COVERAGE_RATIO:
                    cIx = (cIx + 1) % 3
                    print("coverage %.2f" % supportRatio, "frequency ", max_freq,
                          "center ", cent, "angle %.2f" % alfa,
                          "axes (%.2f,%.2f)" % (a, bmax))
                    # removing found ellipse pixels from further analysis
                    for p in supported:
                        pxy.remove(p)
                    # drawing found ellipse
                    for i in range(EL_PATH_POINTS - 1):
                        draw.line(elData[i] + elData[i + 1], fill=colors[cIx])
io.save(sys.argv[2])
print("***************************************************************")
print("************************** DONE *******************************")
print("***************************************************************")
```

(The prototype algorithm is slow - it was tested only on 50x50 images.) So, by running this algorithm on this image: we will get the following output:

Have fun in computer vision !

## Friday, May 6, 2011

### Gradient transfer function

Suppose we need to draw a linear gradient, but in a way that lets us control the color distribution between the gradient's parts. How can we do that ? The answer is: a gradient transfer function.

Algorithm is this:

1. Extract the pixel's relative distance [0..1] from the start of the gradient.

2. Update this distance by feeding it to the gradient transfer function.

3. Blend the source color with the target color, using the updated distance as the blend ratio.

4. Set the calculated color on the pixel.

We will use such a gradient transfer function:

f(x) = (1 - a)*x + a / (1 + e^(b - 2*b*x))

where x is the pixel's relative distance from the start and a, b are adjustable parameters.

Below is a Javascript implementation of this method (your browser must support the HTML5 canvas element). You can try changing the a, b parameters of the transfer function and see what happens to the gradient.

And here is the Javascript code which does that (the plot is generated with the FLOT library):

```javascript
function showValue(newValue, el) {
    document.getElementById(el).innerHTML = parseFloat(newValue).toFixed(2);
    GeneratePlot();
}

function Clamp(x, a, b) {
    return Math.min(Math.max(x, a), b);
}

function NonLinearTransfer(x, a, b) {
    return (1 - a) * x + a * Math.pow(1 + Math.exp(b - 2 * b * x), -1);
}

function GeneratePlot() {
    var data = [];
    var a = document.getElementById("rngNonlinearity").value;
    var b = document.getElementById("rngAbruptness").value;
    for (var i = 0; i <= 1; i += 0.01)
        data.push([i, Clamp(NonLinearTransfer(i, a, b), 0, 1)]);
    $.plot($("#placeholder"),
           [{ data: data, label: "Transfer function" }],
           { xaxes: [{ min: 0, max: 1 }],
             yaxes: [{ min: 0, max: 1 }],
             legend: { position: 'nw' } });
    GenerateGrad();
}

function Blend(k, x, y) {
    return (1 - k) * x + k * y;
}

function setPixel(imageData, x, y, r, g, b, a) {
    index = (x + y * imageData.width) * 4;
    imageData.data[index + 0] = r;
    imageData.data[index + 1] = g;
    imageData.data[index + 2] = b;
    imageData.data[index + 3] = a;
}

function GenerateGrad() {
    element = document.getElementById("canvasGrad");
    c = element.getContext("2d");
    width = parseInt(element.getAttribute("width"));
    height = parseInt(element.getAttribute("height"));
    imageData = c.createImageData(width, height);
    scolor = [0, 255, 0];
    tcolor = [0, 0, 255];
    c1 = document.getElementById("rngNonlinearity").value;
    c2 = document.getElementById("rngAbruptness").value;
    // draw gradient
    for (x = 0; x < width; x++) {
        k = x / width;
        k = NonLinearTransfer(k, c1, c2);
        r = Blend(k, scolor[0], tcolor[0]);
        g = Blend(k, scolor[1], tcolor[1]);
        b = Blend(k, scolor[2], tcolor[2]);
        for (y = 0; y < height; y++) {
            setPixel(imageData, x, y, r, g, b, 0xff);
        }
    }
    c.putImageData(imageData, 0, 0);
}
```

Have fun!

Labels:
gradient,
Lang_Javascript,
transfer function
