For my project, I will render an animation of rain hitting a window. I will use ray tracing to track how light is distorted as it passes through the transparent droplets. The items in the background will be blurred and out of focus, as the camera will be focused on the raindrops in the foreground against the window. I will also ensure that no raindrops overlap, since in nature water droplets merge when they come into contact.
I started the project by rendering the floor, then experimented with creating the appearance of glass. For this I used a vector to store the color behind the glass, which I determined by recursively calling the trace function after the ray hits the glass. I then blend the glass color and the background color at a 10/90 ratio. For now I purposely cut the glass in half in order to compare what the background looks like with and without the glass. I have the tint working, but there are more adjustments I would like to make so it looks more like glass.
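The 10/90 blend is a simple linear interpolation between the glass tint and the recursively traced background color. A minimal sketch (the function and color representation here are hypothetical; the write-up only specifies the mix ratio):

```python
def blend(glass_color, background_color, glass_weight=0.10):
    """Linearly mix the glass tint with the color traced behind it.

    glass_weight=0.10 gives the 10/90 ratio: 10% glass tint,
    90% of whatever the recursive trace found behind the glass.
    """
    return tuple(glass_weight * g + (1.0 - glass_weight) * b
                 for g, b in zip(glass_color, background_color))

# e.g. a faint green tint over a white background
tinted = blend((0.2, 0.8, 0.3), (1.0, 1.0, 1.0))
```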
I eventually ditched the idea of rendering a window on screen. In some of my reference rain images, the glass itself is barely visible; the effect is more of an illusion. Since all the raindrops sit on the same z-plane and the camera is focused on the drops against a blurry background, actually rendering the glass did not seem necessary. Therefore, I focused my energy on transforming the glass ball from a previous lab into water droplets.
I also experimented with making my own background, but I realized it would look better to import an image and use that. I therefore use a function that samples the background color based on the direction of the ray, mapping the ray's direction to 2D coordinates on the background image. I find the pixel corresponding to these coordinates by scaling them to match the image dimensions.
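One common way to map a ray direction to 2D image coordinates is an equirectangular projection; the write-up does not name a specific mapping, so this is an assumption, and the function and image layout here are hypothetical:

```python
import math

def sample_background(direction, image, width, height):
    """Sample a background image from a normalized ray direction.

    Assumes an equirectangular mapping: horizontal angle -> u,
    vertical angle -> v, then scale (u, v) to pixel indices.
    `image` is a list of rows of color values.
    """
    dx, dy, dz = direction
    u = 0.5 + math.atan2(dx, dz) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, dy))) / math.pi
    # Scale to the image dimensions, clamping to stay in bounds
    px = min(int(u * width), width - 1)
    py = min(int(v * height), height - 1)
    return image[py][px]
```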
I then blurred the image by averaging the neighboring pixels around each sample pixel, a simple box blur (a form of convolution). For each pixel, I check a square grid of pixels around the target pixel, summing their color values and dividing by the number of samples to get an average color and achieve the blur effect.
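The averaging step above can be sketched as follows, here on a grayscale grid for brevity (the function name and edge handling are assumptions, not the project's exact code):

```python
def box_blur(image, radius=1):
    """Average each pixel with its square neighborhood (box blur).

    `image` is a list of rows of grayscale values. Edge pixels
    average only the neighbors that actually exist, so the divisor
    is the count of sampled pixels, not a fixed kernel size.
    """
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for ny in range(max(0, y - radius), min(h, y + radius + 1)):
                for nx in range(max(0, x - radius), min(w, x + radius + 1)):
                    total += image[ny][nx]
                    count += 1
            out[y][x] = total / count
    return out
```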
Afterward, I added the glass ball back into the code, but realized that not all droplets will be perfect spheres. I made changes to the Sphere object, now named Droplet, applying a random scaling factor to the width, height, and depth so each drop appears slightly different. To determine whether a ray intersects a droplet, I apply the same intersection logic used for spheres, adjusted for the droplet's scaling: by transforming the ray's origin and direction into droplet space, I can treat the distorted shape as if it were a sphere and apply the ray-sphere intersection formula. Once I find the intersection, I transform the hit point and normal back to world space so that ray tracing works properly through the droplets.
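The droplet-space trick works because dividing the ray's origin and direction by the per-axis scale turns the ellipsoid into a unit sphere, and the ray parameter t found there is the same t in world space. A sketch under those assumptions (the project's actual Droplet class is in another language, and names here are hypothetical):

```python
import math

def intersect_droplet(origin, direction, center, scale):
    """Ray vs. axis-aligned ellipsoid ("droplet") intersection.

    Transform the ray into droplet space (translate by center,
    divide each axis by its scale), solve the unit-sphere quadratic
    there, and return the nearest hit distance t, or None on a miss.
    The world hit point is origin + t * direction; the normal would
    be transformed back with another divide-by-scale and normalize.
    """
    o = [(origin[i] - center[i]) / scale[i] for i in range(3)]
    d = [direction[i] / scale[i] for i in range(3)]
    a = sum(c * c for c in d)
    b = 2.0 * sum(o[i] * d[i] for i in range(3))
    c = sum(c * c for c in o) - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                    # ray misses the droplet
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-6 else None     # ignore hits behind the origin
```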
I then added a loop to create multiple droplets, but they would overlap unrealistically and were far too large. To fix this, I added a function that checks each droplet and ensures that no two are placed too close together. It compares the position of each new droplet against the existing droplets; if they are too close, it attempts to place the new droplet in a different location, or skips adding it entirely after a certain number of attempts.
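This check-and-retry placement is a form of rejection sampling. A minimal 2D sketch of the idea (placing centers on the window plane; the function name, window extent, and attempt limit are assumptions):

```python
import random

def place_droplets(count, min_dist, max_attempts=50, extent=10.0, rng=None):
    """Place droplet centers so no two are closer than min_dist.

    For each new droplet, try random positions up to max_attempts
    times; if every attempt lands too close to an existing droplet,
    skip that droplet entirely, as described above.
    """
    rng = rng or random.Random()
    placed = []
    for _ in range(count):
        for _ in range(max_attempts):
            cand = (rng.uniform(0.0, extent), rng.uniform(0.0, extent))
            # Accept only if squared distance to every placed droplet
            # is at least min_dist squared
            if all((cand[0] - p[0]) ** 2 + (cand[1] - p[1]) ** 2
                   >= min_dist ** 2 for p in placed):
                placed.append(cand)
                break  # placed successfully; move to the next droplet
    return placed
```

Note that the returned list may hold fewer than `count` centers when the plane gets crowded, which matches the "skip after a certain number of attempts" behavior.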
I also added another scene where, in a separate loop, I limited the maximum droplet size even further, so the image appears more realistic with small micro-droplets scattered around the larger ones.