Formatting a drive looks a little different on every operating system, and it usually comes down to a handful of commands specific to that environment. In today’s post we’ll walk through the process of formatting a USB drive to FAT32 on FreeBSD, explaining each command along the way.
Identifying the Device
Before making any changes, you need to determine which device corresponds to your USB drive. The best way to do this is:
dmesg | grep da
or, for a more detailed view:
geom disk list
On FreeBSD, USB mass storage devices are typically named /dev/daX (where X is a number). If you only have one USB drive plugged in, it is likely /dev/da0.
Device naming in FreeBSD is quite uniform:
USB Drives: /dev/daX
SATA/SAS/IDE Drives: /dev/adaX
NVMe Drives: /dev/nvdX or /dev/ndaX (the /dev/nvmeX nodes are the controllers themselves)
RAID Volumes: /dev/mfidX, /dev/raidX
Partitioning the Drive
Now that we know the device name, we need to set up a partition table and create a FAT32 partition.
Destroying Existing Partitions
If the drive has existing partitions, remove them:
gpart destroy -F /dev/da0
This ensures a clean slate.
Creating a Partition Table
We create a Master Boot Record (MBR) partition table using:
gpart create -s mbr /dev/da0
-s mbr: Specifies an MBR (Master Boot Record) partition scheme.
Other options include gpt (GUID Partition Table), which is more modern but may not be supported by all systems.
Adding a FAT32 Partition
Now, we create a FAT32 partition:
gpart add -t fat32 /dev/da0
-t fat32: Specifies the FAT32 partition type.
Other valid types, depending on the partition scheme in use, include freebsd-ufs (FreeBSD UFS), freebsd-swap (swap partition), freebsd-zfs (ZFS), and linux-data (Linux filesystem).
After running this command, the new partition should be created as /dev/da0s1.
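Before formatting, it is worth double-checking the result:

gpart show da0

The output should list the mbr partition scheme on da0 with a single fat32 slice.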
Formatting the Partition as FAT32
To format the partition, we use newfs_msdos:
newfs_msdos -L DISKNAME -F 32 /dev/da0s1
-L DISKNAME: Assigns a label to the volume.
-F 32: Specifies FAT32.
/dev/da0s1: The newly created partition.
Why /dev/da0s1 instead of /dev/da0?
When using MBR, partitions are numbered starting from s1 (slice 1), meaning that the first partition on da0 becomes da0s1. Using /dev/da0 would format the entire disk, not just a partition.
Wrapping Up
At this point, your USB drive is formatted as FAT32 and ready to use. You can mount it manually if needed.
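On FreeBSD, FAT filesystems are mounted with the msdosfs driver. For example (the /mnt mount point here is just a suggestion; any empty directory will do):

mount -t msdosfs /dev/da0s1 /mnt

When you’re finished, unmount it with umount /mnt before removing the drive.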
Rendering realistic 3D environments is more than just defining surfaces—atmospheric effects like fog, mist, and light
scattering add a layer of depth and realism that makes a scene feel immersive. In this post, we’ll explore volumetric
fog and how we can implement it in our ray-marched Mandelbulb fractal shader.
What is Volumetric Fog?
Volumetric fog is an effect that simulates light scattering through a medium, such as:
Mist over a landscape
Dense fog hiding distant objects
Hazy light beams filtering through an object
Unlike simple screen-space fog, volumetric fog interacts with geometry, light, and depth, making it appear more
natural. In our case, we’ll use it to create a soft, atmospheric effect around our Mandelbulb fractal.
How Does It Work?
Volumetric fog in ray marching is achieved by stepping through the scene and accumulating fog density based on
distance. This is done using:
Exponential Fog – A basic formula that fades objects into the fog over distance.
Light Scattering – Simulates god rays by accumulating light along the ray path.
Procedural Noise Fog – Uses random noise to create a more natural, rolling mist effect.
We’ll build each of these effects step by step, expanding on our existing Mandelbulb shader to enhance its atmosphere.
If you haven’t seen them already, the previous articles in this series are suggested reading.
We will start with the following code, which is our Phong-shaded, lit Mandelbulb with the camera spinning around it.
float mandelbulbSDF(vec3 pos) {
    vec3 z = pos;
    float dr = 1.0;
    float r;
    const int iterations = 8;
    const float power = 8.0;

    for (int i = 0; i < iterations; i++) {
        r = length(z);
        if (r > 2.0) break;

        float theta = acos(z.z / r);
        float phi = atan(z.y, z.x);
        float zr = pow(r, power - 1.0);

        dr = zr * power * dr + 1.0;
        zr *= r;
        theta *= power;
        phi *= power;

        z = zr * vec3(sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta)) + pos;
    }

    return 0.5 * log(r) * r / dr;
}

vec3 getNormal(vec3 p) {
    vec2 e = vec2(0.001, 0.0);
    return normalize(vec3(
        mandelbulbSDF(p + e.xyy) - mandelbulbSDF(p - e.xyy),
        mandelbulbSDF(p + e.yxy) - mandelbulbSDF(p - e.yxy),
        mandelbulbSDF(p + e.yyx) - mandelbulbSDF(p - e.yyx)));
}

// Basic Phong shading
vec3 phongLighting(vec3 p, vec3 viewDir) {
    vec3 normal = getNormal(p);

    // Light settings
    vec3 lightPos = vec3(2.0, 2.0, -2.0);
    vec3 lightDir = normalize(lightPos - p);
    vec3 ambient = vec3(0.1); // Ambient light

    // Diffuse lighting
    float diff = max(dot(normal, lightDir), 0.0);

    // Specular highlight
    vec3 reflectDir = reflect(-lightDir, normal);
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), 16.0); // Shininess factor

    return ambient + diff * vec3(1.0, 0.8, 0.6) + spec * vec3(1.0); // Final color
}

// Soft Shadows (traces a secondary ray to detect occlusion)
float softShadow(vec3 ro, vec3 rd) {
    float res = 1.0;
    float t = 0.02; // Small starting step

    for (int i = 0; i < 24; i++) {
        float d = mandelbulbSDF(ro + rd * t);
        if (d < 0.001) return 0.0;    // Fully in shadow
        res = min(res, 10.0 * d / t); // Soft transition
        t += d;
    }

    return res;
}

void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Rotating Camera
    float angle = iTime * 0.5;
    vec3 rayOrigin = vec3(3.0 * cos(angle), 0.0, 3.0 * sin(angle));
    vec3 target = vec3(0.0);
    vec3 forward = normalize(target - rayOrigin);
    vec3 right = normalize(cross(vec3(0, 1, 0), forward));
    vec3 up = cross(forward, right);
    vec3 rayDir = normalize(forward + uv.x * right + uv.y * up);

    // Ray marching
    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 10.0;
    vec3 hitPoint;

    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = mandelbulbSDF(hitPoint);
        if (dist < minDist) break;
        if (totalDistance > maxDist) break;
        totalDistance += dist;
    }

    // Compute lighting only if we hit the fractal
    vec3 color;
    if (totalDistance < maxDist) {
        vec3 viewDir = normalize(rayOrigin - hitPoint);
        vec3 baseLight = phongLighting(hitPoint, viewDir);
        float shadow = softShadow(hitPoint, normalize(vec3(2.0, 2.0, -2.0)));
        color = baseLight * shadow; // Apply shadows
    } else {
        color = vec3(0.1, 0.1, 0.2); // Background color
    }

    fragColor = vec4(color, 1.0);
}
Depth-based Blending
To create a realistic sense of depth, we can use depth-based blending to gradually fade objects into the fog as they
move further away from the camera. This simulates how light scatters in the atmosphere, making distant objects appear
less distinct.
In ray marching, we calculate fog intensity using an exponential depth function:

fogAmount = 1.0 - exp(-distance * densityFactor)

where distance is how far along the ray we’ve traveled, and densityFactor controls how quickly objects fade into the fog.
By blending our object’s color with the fog color based on this function, we achieve a smooth atmospheric fade effect.
Let’s implement it in our shader.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Rotating Camera
    float angle = iTime * 0.5;
    vec3 rayOrigin = vec3(3.0 * cos(angle), 0.0, 3.0 * sin(angle));
    vec3 target = vec3(0.0);
    vec3 forward = normalize(target - rayOrigin);
    vec3 right = normalize(cross(vec3(0, 1, 0), forward));
    vec3 up = cross(forward, right);
    vec3 rayDir = normalize(forward + uv.x * right + uv.y * up);

    // Ray marching
    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 10.0;
    vec3 hitPoint;

    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = mandelbulbSDF(hitPoint);
        if (dist < minDist) break;
        if (totalDistance > maxDist) break;
        totalDistance += dist;
    }

    // Compute lighting only if we hit the fractal
    vec3 color;
    if (totalDistance < maxDist) {
        vec3 viewDir = normalize(rayOrigin - hitPoint);
        vec3 baseLight = phongLighting(hitPoint, viewDir);
        float shadow = softShadow(hitPoint, normalize(vec3(2.0, 2.0, -2.0)));
        color = baseLight * shadow;
    } else {
        color = vec3(0.1, 0.1, 0.2); // Background color
    }

    // Apply depth-based exponential fog
    float fogAmount = 1.0 - exp(-totalDistance * 0.15);
    color = mix(color, vec3(0.5, 0.6, 0.7), fogAmount);

    fragColor = vec4(color, 1.0);
}
Once this is running, you should see some fog appear to obscure our Mandelbulb:
Light Scattering
When light passes through a medium like fog, dust, or mist, it doesn’t just stop—it scatters in different directions,
creating beautiful effects like god rays or a soft glow around objects. This is known as volumetric light scattering.
In ray marching, we can approximate this effect by tracing secondary rays through the scene and accumulating light
contribution along the path. The more dense the medium (or the more surfaces the ray encounters), the stronger the
scattering effect. A simplified formula for this accumulation looks like:

scatteredLight += fogDensity(p) * stepSize

where:

scatteredLight is the total scattered light along the ray.
fogDensity(p) measures how much fog or medium is present at each step.
stepSize controls how frequently we sample along the ray.
By applying this technique, we can simulate light beams filtering through objects, making our Mandelbulb feel immersed
in an atmospheric environment.
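The full listing isn’t reproduced here, but a minimal sketch of the idea looks like the following: march a second pass along the view ray and bump a density accumulator whenever the ray passes close to the fractal surface. The fixed step size, the 50-step count, and the 0.1 proximity threshold are assumptions; only the density += 0.02 increment is taken from the notes below.

// Sketch: accumulate scattered light along the view ray
float lightScattering(vec3 rayOrigin, vec3 rayDir) {
    float density = 0.0;
    float t = 0.0;

    for (int i = 0; i < 50; i++) {
        vec3 p = rayOrigin + rayDir * t;
        float d = mandelbulbSDF(p);

        // The closer the ray passes to the fractal, the more light it scatters
        if (d < 0.1) {
            density += 0.02; // Controls the intensity of the god rays
        }

        t += 0.1; // Fixed step through the medium
    }

    return density;
}

In mainImage, the accumulated density can then be added on top of the fogged colour, for example with color += vec3(1.0, 0.9, 0.7) * lightScattering(rayOrigin, rayDir); (the warm light colour is another assumption).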
You can see the god rays through the centre of our fractal:
With this code we’ve:
Shot a secondary ray into the scene which accumulates scattered light
The denser the fractal, the more light it scatters
density += 0.02 controls the intensity of the god rays
Noise-based Fog
Real-world fog isn’t uniform—it swirls, shifts, and forms dense or sparse patches. To create a more natural effect, we
can use procedural noise to simulate rolling mist or dynamic fog layers.
Instead of applying a constant fog density at every point, we introduce random variations using a noise function:

fogDensity(p) = baseDensity * noise(p)

where:

fogDensity(p) determines the fog’s thickness at position p.
baseDensity is the overall fog intensity.
noise(p) generates small-scale variations to make fog look natural.
By sampling noise along the ray, we can create wispy, uneven fog that behaves more like mist or smoke, enhancing the
realism of our scene. Let’s implement this effect next.
We’ll add procedural noise to simulate smoke or rolling mist.
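The exact noise function isn’t critical; the sketch below uses a simple hash-based value noise. The hash constants are the usual shader folklore values, and the sampling scale and iTime drift further down are assumptions.

// Cheap hash-based value noise (a sketch; any 3D noise would do)
float hash(vec3 p) {
    return fract(sin(dot(p, vec3(12.9898, 78.233, 45.164))) * 43758.5453);
}

float noise(vec3 p) {
    vec3 i = floor(p);
    vec3 f = fract(p);
    f = f * f * (3.0 - 2.0 * f); // Smooth interpolation

    // Trilinear blend of the eight surrounding hashed corners
    return mix(
        mix(mix(hash(i + vec3(0, 0, 0)), hash(i + vec3(1, 0, 0)), f.x),
            mix(hash(i + vec3(0, 1, 0)), hash(i + vec3(1, 1, 0)), f.x), f.y),
        mix(mix(hash(i + vec3(0, 0, 1)), hash(i + vec3(1, 0, 1)), f.x),
            mix(hash(i + vec3(0, 1, 1)), hash(i + vec3(1, 1, 1)), f.x), f.y),
        f.z);
}

In mainImage, the constant fog term can then be modulated by noise sampled at the hit point and drifted over time:

// Replace the constant fog amount with a noise-modulated one
float fogNoise = noise(hitPoint * 2.0 + vec3(0.0, iTime * 0.2, 0.0));
float fogAmount = (1.0 - exp(-totalDistance * 0.15)) * mix(0.6, 1.0, fogNoise);
color = mix(color, vec3(0.5, 0.6, 0.7), fogAmount);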
After these modifications, you should start to see the fog moving as we rotate:
The final version of this shader can be found here.
Conclusion
By adding volumetric effects to our ray-marched Mandelbulb, we’ve taken our scene from a simple fractal to a rich,
immersive environment.
These techniques not only enhance the visual depth of our scene but also provide a foundation for more advanced
effects like clouds, smoke, fire, or atmospheric light absorption.
Ray tracing is known for producing stunning reflections, but we can achieve the same effect using ray
marching. In this post, we’ll walk through a classic two-sphere reflective scene; instead of traditional ray
tracing, we’ll ray march our way to those reflections.
The first step is defining a scene with two spheres and a ground plane. In ray marching, objects are defined using
signed distance functions (SDFs). Our scene SDF is just a combination of smaller SDFs.
SDFs
The SDF for a sphere gives us the distance from any point to the surface of the sphere, and the scene SDF simply takes the minimum over the individual objects.
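The listings aren’t reproduced in this excerpt, so here is a sketch. The sphere centres, radii, and ground height are assumptions; the important part is that sceneSDF() returns the minimum of the individual distances.

// Distance from point p to the surface of a sphere at 'center' with radius 'radius'
float sdfSphere(vec3 p, vec3 center, float radius) {
    return length(p - center) - radius;
}

// Distance from point p to a horizontal ground plane at the given height
float sdfPlane(vec3 p, float height) {
    return p.y - height;
}

// The scene is just the closest of the individual objects
float sceneSDF(vec3 p) {
    float sphere1 = sdfSphere(p, vec3(-1.0, 0.0, 0.0), 1.0);
    float sphere2 = sdfSphere(p, vec3( 1.0, 0.0, 0.0), 1.0);
    float ground  = sdfPlane(p, -1.0);
    return min(ground, min(sphere1, sphere2));
}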
Now we trace a ray through our scene using ray marching.
vec3 rayMarch(vec3 rayOrigin, vec3 rayDir, int maxSteps, float maxDist) {
    float totalDistance = 0.0;
    vec3 hitPoint;

    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = sceneSDF(hitPoint);
        if (dist < 0.001) break;                                // Close enough to surface
        if (totalDistance > maxDist) return vec3(0.5, 0.7, 1.0); // Sky color
        totalDistance += dist;
    }

    return hitPoint; // Return the hit location
}
Surface Normals
For lighting and reflections, we need surface normals, which are estimated using small offsets in each direction, along with a computeReflection() helper for the bounced colour.
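Neither listing appears in this excerpt, so here is a sketch of both. getNormal() is the usual central-difference estimate against sceneSDF(). computeReflection() reflects the incoming ray about that normal and marches the scene again; the phongLighting() helper is assumed from the earlier posts in this series, and the 0.01 surface offset is an assumption to avoid immediate self-intersection.

vec3 getNormal(vec3 p) {
    vec2 e = vec2(0.001, 0.0); // Small offset for numerical differentiation
    return normalize(vec3(
        sceneSDF(p + e.xyy) - sceneSDF(p - e.xyy),
        sceneSDF(p + e.yxy) - sceneSDF(p - e.yxy),
        sceneSDF(p + e.yyx) - sceneSDF(p - e.yyx)));
}

// One-bounce reflection: reflect the view ray and march the scene again (a sketch)
vec3 computeReflection(vec3 p, vec3 rayDir) {
    vec3 normal = getNormal(p);
    vec3 reflectedDir = reflect(rayDir, normal);

    // Nudge the origin off the surface so the new ray doesn't immediately re-hit it
    vec3 hit = rayMarch(p + normal * 0.01, reflectedDir, 100, 10.0);

    if (hit == vec3(0.5, 0.7, 1.0)) return hit; // Reflected ray escaped to the sky
    return phongLighting(hit, normalize(p - hit));
}

With those in place, the main image function ties everything together: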
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Camera Setup
    vec3 rayOrigin = vec3(0, 0, -5);
    vec3 rayDir = normalize(vec3(uv, 1.0));

    // Perform Ray Marching
    vec3 hitPoint = rayMarch(rayOrigin, rayDir, 100, 10.0);

    // If we hit an object, apply shading
    vec3 color;
    if (hitPoint != vec3(0.5, 0.7, 1.0)) {
        vec3 viewDir = normalize(rayOrigin - hitPoint);
        vec3 baseLight = phongLighting(hitPoint, viewDir);
        vec3 reflection = computeReflection(hitPoint, rayDir);
        color = mix(baseLight, reflection, 0.5); // Blend reflections
    } else {
        color = vec3(0.5, 0.7, 1.0); // Sky color
    }

    fragColor = vec4(color, 1.0);
}
Running this shader, you should see two very reflective spheres reflecting each other.
Conclusion
With just a few functions, we’ve recreated a classic ray tracing scene using ray marching. This technique allows
us to:
Render reflective surfaces without traditional ray tracing
Generate soft shadows using SDF normals
Extend the method for refraction and more complex materials
Fractals are some of the most mesmerizing visuals in computer graphics, but rendering them in 3D space requires
special techniques beyond standard polygonal rendering. This article will take you into the world of ray marching,
where we’ll use distance fields, lighting, and soft shadows to render a Mandelbulb fractal — one of the most famous 3D
fractals.
By the end of this post, you’ll understand:
The basics of ray marching and signed distance functions (SDFs).
How to render 3D objects without polygons.
How to implement Phong shading for realistic lighting.
How to compute soft shadows for better depth.
How to animate a rotating Mandelbulb fractal.
This article will build on the knowledge that we established in the Basics of Shader Programming
article that we put together earlier. If you haven’t read through that one, it’ll be worth taking a look at.
What is Ray Marching?
Ray marching is a distance-based rendering technique that is closely related to ray tracing.
However, instead of tracing rays until they hit exact geometry (like in traditional ray tracing), ray marching steps
along the ray incrementally using distance fields.
Each pixel on the screen sends out a ray into 3D space. We then march forward along the ray, using a signed
distance function (SDF) to tell us how far we are from the nearest object. This lets us render smooth implicit
surfaces like fractals and organic shapes.
Our first SDF
The simplest 3D object we can render using ray marching is a sphere. We define its shape using a
signed distance function (SDF):
// Sphere Signed Distance Function (SDF)
float sdfSphere(vec3 p, float r) {
    return length(p) - r;
}
The sdfSphere() function returns the shortest distance from any point in space to the sphere’s surface.
We can now step along that ray until we reach the sphere. We do this by integrating our sdfSphere() function into our
mainImage() function:
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Camera setup
    vec3 rayOrigin = vec3(0, 0, -3);
    vec3 rayDir = normalize(vec3(uv, 1));

    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 20.0;

    for (int i = 0; i < maxSteps; i++) {
        vec3 pos = rayOrigin + rayDir * totalDistance;
        float dist = sdfSphere(pos, 1.0);
        if (dist < minDist) break;
        if (totalDistance > maxDist) break;
        totalDistance += dist;
    }

    vec3 col = (totalDistance < maxDist) ? vec3(1.0) : vec3(0.2, 0.3, 0.4);
    fragColor = vec4(col, 1.0);
}
First of all, we convert the co-ordinate that we’re processing into screen co-ordinates:
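vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

This centres the origin in the middle of the screen and divides by the height, so the image doesn’t stretch with the aspect ratio.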
We now iterate (march) down the ray to a maximum of maxSteps (currently set to 100) to determine if the ray
intersects with our sphere (via sdfSphere).
Finally, we render the colour of our sphere if the distance is within tolerance; otherwise we consider this part of
the background:
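vec3 col = (totalDistance < maxDist) ? vec3(1.0) : vec3(0.2, 0.3, 0.4);
fragColor = vec4(col, 1.0);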
To make this sphere look a little more 3D, we can light it. To light any object, we need to be able
to compute surface normals. We do that via a function like this:
vec3 getNormal(vec3 p) {
    vec2 e = vec2(0.001, 0.0); // Small offset for numerical differentiation
    return normalize(vec3(
        sdfSphere(p + e.xyy, 1.0) - sdfSphere(p - e.xyy, 1.0),
        sdfSphere(p + e.yxy, 1.0) - sdfSphere(p - e.yxy, 1.0),
        sdfSphere(p + e.yyx, 1.0) - sdfSphere(p - e.yyx, 1.0)));
}
We make decisions about the actual colour via a lighting function. This lighting function is informed by the surface
normals that it computes:
vec3 lighting(vec3 p) {
    vec3 lightPos = vec3(2.0, 2.0, -2.0);    // Light source position
    vec3 normal = getNormal(p);              // Compute the normal at point 'p'
    vec3 lightDir = normalize(lightPos - p); // Direction to light

    float diff = max(dot(normal, lightDir), 0.0); // Diffuse lighting
    return vec3(diff);                            // Return grayscale lighting effect
}
We can now integrate this back into our scene in the mainImage function. Rather than just returning a static colour
when we establish a hit point, we call the lighting function towards the end of the function:
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Camera setup
    vec3 rayOrigin = vec3(0, 0, -3);      // Camera positioned at (0,0,-3)
    vec3 rayDir = normalize(vec3(uv, 1)); // Forward-facing ray

    // Ray marching parameters
    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 20.0;
    vec3 hitPoint;

    // Ray marching loop
    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = sdfSphere(hitPoint, 1.0); // Distance to the sphere
        if (dist < minDist) break;             // If we are close enough to the surface, stop
        if (totalDistance > maxDist) break;    // If we exceed max distance, stop
        totalDistance += dist;
    }

    // If we hit something, apply shading; otherwise, keep background color
    vec3 col = (totalDistance < maxDist) ? lighting(hitPoint) : vec3(0.2, 0.3, 0.4);
    fragColor = vec4(col, 1.0);
}
You should see something similar to this:
Mandelbulbs
We can now upgrade our rendering to use something a little more complex than our sphere.
SDF
The Mandelbulb is a 3D fractal inspired by the 2D Mandelbrot Set. Instead of working in 2D complex numbers, it uses
spherical coordinates in 3D space.
The core formula, z → z^n + c (with n = 8 in our code), is extended to 3D using spherical math.
Instead of a sphere SDF, we’ll use an iterative function to compute distances to the fractal surface.
This function iterates over complex numbers in 3D space to compute the Mandelbulb structure.
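Here is the distance estimator itself; it’s the same mandelbulbSDF() used throughout the rest of this series:

float mandelbulbSDF(vec3 pos) {
    vec3 z = pos;
    float dr = 1.0;
    float r;
    const int iterations = 8;
    const float power = 8.0;

    for (int i = 0; i < iterations; i++) {
        r = length(z);
        if (r > 2.0) break;

        // Convert to spherical coordinates
        float theta = acos(z.z / r);
        float phi = atan(z.y, z.x);
        float zr = pow(r, power - 1.0);

        // Scale and rotate, then convert back to Cartesian
        dr = zr * power * dr + 1.0;
        zr *= r;
        theta *= power;
        phi *= power;

        z = zr * vec3(sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta)) + pos;
    }

    return 0.5 * log(r) * r / dr;
}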
Raymarching the Mandelbulb
Now, we can take a look at what this produces. We use our newly created SDF to get our hit point. We’ll use this
distance value as well to establish different colours.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Camera setup
    vec3 rayOrigin = vec3(0, 0, -4);
    vec3 rayDir = normalize(vec3(uv, 1));

    // Ray marching parameters
    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 10.0;
    vec3 hitPoint;

    // Ray marching loop
    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = mandelbulbSDF(hitPoint); // Fractal distance function
        if (dist < minDist) break;
        if (totalDistance > maxDist) break;
        totalDistance += dist;
    }

    // Color based on distance (simple shading)
    vec3 col = (totalDistance < maxDist) ? vec3(1.0 - totalDistance * 0.1) : vec3(0.1, 0.1, 0.2);
    fragColor = vec4(col, 1.0);
}
You should see something similar to this:
Rotation
We can’t see much with how this object is oriented. By adding some basic animation, we can start to look at the complexities
of how this object is put together. We use the global iTime variable here to establish movement:
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Rotate camera around the fractal using iTime
    float angle = iTime * 0.5;                                      // Adjust speed of rotation
    vec3 rayOrigin = vec3(3.0 * cos(angle), 0.0, 3.0 * sin(angle)); // Circular path
    vec3 target = vec3(0.0, 0.0, 0.0);                              // Looking at the fractal
    vec3 forward = normalize(target - rayOrigin);
    vec3 right = normalize(cross(vec3(0, 1, 0), forward));
    vec3 up = cross(forward, right);
    vec3 rayDir = normalize(forward + uv.x * right + uv.y * up);

    // Ray marching parameters
    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 10.0;
    vec3 hitPoint;

    // Ray marching loop
    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = mandelbulbSDF(hitPoint); // Fractal distance function
        if (dist < minDist) break;
        if (totalDistance > maxDist) break;
        totalDistance += dist;
    }

    // Color based on distance (simple shading)
    vec3 col = (totalDistance < maxDist) ? vec3(1.0 - totalDistance * 0.1) : vec3(0.1, 0.1, 0.2);
    fragColor = vec4(col, 1.0);
}
You should see something similar to this:
Lights
In order to make our fractal look 3D, we need to be able to compute our surface normals. We’ll be using the
mandelbulbSDF function above to accomplish this:
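vec3 getNormal(vec3 p) {
    vec2 e = vec2(0.001, 0.0);
    return normalize(vec3(
        mandelbulbSDF(p + e.xyy) - mandelbulbSDF(p - e.xyy),
        mandelbulbSDF(p + e.yxy) - mandelbulbSDF(p - e.yxy),
        mandelbulbSDF(p + e.yyx) - mandelbulbSDF(p - e.yyx)));
}

The Phong lighting function that we’ll call from mainImage builds on these normals; it’s the same phongLighting() used in the final listing below:

// Basic Phong shading
vec3 phongLighting(vec3 p, vec3 viewDir) {
    vec3 normal = getNormal(p);

    // Light settings
    vec3 lightPos = vec3(2.0, 2.0, -2.0);
    vec3 lightDir = normalize(lightPos - p);
    vec3 ambient = vec3(0.1); // Ambient light

    // Diffuse lighting
    float diff = max(dot(normal, lightDir), 0.0);

    // Specular highlight
    vec3 reflectDir = reflect(-lightDir, normal);
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), 16.0); // Shininess factor

    return ambient + diff * vec3(1.0, 0.8, 0.6) + spec * vec3(1.0); // Final color
}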
To make the fractal look more realistic, we’ll implement soft shadows. This will really enhance how this object looks.
// Soft Shadows (traces a secondary ray to detect occlusion)
float softShadow(vec3 ro, vec3 rd) {
    float res = 1.0;
    float t = 0.02; // Small starting step

    for (int i = 0; i < 24; i++) {
        float d = mandelbulbSDF(ro + rd * t);
        if (d < 0.001) return 0.0;    // Fully in shadow
        res = min(res, 10.0 * d / t); // Soft transition
        t += d;
    }

    return res;
}
Pulling it all together
We can now pull all of these enhancements together with our main image function:
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;

    // Rotating Camera
    float angle = iTime * 0.5;
    vec3 rayOrigin = vec3(3.0 * cos(angle), 0.0, 3.0 * sin(angle));
    vec3 target = vec3(0.0);
    vec3 forward = normalize(target - rayOrigin);
    vec3 right = normalize(cross(vec3(0, 1, 0), forward));
    vec3 up = cross(forward, right);
    vec3 rayDir = normalize(forward + uv.x * right + uv.y * up);

    // Ray marching
    float totalDistance = 0.0;
    const int maxSteps = 100;
    const float minDist = 0.001;
    const float maxDist = 10.0;
    vec3 hitPoint;

    for (int i = 0; i < maxSteps; i++) {
        hitPoint = rayOrigin + rayDir * totalDistance;
        float dist = mandelbulbSDF(hitPoint);
        if (dist < minDist) break;
        if (totalDistance > maxDist) break;
        totalDistance += dist;
    }

    // Compute lighting only if we hit the fractal
    vec3 color;
    if (totalDistance < maxDist) {
        vec3 viewDir = normalize(rayOrigin - hitPoint);
        vec3 baseLight = phongLighting(hitPoint, viewDir);
        float shadow = softShadow(hitPoint, normalize(vec3(2.0, 2.0, -2.0)));
        color = baseLight * shadow; // Apply shadows
    } else {
        color = vec3(0.1, 0.1, 0.2); // Background color
    }

    fragColor = vec4(color, 1.0);
}
Finally, you should see something similar to this:
Shaders are one of the most powerful tools in modern computer graphics, allowing real-time effects, lighting, and
animation on the GPU (Graphics Processing Unit). They are used in games, simulations, and rendering engines
to control how pixels and geometry appear on screen.
In this article, we’ll break down:
What shaders are and why they matter
How to write your first shader
Understanding screen coordinates
Animating a shader
This guide assumes zero prior knowledge of shaders and will explain each line of code step by step.
All of the code here can be run using Shadertoy. You won’t need to install any dependencies,
but you will need a GPU-capable computer!
What is a Shader?
A shader is a small program that runs on the GPU. Unlike regular CPU code, shaders are executed in parallel for
every pixel or vertex on the screen.
Types of Shaders
Vertex Shader – Moves and transforms individual points in 3D space.
Fragment Shader (Pixel Shader) – Determines the final color of each pixel.
For now, we’ll focus on fragment shaders since they control how things look.
Your First Shader
Let’s start with the simplest shader possible: a solid color fill.
Seeing Red!
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    fragColor = vec4(1.0, 0.0, 0.0, 1.0); // Solid red color
}
Breaking this code down:
void mainImage(...) → This function runs for every pixel on the screen.
fragColor → The output color of the pixel.
vec4(1.0, 0.0, 0.0, 1.0) → This defines an RGBA color:
1.0, 0.0, 0.0 → Red
1.0 → Fully opaque (no transparency)
Try This: Change the color values:
vec4(0.0, 1.0, 0.0, 1.0); → Green
vec4(0.0, 0.0, 1.0, 1.0); → Blue
vec4(1.0, 1.0, 0.0, 1.0); → Yellow
Mapping Colors to Screen Position
Instead of filling the screen with a single color, let’s map colors to pixel positions.
A Gradient Shader
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy; // Normalize coordinates (0 to 1)
    fragColor = vec4(uv.x, uv.y, 0.5, 1.0);
}
Breaking this code down:
fragCoord / iResolution.xy → Converts pixel coordinates into a 0 → 1 range.
uv.x → Controls red (left to right gradient).
uv.y → Controls green (bottom to top gradient).
0.5 → Keeps blue constant.
This creates a smooth gradient across the screen!
Try This: Swap uv.x and uv.y to see different patterns.
Animation
Shaders can react to time using iTime. This lets us create dynamic effects.
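Here’s a minimal example. We keep the normalised uv coordinates from the gradient shader, and remap the sine wave from -1..1 to 0..1 so it reads as brightness (the original listing isn’t shown here, so the remap and greyscale output are assumptions):

void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;               // Normalize coordinates (0 to 1)
    float wave = sin(uv.x * 10.0 + iTime) * 0.5 + 0.5;  // Moving wave, remapped to 0..1
    fragColor = vec4(vec3(wave), 1.0);                  // Grayscale bands that drift over time
}

Breaking this code down: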
sin(uv.x * 10.0 + iTime) → Creates waves that move over time.
The whole screen pulses from black to white dynamically.
Try This: Change 10.0 to 20.0 or 5.0 to make waves tighter or wider.
Wrapping up
These have been some very simple shader programs to get started with. In future articles, we’ll build on this knowledge
to create more exciting graphics applications.