diff --git a/ATDS.mp3 b/ATDS.mp3 new file mode 100644 index 0000000..e9b0446 Binary files /dev/null and b/ATDS.mp3 differ diff --git a/README.md b/README.md old mode 100644 new mode 100755 index d4ef264..a61f961 --- a/README.md +++ b/README.md @@ -1,118 +1,30 @@ -# [Project 1: Noise](https://github.com/CIS700-Procedural-Graphics/Project1-Noise) +# Multi-Octave Noise -## Objective +Implemented multi-octave pseudo-random noise and applied it to each vertex of the mesh. -Get comfortable with using three.js and its shader support and generate an interesting 3D, continuous surface using a multi-octave noise algorithm. +Implemented a cosine interpolation and a smoothing function to average the noise at the lattice points surrounding the given vertex of the mesh and to compute an interpolated noise value for that vertex. -## Getting Started +The vertex positions are offset along the normal direction by a factor of the noise, and the height is interpolated across all the intermediate points of the mesh. -1. [Install Node.js](https://nodejs.org/en/download/). Node.js is a JavaScript runtime. It basically allows you to run JavaScript when not in a browser. For our purposes, this is not necessary. The important part is that with it comes `npm`, the Node Package Manager. This allows us to easily declare and install external dependencies such as [three.js](https://threejs.org/), [dat.GUI](https://workshop.chromeexperiments.com/examples/gui/#1--Basic-Usage), and [glMatrix](http://glmatrix.net/). Some other packages we'll be using make it significantly easier to develop your code and create modules for better code reuse and clarity. These tools make it _signficantly_ easier to write code in multiple `.js` files without globally defining everything. +The texture applied to the mesh is perturbed by the noise value to randomize the output. -2. Fork and clone [this repository](https://github.com/CIS700-Procedural-Graphics/Project1-Noise). 
+Music is sampled and analyzed to manipulate the position and offset of the mesh vertices while the music is playing. -3. In the root directory of your project, run `npm install`. This will download all of those dependencies. -4. Do either of the following (but I highly recommend the first one for reasons I will explain later). - a. Run `npm start` and then go to `localhost:7000` in your web browser +Reference Website List: - b. Run `npm run build` and then go open `index.html` in your web browser +1.) For loading and using audio on the website with JavaScript, I used the code and explanation on this site: +https://www.patrick-wied.at/blog/how-to-create-audio-visualizations-with-javascript-html - You should hopefully see the framework code with a 3D cube at the center of the screen! +2.) WebGL 1.0 API Reference Card, a quick and extensive list of all the functions and libraries available while using WebGL 1.0: +https://www.khronos.org/files/webgl/webgl-reference-card-1_0.pdf +3.) Counting Uniforms in WebGL: each OS and machine has its own requirements and limits on how many uniforms are available at a time in a shader; I used this website to check the limit for my machine: +https://bocoup.com/weblog/counting-uniforms-in-webgl -## Developing Your Code -All of the JavaScript code is living inside the `src` directory. The main file that gets executed when you load the page as you may have guessed is `main.js`. Here, you can make any changes you want, import functions from other files, etc. The reason that I highly suggest you build your project with `npm start` is that doing so will start a process that watches for any changes you make to your code. If it detects anything, it'll automagically rebuild your project and then refresh your browser window for you. Wow. That's cool. If you do it the other way, you'll need to run `npm build` and then refresh your page every time you want to test something. +4.) 
Three.js documentation of its libraries and function syntax: +https://threejs.org/docs/index.html#Reference/Extras.Core/CurvePath -## Publishing Your Code -We highly suggest that you put your code on GitHub. One of the reasons we chose to make this course using JavaScript is that the Web is highly accessible and making your awesome work public and visible can be a huge benefit when you're looking to score a job or internship. To aid you in this process, running `npm run deploy` will automatically build your project and push it to `gh-pages` where it will be visible at `username.github.io/repo-name`. - -## What is Actually Happening? -You can skip this part if you really want, but I highly suggest you read it. - -### npm install -`npm install` will install all dependencies into a folder called `node_modules`. That's about it. - -### package.json - -This is the important file that `npm` looks at. In it, you can see the commands it's using for the `start`, `build`, and `deploy` scripts mentioned above. You can also see all of the dependencies the project requires. I will briefly go through what each of these is. - - dat-gui: Gives us a nice and simple GUI for modifying variables in our program - - - gl-matrix: Useful library for linear algebra, much like glm - - - stats-js: Gives us a nice graph for timing things. We use it to report how long it takes to render each frame - - - three: Three.js is the main library we're using to draw stuff - - - three-orbit-controls: Handles mouse / touchscreen camera controls - - - babel-core, babel-loader, babel-preset-es2015: JavaScript is a a really fast moving language. It is constantly, constantly changing. Unfortunately, web browsers don't keep up nearly as quickly. Babel does the job of converting your code to a form that current browsers support. This allows us to use newer JavaScript features such as classes and imports without worrying about compatibility. 
- - - gh-pages-deploy: This is the library that automates publishing your code to Github - - - webpack: Webpack serves the role of packaging your project into a single file. Browsers don't actually support "importing" from other files, so without Webpack, to access data and functions in other files we would need to globally define EVERYTHING. This is an extremely bad idea. Webpack lets us use imports and develop code in separate files. Running `npm build` or `npm start` is what bundles all of your code together. - -- webpack-dev-server: This is an extremely useful tool for development. It essentially creates a file watcher and rebuilds your project whenever you make changes. It also injects code into your page that gets notified when these changes occur so it can automatically refresh your page. - - - webpack-glsl-loader: Webpack does much more than just JavaScript. We can use it to load glsl, css, images, etc. For whatever you want to import, somebody has probably made a webpack loader for it. - -### webpack.config.js - -This is the configuration file in webpack. The most important part is `entry` and `output`. These define the input and output for webpack. It will start from `entry`, explore all dependencies, and package them all into `output`. Here, the `output` is `bundle.js`. If you look in `index.html`, you can see that the page is loading `bundle.js`, not `main.js`. - -The other sections are just configuration settings for `webpack-dev-server` and setup for loading different types of files. - -## Setting up a shader - -Using the provided framework code, create a new three.js material which references a vertex and fragment shader. Look at the adamMaterial for reference. It should reference at least one uniform variable (you'll need a time variable to animate your mesh later on). - -Create [an icosahedron](https://threejs.org/docs/index.html#Reference/Geometries/IcosahedronBufferGeometry), instead of the default cube geometry provided in the scene. 
Test your shader setup by applying the material to the icosahedron and color the mesh in the fragment shader using the normals' XYZ components as RGB. - -Note that three.js automatically injects several uniform and attribute variables into your shaders by default; they are listed in the [documentation](https://threejs.org/docs/api/renderers/webgl/WebGLProgram.html) for three.js's WebGLProgram class. - -## Noise Generation - -In the shader, write a 3D multi-octave lattice-value noise function that takes three input parameters and generates output in a controlled range, say [0,1] or [-1, 1]. This will require the following steps. - -1. Write several (for however many octaves of noise you want) basic pseudo-random 3D noise functions (the hash-like functions we discussed in class). It's fine to reference one from the slides or elsewhere on the Internet. Again, this should just be a set of math operations, often using large prime numbers to random-looking output from three input parameters. - -2. Write an interpolation function. Lerp is fine, but for better results, we suggest cosine interpolation. - -3. (Optional) Write a smoothing function that will average the results of the noise value at some (x, y, z) with neighboring values, that is (x+-1, y+-1, z+-1). - -4. Write an 'interpolate noise' function that takes some (x, y, z) point as input and produces a noise value for that point by interpolating the surrounding lattice values (for 3D, this means the surrounding eight 'corner' points). Use your interpolation function and pseudo-random noise generator to accomplish this. - -5. Write a multi-octave noise generation function that sums multiple noise functions together, with each subsequent noise function increasing in frequency and decreasing in amplitude. You should use the interpolate noise function you wrote previously to accomplish this, as it generates a single octave of noise. The slides contain pseudocode for writing your multi-octave noise function. 
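The removed noise-generation steps above are what the new `adam-vert.glsl` in this diff implements in GLSL. For reference, here is a minimal JavaScript sketch of the same pipeline (hash, cosine interpolation, trilinear lattice interpolation, octave summation); the optional smoothing step is omitted for brevity, and all function names are my own, not the project's:

```javascript
// Hash-like pseudo-random noise for an integer lattice point (x, y, z).
// Same sin/dot/fract construction the diff's Noise3D() uses in GLSL.
function noise3D(x, y, z) {
  const s = Math.sin(x * 12.989 + y * 78.233 + z * 157.0) * 43758.5453;
  return s - Math.floor(s); // fract(s), so the result is in [0, 1)
}

// Cosine interpolation: smoother than lerp across lattice boundaries.
function cosineInterpolate(a, b, t) {
  const u = (1 - Math.cos(t * Math.PI)) * 0.5;
  return a * (1 - u) + b * u;
}

// Interpolate the eight lattice corners surrounding (x, y, z).
function interpolateNoise3D(x, y, z) {
  const xi = Math.floor(x), yi = Math.floor(y), zi = Math.floor(z);
  const xf = x - xi, yf = y - yi, zf = z - zi;
  const i1 = cosineInterpolate(noise3D(xi, yi, zi), noise3D(xi + 1, yi, zi), xf);
  const i2 = cosineInterpolate(noise3D(xi, yi + 1, zi), noise3D(xi + 1, yi + 1, zi), xf);
  const i3 = cosineInterpolate(noise3D(xi, yi, zi + 1), noise3D(xi + 1, yi, zi + 1), xf);
  const i4 = cosineInterpolate(noise3D(xi, yi + 1, zi + 1), noise3D(xi + 1, yi + 1, zi + 1), xf);
  const n1 = cosineInterpolate(i1, i2, yf);
  const n2 = cosineInterpolate(i3, i4, yf);
  return cosineInterpolate(n1, n2, zf); // a single octave, in [0, 1)
}

// Sum octaves: frequency doubles while amplitude decays by `persistence`,
// so the total stays within [0, sum of the amplitudes).
function multiOctaveNoise3D(x, y, z, octaves = 4, persistence = 0.5) {
  let total = 0;
  for (let i = 0; i < octaves; i++) {
    const frequency = Math.pow(2, i);
    const amplitude = Math.pow(persistence, i);
    total += interpolateNoise3D(x * frequency, y * frequency, z * frequency) * amplitude;
  }
  return total;
}
```

With `octaves = 4` and `persistence = 0.5` the amplitudes sum to 1.875, so dividing the result by that sum renormalizes the output to [0, 1) when a controlled range is needed.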
- - -## Noise Application - -View your noise in action by applying it as a displacement on the surface of your icosahedron, giving your icosahedron a bumpy, cloud-like appearance. Simply take the noise value as a height, and offset the vertices along the icosahedron's surface normals. You are, of course, free to alter the way your noise perturbs your icosahedron's surface as you see fit; we are simply recommending an easy way to visualize your noise. You could even apply a couple of different noise functions to perturb your surface to make it even less spherical. - -In order to animate the vertex displacement, use time as the third dimension or as some offset to the (x, y, z) input to the noise function. Pass the current time since start of program as a uniform to the shaders. - -For both visual impact and debugging help, also apply color to your geometry using the noise value at each point. There are several ways to do this. For example, you might use the noise value to create UV coordinates to read from a texture (say, a simple gradient image), or just compute the color by hand by lerping between values. - -## Interactivity - -Using dat.GUI and the examples provided in the reference code, make some aspect of your demo an interactive variable. For example, you could add a slider to adjust the strength or scale of the noise, change the number of noise octaves, etc. - -## For the overachievers (extra credit) - -- More interactivity (easy): pretty self-explanatory. Make more aspects of your demo interactive by adding more controlable variables in the GUI. - -- Custom mesh (easy): Figure out how to import a custom mesh rather than using an icosahedron for a fancy-shaped cloud. - -- Mouse interactivity (medium): Find out how to get the current mouse position in your scene and use it to deform your cloud, such that users can deform the cloud with their cursor. 
- -- Music (hard): Figure out a way to use music to drive your noise animation in some way, such that your noise cloud appears to dance. - -## Submission - -- Update README.md to contain a solid description of your project - -- Publish your project to gh-pages. `npm run deploy`. It should now be visible at http://username.github.io/repo-name - -- Create a [pull request](https://help.github.com/articles/creating-a-pull-request/) to this repository, and in the comment, include a link to your published project. - -- Submit the link to your pull request on Canvas. \ No newline at end of file +5.) A good guide to understanding and implementing Perlin noise: +https://web.archive.org/web/20160510013854/http://freespace.virgin.net/hugo.elias/models/m_perlin.htm diff --git a/adam.jpg b/adam.jpg index a762190..c341995 100644 Binary files a/adam.jpg and b/adam.jpg differ diff --git a/adam1.jpg b/adam1.jpg new file mode 100755 index 0000000..a762190 Binary files /dev/null and b/adam1.jpg differ diff --git a/adam2.jpg b/adam2.jpg new file mode 100644 index 0000000..b542138 Binary files /dev/null and b/adam2.jpg differ diff --git a/explosion.png b/explosion.png new file mode 100644 index 0000000..797b65c Binary files /dev/null and b/explosion.png differ diff --git a/index.html b/index.html old mode 100644 new mode 100755 index f775186..444a38b --- a/index.html +++ b/index.html @@ -14,6 +14,7 @@ + \ No newline at end of file diff --git a/package.json b/package.json old mode 100644 new mode 100755 diff --git a/src/framework.js b/src/framework.js old mode 100644 new mode 100755 index 9cfcd1b..d5d31e7 --- a/src/framework.js +++ b/src/framework.js @@ -4,6 +4,33 @@ const OrbitControls = require('three-orbit-controls')(THREE) import Stats from 'stats-js' import DAT from 'dat-gui' +//Sound Global Variables +var audio; +var analyser; +var frequencyData; + +window.onload = function() { + var ctx = new AudioContext(); + audio = document.getElementById('myAudio'); + var audioSrc = 
ctx.createMediaElementSource(audio); + analyser = ctx.createAnalyser(); + // we have to connect the MediaElementSource with the analyser + audioSrc.connect(analyser); + audioSrc.connect(ctx.destination); + // we could configure the analyser: e.g. analyser.fftSize (for further info, read the spec) + + // frequencyBinCount tells you how many values you'll receive from the analyser + frequencyData = new Uint8Array(analyser.frequencyBinCount); + + // we're ready to receive some data! + // loop + function renderFrame() { + requestAnimationFrame(renderFrame); + + } + //audio.play(); +}; + // when the scene is done initializing, the function passed as `callback` will be executed // then, every frame, the function passed as `update` will be executed function init(callback, update) { @@ -52,10 +79,17 @@ function init(callback, update) { framework.scene = scene; framework.camera = camera; framework.renderer = renderer; - + framework.audio = audio; + // begin the animation loop (function tick() { stats.begin(); + // update data in frequencyData + analyser.getByteFrequencyData(frequencyData); + // render frame based on values in frequencyData + framework.frequencyData = frequencyData; + + update(framework); // perform any requested updates renderer.render(scene, camera); // render the scene stats.end(); @@ -72,4 +106,5 @@ export default { } export const PI = 3.14159265 -export const e = 2.7181718 \ No newline at end of file +export const e = 2.7182818 + diff --git a/src/main.js b/src/main.js old mode 100644 new mode 100755 index 92b19a4..4073602 --- a/src/main.js +++ b/src/main.js @@ -3,6 +3,43 @@ const THREE = require('three'); // older modules are imported like this. 
You sho import Framework from './framework' import Noise from './noise' import {other} from './noise' +import DAT from 'dat-gui' + +var adamMaterial = new THREE.ShaderMaterial({ + uniforms: { + image: { // Check the Three.JS documentation for the different allowed types and values + type: "t", + value: THREE.ImageUtils.loadTexture('./explosion.png') + }, + time: { + type: "f", + value: 1.0 + }, + persistance_p: { + type: "f", + value: 0.5 + }, + audData: { + type: "iv1", + value: new Array + } + + }, + vertexShader: require('./shaders/adam-vert.glsl'), + fragmentShader: require('./shaders/adam-frag.glsl') + }); + +var timer ={ + speed: 0.03 + } + +var persist = { + persistance: 1.13 +} + +var audToggle = { + AudioToggle: false +} // called after the scene loads function onLoad(framework) { @@ -11,23 +48,16 @@ function onLoad(framework) { var renderer = framework.renderer; var gui = framework.gui; var stats = framework.stats; - + var audio = framework.audio; + //var data = framework.frequencyData; + // LOOK: the line below is synyatic sugar for the code above. Optional, but I sort of recommend it. 
- // var {scene, camera, renderer, gui, stats} = framework; + // var {scene, camera, renderer, gui, stats} = framework; // initialize a simple box and material - var box = new THREE.BoxGeometry(1, 1, 1); - - var adamMaterial = new THREE.ShaderMaterial({ - uniforms: { - image: { // Check the Three.JS documentation for the different allowed types and values - type: "t", - value: THREE.ImageUtils.loadTexture('./adam.jpg') - } - }, - vertexShader: require('./shaders/adam-vert.glsl'), - fragmentShader: require('./shaders/adam-frag.glsl') - }); + //var box = new THREE.BoxGeometry(1, 1, 1); + var box = new THREE.IcosahedronGeometry(1,5); + var adamCube = new THREE.Mesh(box, adamMaterial); // set camera position @@ -41,11 +71,37 @@ function onLoad(framework) { gui.add(camera, 'fov', 0, 180).onChange(function(newVal) { camera.updateProjectionMatrix(); }); + + gui.add(timer, 'speed', 0,0.05, 0.001).onChange(function(newVal1){ + timer.speed = newVal1; + }); + + gui.add(persist, 'persistance', 0,2).onChange(function(newVal2){ + persist.persistance = newVal2; + }); + + gui.add(audToggle, 'AudioToggle').onChange(function(newVal3){ + //audToggle.AudioToggle = newVal3; + if(newVal3 === true) audio.play(); + else audio.pause(); + }); + } + + + //var gui1 = new DAT.GUI(); + + // called on frame updates function onUpdate(framework) { // console.log(`the time is ${new Date()}`); + + adamMaterial.uniforms.time.value += timer.speed; + adamMaterial.uniforms.persistance_p.value = persist.persistance; + adamMaterial.uniforms.audData.value = Int32Array.from(framework.frequencyData); + //console.log(framework.frequencyData); + } // when the scene is done initializing, it will call onLoad, then on frame updates, call onUpdate diff --git a/src/noise.js b/src/noise.js old mode 100644 new mode 100755 diff --git a/src/shaders/adam-frag.glsl b/src/shaders/adam-frag.glsl old mode 100644 new mode 100755 index 5dfa18c..b150c9a --- a/src/shaders/adam-frag.glsl +++ b/src/shaders/adam-frag.glsl @@ -1,13 
+1,18 @@ varying vec2 vUv; -varying float noise; +varying float n; uniform sampler2D image; - +varying vec3 col; +varying vec3 nor; +varying float s; void main() { - vec2 uv = vec2(1,1) - vUv; - vec4 color = texture2D( image, uv ); + vec2 uv = vec2(1,1) - vUv * cos(n); + vec4 color = texture2D( image, uv * sin(n) + s); gl_FragColor = vec4( color.rgb, 1.0 ); + + //gl_FragColor = vec4(abs(nor.rgb), 1.0); + //gl_FragColor = vec4(col.rgb, 1.0); } \ No newline at end of file diff --git a/src/shaders/adam-vert.glsl b/src/shaders/adam-vert.glsl old mode 100644 new mode 100755 index e4b8cc0..14f9303 --- a/src/shaders/adam-vert.glsl +++ b/src/shaders/adam-vert.glsl @@ -1,6 +1,118 @@ - +varying float n; varying vec2 vUv; +varying vec3 nor; +varying vec3 col; +uniform float time; +uniform float persistance_p; +uniform int audData[1000]; + +varying float s; + +float Noise3D(int x, int y, int z) +{ + float ft = fract(sin(dot(vec3(x,y,z), vec3(12.989, 78.233, 157))) * 43758.5453); + //int a = int(ft); + return ft; +} + + +float SmoothNoise3D(int X, int Y, int Z) +{ + float far = (Noise3D(X-1, Y+1, Z+1) + Noise3D(X+1, Y+1, Z+1) + Noise3D(X-1, Y+1, Z-1) + Noise3D(X+1, Y+1, Z-1) + Noise3D(X-1, Y-1, Z+1) + Noise3D(X+1, Y-1, Z+1) + Noise3D(X-1, Y-1, Z-1) + Noise3D(X+1, Y-1, Z-1)) / 64.0;//80.0; + + float medium = (Noise3D(X-1, Y+1, Z) + Noise3D(X+1, Y+1, Z) + Noise3D(X-1, Y-1, Z) + Noise3D(X+1, Y-1, Z) + Noise3D(X, Y+1, Z+1) + Noise3D(X, Y+1, Z-1) + Noise3D(X, Y-1, Z+1) + Noise3D(X, Y-1, Z-1) + Noise3D(X-1, Y, Z+1) + Noise3D(X+1, Y, Z+1) + Noise3D(X-1, Y, Z-1) + Noise3D(X+1, Y, Z-1)) / 32.0;//60.0; + + float closest = (Noise3D(X-1, Y, Z) + Noise3D(X+1, Y, Z) + Noise3D(X, Y-1, Z) + Noise3D(X, Y+1, Z) + Noise3D(X, Y, Z+1) + Noise3D(X, Y, Z-1)) / 16.0;//19.999; + + float self = Noise3D(X, Y, Z) / 4.0; + + + return self + closest + medium + far; +} + + +float Interpolate(float a, float b, float x) +{ + float t = (1.0 - cos(x * 3.14159)) * 0.5; + + return a * (1.0 - t) + b * t; +} + 
+float InterpolateNoise3D(float x, float y, float z) +{ + int int_X = int(x); + int int_Y = int(y); + int int_Z = int(z); + + float float_X = x - float(int_X); + float float_Y = y - float(int_Y); + float float_Z = z - float(int_Z); + + //8 points on the lattice surrounding the given point + float p1 = SmoothNoise3D(int_X, int_Y, int_Z); + float p2 = SmoothNoise3D(int_X + 1, int_Y, int_Z); + float p3 = SmoothNoise3D(int_X, int_Y + 1, int_Z); + float p4 = SmoothNoise3D(int_X + 1, int_Y + 1, int_Z); + float p5 = SmoothNoise3D(int_X, int_Y, int_Z + 1); + float p6 = SmoothNoise3D(int_X + 1, int_Y, int_Z + 1); + float p7 = SmoothNoise3D(int_X, int_Y + 1, int_Z + 1); + float p8 = SmoothNoise3D(int_X + 1, int_Y + 1, int_Z + 1); + + float i1 = Interpolate(p1, p2, float_X); + float i2 = Interpolate(p3, p4, float_X); + float i3 = Interpolate(p5, p6, float_X); + float i4 = Interpolate(p7, p8, float_X); + + float n1 = Interpolate(i1, i2, float_Y); + float n2 = Interpolate(i3, i4, float_Y); + + float t1 = Interpolate(n1, n2, float_Z); + + return t1; +} + + +float Generate_Noise3D(vec3 pos, float persistance, int octaves) +{ + float total = 0.0; + float p = persistance; + int n = octaves; + + //GLSL ES 1.0 requires a constant loop bound, so only 4 octaves are summed regardless of `octaves` + for(int i=0; i < 4; i++) + { + float frequency = pow(float(2), float(i)); + float amplitude = pow(p, float(i)); + + total = total + InterpolateNoise3D((pos.x + time )* frequency, (pos.y + time) * frequency, (pos.z + time) * frequency) * amplitude; + + } + + return total; +} + void main() { vUv = uv; - gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 ); + nor = normal; + + int index = 200; + float sound = float(audData[index]) / float(255); + s = sound; + float noise = Generate_Noise3D(position, persistance_p, 8); + n = noise; + + + //float red = vec3(1.0,0.0,0.0); + //float white = vec3(0.5,0.5,0.5); + //float tcol = vec3(0.0,0.0,0.0); + + //manipulating the output colors + //tcol = red * (1.0 - noise) + white * noise; + //col = mix(normal, vec3(noise * 
(1.0 - sound) + normal.x, noise * (1.0 - sound) + normal.x, noise * (1.0 - sound) + normal.x), sound); + + //manipulating the position + vec3 pos_new; + pos_new = position * 10.0 * sound + ((noise * normal)); + + gl_Position = projectionMatrix * modelViewMatrix * vec4( pos_new, 1.0 ); } \ No newline at end of file diff --git a/webpack.config.js b/webpack.config.js old mode 100644 new mode 100755
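For reference, the audio-to-displacement path this diff adds (analyser bins → `audData` uniform → vertex offset in `adam-vert.glsl`) can be summarized in plain JavaScript. This is an illustrative sketch, not the project's code; `binToLevel` and `displaceVertex` are my own names, and the shader's fixed index 200 assumes the analyser produces more than 200 bins (true for the default fftSize of 2048):

```javascript
// Normalize one analyser frequency bin (a Uint8Array byte in [0, 255])
// to [0, 1], mirroring `float sound = float(audData[index]) / float(255);`.
function binToLevel(frequencyData, index) {
  if (index < 0 || index >= frequencyData.length) return 0; // guard missing bins
  return frequencyData[index] / 255;
}

// Displace one vertex the way adam-vert.glsl does:
//   pos_new = position * 10.0 * sound + noise * normal
// position and normal are [x, y, z] arrays here.
function displaceVertex(position, normal, noise, sound) {
  return position.map((p, i) => p * 10 * sound + noise * normal[i]);
}
```

Note that with `sound = 0` (audio paused) the position term vanishes entirely and only the noise displacement along the normal remains, which is why the mesh changes shape so drastically when the music toggle flips.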