Sonic Design / Weekly Note (Class / Lecture) and Progress

Week 01

This week, we had our first lecture on Sonic Design. The lecture introduced the module on the MIB, gave us a brief overview of our assignment, and provided some basic guidance on how to use Adobe Audition. We were also assigned an exercise.


A brief note on the assignment idea provided in today's lecture.

Project 1: Exercise only

Project 2 (2–3 minutes)

  • No need to record.
  • Demonstrate near and far, left and right positioning.
  • Expression is allowed.
  • Use no music, or only short pieces of music.
  • Avoid using sounds that tell a story or include words.

Project 3 (2–3 minutes)

  • Record voice.
  • Focus on recording clean and professional sound.
  • Record a fairy-tale (can use PPT, no need for animation).
  • Focus on voice only.
  • Background music should match the scene (you will need to download music and sound effects).
  • You can have others record the voice (as long as it is a human voice).

Final

  • Bring gameplay to life using sound effects.
  • 100% of the sound effects must be recorded by you.

Starting Exercise 01

In today's class, our lecturer also taught us how to set up Adobe Audition and how to start the first exercise of Assignment 01.

Here are some notes:
How to add a new track for multitrack 

How to create a Parametric Equalizer

What the Parametric Equalizer looks like at the start

What the multitrack editor looks like


Weekly Reflection - In this first class, I was introduced to Adobe Audition and to sonic design. Both were quite new to me, but over the course of the class I gained a greater interest in what we will be doing next. I found it fascinating to adjust the bass and treble in the parametric equalizer.



Week 02

Lecture Video:

Lecture Notes:

Human Ear:
  • Outer Ear: Sound is captured and directed into the ear canal.
  • Middle Ear: Contains the paper-thin eardrum and a small, air-filled cavity with three tiny bones (malleus, incus, and stapes).
  • Inner Ear: Consists of the cochlea (the hearing organ), the endolymphatic sac, and the semicircular canals.

Psychoacoustics:
  • Wavelength: The distance between two successive identical points (e.g., two peaks) of a sound wave; see the worked example after this list.
  • Amplitude: The higher the amplitude, the louder the sound.  
  • Frequency: The number of complete cycles in a second; a higher frequency means more cycles per second.
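
Worked example (my own note, assuming sound travels at roughly 343 m/s in air at room temperature): a 440 Hz tone has a wavelength of about 343 ÷ 440 ≈ 0.78 m, while a 20 kHz tone is only about 1.7 cm long, so higher frequencies mean shorter wavelengths.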

Properties of sound:
  • Pitch  
  • Loudness  
  • Timbre: The quality of sound.  
  • Perceived Duration: The perception of speed or duration of sound.  
  • Envelope: The structure of a sound's onset, sustain, and decay.  
  • Spatialization: The location of sound in a space or the placement of sound.

***Sound travels as vibration***
***Sound vibrations are measured in Hz***

Pitch:
  • Vibrations per second = frequency.
  • Fewer vibrations = lower pitch and lower frequency.
  • More vibrations = higher pitch and higher frequency.
  • 1 cycle per second = 1 Hz.

Humans can hear sounds ranging from 20 Hz to 20 kHz. The bass/treble character changes as the frequency (Hz) changes, and you can adjust the Hz to alter the sound (a small sketch follows these notes).
  • 20 Hz to 250 Hz: lows / bass frequencies.
  • 5 kHz to 20 kHz: highs / treble frequencies.
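
To hear the frequency/pitch relationship directly, here is a minimal Python sketch (my own addition, not class material) that generates one tone in the bass band and one in the treble band; the 80 Hz and 8 kHz values are just example frequencies.

```python
# My own sketch (not class material): generate a low (bass) tone and a high (treble)
# tone so the frequency/pitch relationship can be heard directly.
import numpy as np
from scipy.io import wavfile

sample_rate = 48000                                   # samples per second
duration = 2.0                                        # seconds
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)

for freq in (80, 8000):                               # 80 Hz = bass band, 8 kHz = treble band
    tone = 0.3 * np.sin(2 * np.pi * freq * t)         # low amplitude to avoid clipping
    wavfile.write(f"tone_{freq}Hz.wav", sample_rate, (tone * 32767).astype(np.int16))
```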

Tutorial and Practical - Exercise 02 

In today's class, the lecturer gave us a sound and instructed us to modify it based on the environment he specified as part of Exercise 02 on Sound Shaping. During this process, he provided us with a basic understanding of how to listen to and identify different aspects of sound and how sound behaves in various spaces.

Weekly Reflection - This week, the lecturer taught us the basics of how we hear and how vibrations relate to high and low Hz. During the practical exercise, I struggled a lot to get the correct sound, which was quite challenging for me. However, I found it interesting to adjust the bass and treble to change the environment of a sound. It is still hard for me to tell whether a sound fits its environment, so I think I will need more practice to improve my listening skills.



Week 03

Lecture Notes (Sound Design):
***DAW = digital audio workstation (the software we use for sound design)***

Steps we use in sound design (a small sketch combining them follows these notes):
  • Layering
  • Time stretching
  • Pitch Shifting
  • Reversing
  • Mouth it!

Layering

  • Combine two or more sounds and place them on top of each other to create a new sound.
  • Mix sounds together to enhance the overall audio experience.

Time Stretching

  • Extend the duration of a sound without changing its pitch.
  • Useful for adjusting the pacing or tempo without altering the pitch of the sound.
  • Example: In a 30-second commercial, if the pacing is too slow, time-stretching can compress the sound so it fits within the required time frame. However, if a sound is stretched too far (e.g., stretching it out to 60 seconds), it may become distorted or unnatural.

Pitch Shifting

  • Change the pitch of a sound to make it higher or lower without affecting its length.
  • There’s a correlation between pitch and the perceived characteristics of the sound source.
  • Lowering the pitch often creates a slower and deeper sound, while increasing the pitch results in a faster and higher sound.

Reversing

  • Play the audio backward to create a unique, eerie, or unnatural effect.
  • Combined with layering, reversing can produce interesting and experimental sounds.

Mouth It!

  • If you can’t find the exact sound you need, try creating it with your mouth.
  • Sometimes, vocalizing sounds can provide a unique, customized audio element that you wouldn’t find elsewhere.
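
To tie these techniques together, here is a rough Python sketch of the same ideas (my own illustration using the librosa and soundfile libraries, not the Adobe Audition workflow; the file names are made up):

```python
# Rough illustration of the techniques above (my own example, not the Audition
# workflow), using the librosa and soundfile Python libraries. File names are made up.
import librosa
import numpy as np
import soundfile as sf

impact, sr = librosa.load("impact.wav", sr=None)       # base sound
debris, _ = librosa.load("debris.wav", sr=sr)          # second layer at the same sample rate

# Layering: pad to the same length and sum the two sounds.
length = max(len(impact), len(debris))
layered = (np.pad(impact, (0, length - len(impact))) +
           np.pad(debris, (0, length - len(debris))))

# Time stretching: play 1.5x faster without changing the pitch (phase-vocoder based).
stretched = librosa.effects.time_stretch(layered, rate=1.5)

# Pitch shifting: five semitones lower without changing the length.
deeper = librosa.effects.pitch_shift(layered, sr=sr, n_steps=-5)

# Reversing: simply play the samples backwards.
backwards = layered[::-1]

for name, sound in [("layered", layered), ("stretched", stretched),
                    ("deeper", deeper), ("backwards", backwards)]:
    sf.write(f"{name}.wav", np.clip(sound, -1.0, 1.0), sr)
```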

Tutorial and Practical - Exercise 03

In today's class, the lecturer gave us an explosion sound and taught us how to find and apply effects to edit the sound into different variations, such as explosions, firecrackers, and a triple combo sound effect for our Exercise 03.

Weekly Reflection - This week, I explored Adobe Audition further and learned new concepts related to sound design. Doing the exercises helped me become a bit more familiar with Adobe Audition, even though I am not used to it yet and sometimes got a little lost. However, by exploring, I gained a better understanding of how effects work and how they can be used to create different sounds.


Week 04

This week, the lecturer provided a video about Diegetic vs. Non-Diegetic Sound, which was a new concept for me. After watching the video, I found it very interesting, as sound is one of the important elements that can trigger our emotions or feelings while watching movies. I also learned for the first time that movies use different techniques to separate types of sounds.

This Week's Lecture - Video Provided (Lecture Video)

Notes 
Diegetic Elements

Definition: Things that characters can experience within their world.

Diegetic Sound: Refers to any sound that characters within the story world can hear.

  • Zones of Diegetic Sound:
    • Acousmatic Zones: Sound heard but not seen on screen (e.g., bird sounds, off-screen character sounds).
    • Visualized Zone: The source of the sound is visible on screen.
  • Internal Diegetic Sound: 
    • Sounds that represent a character's thoughts or inner experiences, audible to the audience but not to other characters (e.g., a character’s internal monologue).
  • Examples:
    • Environmental sounds like weather and vehicles
    • Sounds from actions (weapons, music within the film’s world)
    • Dialogue among characters
  • Sound Transitions: Sound can move between zones, creating effects by revealing or hiding the sound’s source.

Non-Diegetic Elements

Definition: Elements that only the audience perceives, not the characters.

Non-Diegetic Sound: Sounds not heard by characters, often used to enhance the emotional impact or storytelling.

  • Examples:
    • Sound effects, musical scores
    • Forms of narration
  • Non-Diegetic Visuals:
    • Title cards, visual overlays, clarifications, and other elements outside the characters' world
  • Purpose: 
    • These elements can enhance movement, build emotion, and contribute to the narrative outside of the character's perception.
Trans-Diegetic Sound
  • Definition: 
    • A mix of diegetic and non-diegetic sounds that switch between the two, breaking audience expectations.
  • Examples:
    • Non-diegetic sounds (like background music) suddenly become diegetic (characters acknowledge the music), or vice versa.
    • An actor turning off music that started as non-diegetic, which then becomes diegetic as it's revealed within the story.
Greek Origins
  • The terms stem from Greek origins, with "diegetic" indicating elements that belong to the narrative world and "non-diegetic" those that exist outside it.
Additional Notes
  • Breaking the Rules: 
    • Sometimes, diegetic elements are manipulated to create unexpected outcomes, diverging from audience expectations.
  • Communicative Power: 
    • Both sound and image can communicate story elements and emotions, affecting audience engagement.

Tutorial and Practical - Exercise 04

This week, we explored sound direction from left to right using the panning feature in the multitrack editor. We also visited the sound recording room to learn and gain a better understanding of sound recording techniques.
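
As a rough illustration of the left-to-right movement we practised (my own sketch, not the Audition pan control itself), the snippet below pans a mono tone from the left channel to the right channel over three seconds:

```python
# My own sketch (not the Audition pan control itself): move a mono tone from the left
# channel to the right channel over three seconds using a simple linear pan.
import numpy as np
from scipy.io import wavfile

sample_rate = 48000
t = np.linspace(0, 3.0, sample_rate * 3, endpoint=False)
mono = 0.3 * np.sin(2 * np.pi * 440 * t)               # stand-in source sound

pan = np.linspace(0.0, 1.0, len(mono))                 # 0 = fully left, 1 = fully right
stereo = np.stack([mono * (1.0 - pan), mono * pan], axis=1)   # columns = left, right

wavfile.write("pan_left_to_right.wav", sample_rate, (stereo * 32767).astype(np.int16))
```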

Weekly Reflection - This week, I learned new skills in identifying and creating sound movement from one side to another, which I found very interesting. I also gained some basic knowledge about the facilities in the recording room, which was fascinating to explore.


Week 05

This week, the lecturer showed us past projects from our seniors and gave us some useful tips, such as using color to differentiate between different types of sounds.

Lecture Notes:

  • Hard Limiter: Prevents sound from exceeding a certain level (red line) and helps control amplitude. It should only be used if necessary, as the last step in the process (apply EQ first).
  • Grouping Tracks: To change multiple tracks simultaneously, go to multitrack → track → stereo bus track.
  • Mastering: The final step used to make small adjustments (the final polish).
  • Creating a Sound Environment:
    • Identify the sounds you have.
    • Determine which sound effects are needed to build the environment.
  • After everything is completed, save as a multitrack (save the mix).
  • Submit the file in WAV format (multitrack), not MP3, with a sample rate of 48,000 Hz and 16-bit depth.
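
To make the submission settings concrete, here is a small sketch (my own note; Audition handles this in its export dialog) of what a 48,000 Hz, 16-bit PCM WAV means, using the Python soundfile library:

```python
# My own note on what the submission settings mean (Audition does this in its export
# dialog): writing audio as a 48,000 Hz, 16-bit PCM WAV with the soundfile library.
import numpy as np
import soundfile as sf

sample_rate = 48000                        # 48,000 samples per second
mix = np.zeros(sample_rate * 5)            # placeholder for a 5-second mixdown (silence)

sf.write("final_mix.wav", mix, sample_rate, subtype="PCM_16")   # 16-bit WAV

info = sf.info("final_mix.wav")
print(info.samplerate, info.subtype)       # -> 48000 PCM_16
```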

Assignment 2 Steps:

  1. Decide which project to pursue.
  2. Create a storyline.
  3. Categorize which elements are foreground and which are background.
  4. Once the storyline is complete, determine the sound effects you want to use for each scene.
  5. Plan the audio and extract the sounds you want. Consider that other sounds might overshadow the primary sound you want.
Finish the storyline and list the sound effects needed, then search for those sound effects (first-person perspective) - to do this week (the piece needs to be 2-3 minutes).
***Do not use the same sound more than once.***



Week 06

This week, due to the Diwali holiday, there were no classes, and we focused on working on Assignment 2 on our own.


Week 07

This week, we only needed to consult with the lecturer on Assignment 2 to be marked for attendance; after that, we were free to go.


Week 09

Note

Microphone Types

  1. Dynamic Microphones

    • Durability: Tough and durable, designed for stage and live use.
    • Sensitivity: Less sensitive to small sounds, resistant to handling noise.
    • Power: Does not require external power.
    • Usage:
      • Live performances
      • Loud sound sources like drums, guitar amps, and vocals in noisy environments.
  2. Condenser Microphones

    • Sensitivity: Highly sensitive, captures subtle details and nuances.
    • Fragility: More delicate; can break easily if dropped, affecting sound quality.
    • Structure:
      • Contains a front plate and a back plate that form a capacitor to detect sound.
      • Requires external power (e.g., battery or phantom power).
    • Usage:
      • Studio recordings
      • Vocals and acoustic instruments requiring detailed sound reproduction.
  3. Shotgun Microphones (a type of directional condenser microphone)

    • Directional Pickup: Highly focused on capturing sound directly where it is pointed.
    • Usage:
      • Outdoor recordings
      • Film and video production.
    • Tips:
      • Point the mic toward the sound source for accurate capture.
      • Avoid pointing directly at the mouth; aim at the chest for consistent audio.
      • Misalignment can cause sudden changes in volume and sound quality.

Pickup Patterns (Polar Patterns)

  1. Cardioid

    • Shape: Heart-shaped pattern.
    • Pickup Area: Strong in the front, some sensitivity on the sides, minimal at the back.
    • Usage:
      • Common for vocals and instruments.
      • Ideal for environments with minimal background noise.
  2. Omnidirectional

    • Shape: Captures sound equally from all directions.
    • Usage:
      • Great for capturing ambient sound or group recordings.
      • Suitable for recording in open spaces or nature settings.
  3. Hypercardioid

    • Shape: More focused pickup at the front with slight sensitivity at the back.
    • Usage:
      • Good for isolating sound in noisy environments.
      • Requires the sound source to remain relatively stationary.
  4. Figure-of-Eight (Bidirectional)

    • Shape: Captures sound evenly from the front and back, rejects sound from the sides.
    • Usage:
      • Professional setups like interviews or duets.
      • Often used with two microphones for stereo effects.
  5. Multi-Pattern Microphones:

    • Some microphones allow switching between polar patterns (e.g., cardioid, omnidirectional, figure-of-eight).
    • These are versatile but more expensive.

Connectors

  1. Types of Connectors:

    • Male Connector: Plugs into audio equipment.
    • Female Connector: Receives the male connector.
  2. Balanced vs. Unbalanced:

    • Balanced Connectors:
      • Reduces interference and noise.
      • Ideal for longer cables (e.g., XLR cables).
    • Unbalanced Connectors:
      • Simpler design but prone to interference.
      • Suitable for short distances (e.g., ¼” instrument cables).

Voice Recording Tips

  1. Proximity Effect:

    • Near the Microphone: Enhances bass and captures small details like breathing.
    • Far from the Microphone: Picks up more ambient sound and room acoustics.
    • Find the right distance to balance clarity and tone while minimizing unwanted noise.
  2. Microphone Placement (Different Settings):

    • Studio: Position for clarity and reduce reflections using pop filters and shock mounts.
    • Outdoor: Use directional mics like shotguns and protect against wind with windshields.
    • Live Performance: Dynamic mics placed close to the sound source for focused capture.

Mixer Basics

  1. Channels:

    • Determines how many microphones or instruments can be connected simultaneously.
  2. Types of Mixers:

    • Analog Mixers: Simple and tactile controls.
    • Digital Mixers: More flexible with effects and presets.
    • Hybrid Mixers: Combine analog and digital functionalities.
  3. Usage:

    • Commonly used in studios to control and mix multiple audio inputs.

Tips for Recording

  1. Control the Environment:

    • Minimize background noise by:
      • Turning off air conditioners, fans, and other noisy appliances.
      • Choosing a quiet time of day for recording (e.g., early morning or late night).
  2. Optimize the Recording Space:

    • Use smaller, enclosed spaces such as:
      • A closet or under a blanket.
      • A parked car, which provides good sound insulation.
    • The smaller the space, the better the sound quality due to reduced echo and noise.
  3. Soundproof the Area:

    • Cover the recording space with soft materials like blankets or foam to absorb noise.
    • Build a small, dedicated soundproof room or booth to reduce white noise.
  4. General Tips:

    • Always ensure the microphone and recording area are properly covered to block out unwanted noise.
    • Perform a test recording to check for sound quality before starting.

How to Reduce Unnecessary Noise in Audio Recordings

  1. Identify and Record Background Noise:

    • Record the environment for 3 seconds before starting to capture any unnecessary background noise.
    • Use this sample as a reference for noise reduction.
  2. Noise Reduction Process:

    • Open your audio editing software.
    • Navigate to Effect > Noise Reduction:
      1. Capture Noise Print: Highlight the 3-second noise recording and select "Capture Noise Print."
      2. Apply Noise Reduction: Select the entire recording, then apply the noise reduction process to remove similar background noise.
  3. Remove Unwanted Sounds:

    • After noise reduction, carefully listen to the recording and:
      • Identify and delete sounds like breathing, lip smacking, or any other unwanted noise.
      • Manually adjust the volume to make it consistent if sudden loud sounds are present.
  4. Dynamic Adjustments (Rack Effects):

    • Use the Dynamic Processor (Auto Gate) to filter out low-level unwanted noise and clean up the audio.
  5. Compression:

    • Apply a Compressor to:
      • Balance volume levels and make the sound more consistent.
      • Use the Make-Up Gain setting to amplify softer sounds after compression.
      • Adjust manually first before applying a compressor for finer control.
  6. De-Essing:

    • Use a De-Esser (Rack Effect) to reduce harsh “S” sounds:
      1. Identify the frequency range of the “S” sound.
      2. Apply the De-Esser to smooth out sharp high frequencies.
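
As a rough approximation of the noise-print idea above (my own sketch; Audition's Noise Reduction effect is more sophisticated), the snippet below captures a noise profile from the first 3 seconds of a hypothetical mono recording and subtracts it from the rest:

```python
# Rough approximation of the noise-print idea above (my own sketch; Audition's
# Noise Reduction effect is more sophisticated). Assumes a hypothetical 16-bit mono
# recording whose first 3 seconds contain only background noise.
import numpy as np
from scipy.io import wavfile
from scipy.signal import stft, istft

rate, audio = wavfile.read("voice_take.wav")
audio = audio.astype(np.float32) / 32768.0             # 16-bit PCM -> float in [-1, 1]

# 1. Capture the noise print: average magnitude spectrum of the noise-only section.
noise = audio[: 3 * rate]
_, _, noise_spec = stft(noise, fs=rate, nperseg=1024)
noise_profile = np.mean(np.abs(noise_spec), axis=1, keepdims=True)

# 2. Apply the reduction: subtract the noise profile from every frame's spectrum.
_, _, spec = stft(audio, fs=rate, nperseg=1024)
cleaned_mag = np.maximum(np.abs(spec) - 1.5 * noise_profile, 0.0)   # 1.5 = strength
cleaned_spec = cleaned_mag * np.exp(1j * np.angle(spec))

_, cleaned = istft(cleaned_spec, fs=rate, nperseg=1024)
wavfile.write("voice_take_cleaned.wav", rate,
              (np.clip(cleaned, -1.0, 1.0) * 32767).astype(np.int16))
```
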
Adding a Video Track
Create a multitrack session first, then go to Multitrack > Track > Add Video Track.

Assignment Note
Create a video with a story lasting 3 minutes; background music and sound effects can be added.
Once done, submit the multitrack session and the video. Use another app to combine the audio with the images.

Class Activity Outcome



Week 10

This week, we were provided with a script and tasked with recording an advertisement. The lecturer gave us several options, and I chose the event planning one.

This was the outcome of the class activity.

This is the Script


Activity Outcome



Week 11

  • Brief on assignment 4/Final

Keep Note of the Sound (Prepare/Start and Ending):

  • Example: Arrow
    • Prepare to Pull: Sound of pulling the bowstring.
    • Pull Out: Sound of the string stretching.
    • Landing: Sound of the arrow hitting the target.
    • Final: Completing the action.

Create the Sound One at a Time (Process Slowly):

  • Focus on one sound at a time, completing the process before moving to the next.

Audio Storyboard (Track on Sound):

  • Track the sound and its time of appearance.
  • Show the timeline and indicate when each sound happens.
  • Use color coding to make it easier to follow.
Sound storyboard (layout) - example

Mark Before You Go to the Studio to Record:

  • Plan early.
  • Bring items relevant to the project.

***All sounds must be recorded 100% by you!***

Week 12

Lecture Note

1. Storyboard: Visuals and Sounds

  • Details Required:

    • Clearly describe the visuals and sounds needed for each part of your project.
    • Specify what sound occurs during each action or scene.
    • Be precise and descriptive to ensure clarity in production.
  • Recording Sound for Assignment:

    • Record as many variations of sounds as possible.

    • Tips for Organization:

      • Announce the sound being recorded (e.g., "Glass breaking, Version 1").
      • Record different variables for each sound:
        • Speed: Fast, slow, or moderate.
        • Parts: Different components of the sound (e.g., the initial break, the fall, the echo).
        • Techniques: Experiment with methods (e.g., glass breaking from different heights).
    • Examples:

      • Zip: Record at different speeds.
      • Glass Sound: Capture the sound from different impacts (e.g., side hit vs. full break).
      • Electric Sound: Use tape to vary speeds and effects.
    • Key Reminder: Avoid overusing the same sound; incorporate variations for authenticity and richness.


2. Linear vs. Non-Linear Production

  • Linear Production:

    • Straightforward and direct process.
    • Based on predetermined actions or sequences.
    • Examples:
      • Traditional video storytelling.
      • Processes where the sequence is known and unchanging.
  • Non-Linear Production in Games:

    • Games are not linear because:
      • Outcomes depend on player choices and actions.
      • The narrative evolves dynamically.
    • Event Mapping for Games:
      • Predict possible actions that may occur in different stages.
      • Determine the required sound changes for each event.
      • Plan sound variations based on different game scenarios.

3. The Process of Recording Sound for Video

  • Past Methods:

    • Manual methods like creating sounds with physical objects (e.g., tape for electric sounds).
    • Recording on analog equipment with limited editing capabilities.
  • Present Methods:

    • Digital recording tools and software allow for more flexibility and precision.
    • Use of effects generators (e.g., tone generation tools shown in class).
    • Modern techniques support more realistic sound replication and post-production enhancements.

4. Sound Effects and Variation

  • Generating Effects:

    • Use tone generators and other tools demonstrated in class to create custom effects (a rough sketch of this idea follows these notes).
    • Experiment with different pitches, frequencies, and intensities to suit your project's needs.
  • Importance of Variations:

    • Record as many types and versions of a sound as possible.
    • Examples of variable recordings:
      • Speed: Create fast, medium, and slow variations.
      • Impact: Record sounds from different perspectives (e.g., near vs. far).
  • Final Advice:

    • Label and organize recordings meticulously to streamline the editing process.
    • Ensure a diverse sound library for flexibility during production.
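
As a rough sketch of the tone-generator and variation advice above (my own illustration, not a class tool), the snippet below renders one basic tone at several pitches and intensities and gives each file a descriptive label:

```python
# My own sketch of the tone-generator / variation advice (not a class tool): render one
# basic tone at several pitches and intensities, with a descriptive label per file.
import numpy as np
from scipy.io import wavfile

sample_rate = 48000
duration = 1.0
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)

for freq in (220, 440, 880):                  # different pitches
    for gain in (0.1, 0.3, 0.6):              # different intensities
        tone = gain * np.sin(2 * np.pi * freq * t)
        tone *= np.exp(-3 * t)                # simple decay envelope so it is not a flat beep
        name = f"tone_{freq}Hz_gain{gain}.wav"          # one labelled variation per file
        wavfile.write(name, sample_rate, (tone * 32767).astype(np.int16))
```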
