EGL Assignment 1 Mac OS

For Mac OS X, before you do any coding, you must install the command-line utilities (make, gcc, etc.). Install Xcode from the Mac App Store, then open Xcode and use 'Preferences/Downloads' to install the command line tools. Important: if you are using Mac OS X Mojave, you need to update the OS to the latest version of Mojave; otherwise, OpenGL programs may not run correctly.


FFmpeg filter for applying GLSL transitions between video streams (gl-transitions).

(example crosswarp transition)

Note

If you want an easier solution, I recommend checking out ffmpeg-concat, an npm module and CLI that allows you to concat a list of videos together using a standard build of ffmpeg along with the same sexy OpenGL transitions.

Intro

FFmpeg is the de facto standard in command-line video editing, but it is really difficult to concatenate videos together using non-trivial transitions. Here are some convoluted examples of a simple cross-fade between two videos. FFmpeg filter graphs are extremely powerful, but for implementing transitions, they are just too complicated and error-prone.

GL Transitions, on the other hand, is a great open source initiative spearheaded by Gaëtan Renaudeau that is aimed at using GLSL to establish a universal collection of transitions. Its extremely simple spec makes it really easy to customize existing transitions or write your own as opposed to struggling with complex ffmpeg filter graphs.


This library is an ffmpeg extension that makes it easy to use gl-transitions in ffmpeg filter graphs.

Building

Since this library exports a native ffmpeg filter, you are required to build ffmpeg from source. Don't worry, though -- it's surprisingly straightforward.

Dependencies

First, you need to install a few dependencies. On Mac OS this is very straightforward. On Linux and Windows there are two options: using EGL or not using EGL. The main advantage of using EGL is that it is easier to run in headless environments.

Mac OS

  • GLEW
  • glfw3

Mac OS users should follow instructions for not using EGL.

Linux with EGL

We default to EGL rather than GLX on Linux to make it easier to run headless, so xvfb is no longer needed.

  • glvnd >= 1.0 (building from source)
  • mesaGL >= 1.7, mesaGLU >= 1.7


  • GLEW >= 2.0 (building from source)

Linux without EGL

If you don't want to use EGL, just comment out this line in vf_gltransition.c

  • GLEW
  • glfw (building from source)

On headless environments without EGL, you'll also need to install xvfb.

Building ffmpeg

Non-EGL:

EGL:
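The build commands for both variants were stripped from this copy of the README. The following is a hedged sketch covering both: the repository URL, the location of the filter source, and the exact configure flags are assumptions based on a typical out-of-tree filter build, so check the project's install notes and your pkg-config output before running it.

```shell
# Sketch only: paths and flags below are assumptions -- adapt to your setup.
git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg
cd ffmpeg

# Copy the filter into ffmpeg's source tree. The project also requires
# registering the filter in libavfilter/Makefile and allfilters.c;
# see its install notes for the exact edits.
cp ../ffmpeg-gl-transition/vf_gltransition.c libavfilter/

# Non-EGL build (Mac OS, or Linux/Windows without EGL):
./configure --enable-libx264 --enable-gpl --enable-opengl \
            --enable-filter=gltransition \
            --extra-libs='-lGLEW -lglfw'

# EGL build (Linux, headless-friendly) -- uncomment instead of the above:
# ./configure --enable-libx264 --enable-gpl --enable-opengl \
#             --enable-filter=gltransition \
#             --extra-libs='-lGLEW -lEGL'

make -j4
```

As the notes below point out, the library names passed to --extra-libs vary by platform, so verify them with pkg-config.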

Notes:

  • See the official ffmpeg compilation guide for help building ffmpeg on your platform. I've thoroughly tested this filter on macOS Sierra (macOS compilation guide).
  • Depending on your platform, there may be slight variations in how GLEW and glfw are named (with regard to --extra-libs, above), e.g. -lglew or -lglfw3 - check pkg-config.
  • The above example builds a minimal ffmpeg binary with libx264, but there's nothing codec-specific about the filter itself, so feel free to add or remove any of ffmpeg's bells and whistles.

Here's an example of a more full-featured build configuration:

You can verify that the gltransition filter is available via:
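For instance, assuming the freshly built binary sits in the current directory, grepping ffmpeg's filter list is one way to check:

```shell
# List all registered filters and look for gltransition.
./ffmpeg -v 0 -filters 2>/dev/null | grep gltransition
```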

Usage

Default Options:

Custom Options:
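The invocations themselves were lost in formatting; here is a sketch of both. The input filenames are placeholders, and crosswarp.glsl stands for whatever gl-transition source file you have downloaded.

```shell
# Default options: basic crossfade between two inputs (placeholder filenames).
./ffmpeg -i 0.mp4 -i 1.mp4 -filter_complex gltransition out.mp4

# Custom options: 4s transition starting at 1.5s, using a downloaded
# gl-transition source file (path is a placeholder).
./ffmpeg -i 0.mp4 -i 1.mp4 \
  -filter_complex "gltransition=duration=4:offset=1.5:source=crosswarp.glsl" \
  out.mp4
```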

Params:

  • duration (optional float; default=1) length in seconds for the transition to last. Any frames outputted after this point will pass through the second video stream untouched.
  • offset (optional float; default=0) length in seconds to wait before beginning the transition. Any frames outputted before this point will pass through the first video stream untouched.
  • source (optional string; defaults to a basic crossfade transition) path to the gl-transition source file. This text file must be a valid gl-transition filter, exposing a transition function. See here for a list of glsl source transitions or the gallery for a visual list of examples.

Note that both duration and offset are relative to the start of this filter invocation, not global time values.

Examples


See concat.sh for a more complex example of concatenating three mp4s together with unique transitions between them.

For any non-trivial concatenation, you'll likely want to make a filter chain comprised of split, trim + setpts, and concat (with the v for video option) filters in addition to the gltransition filter itself. If you want to concat audio streams in the same pass, you'll need to additionally make use of the asplit, atrim + asetpts, and concat (with the a for audio option) filters.

There is no limit to the number of video streams you can concat together in one filter graph, but beyond a couple of streams, you'll likely want to write a wrapper script, as the required stream preprocessing gets unwieldy very fast. See here for a more understandable example of concatenating two 5-second videos together with a 1s fade in between. See here for a more complex example including audio stream concatenation.
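The split / trim + setpts / gltransition / concat chain described above can be sketched as follows for two 5-second inputs with a 1-second transition. Stream labels, filenames, and trim points are illustrative, not the project's exact script:

```shell
# Each input is split into a "pass-through" part and the 1s slice that
# participates in the transition; gltransition consumes the two slices,
# and concat stitches the three pieces back together.
./ffmpeg -i first.mp4 -i second.mp4 -filter_complex "
  [0:v]split[v0a][v0b];
  [v0a]trim=0:4,setpts=PTS-STARTPTS[begin];
  [v0b]trim=4:5,setpts=PTS-STARTPTS[fadeout];
  [1:v]split[v1a][v1b];
  [v1a]trim=0:1,setpts=PTS-STARTPTS[fadein];
  [v1b]trim=1:5,setpts=PTS-STARTPTS[end];
  [fadeout][fadein]gltransition=duration=1[transition];
  [begin][transition][end]concat=n=3:v=1:a=0[outv]" \
  -map '[outv]' out.mp4
```

Note the setpts=PTS-STARTPTS after each trim: concat and gltransition expect each segment's timestamps to start at zero.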

Todo

  • simplify filter graph required to achieve multi-file concat in concat.sh
  • support default values for gl-transition uniforms
    • this is the reason a lot of gl-transitions currently appear to not function properly
  • remove restriction that both inputs be the same size
  • support general gl-transition uniforms
  • add gl-transition logic for aspect ratios and resize mode
  • transpile WebGL GLSL to OpenGL GLSL via ANGLE

Related

  • ffmpeg-concat - Concats a list of videos together using ffmpeg with sexy OpenGL transitions. This module and CLI are easier to use than the lower-level custom filter provided by this library.
  • Excellent example ffmpeg filter for applying a GLSL shader to each frame of a video stream. Related blog post and follow-up post.
  • gl-transitions and original github issue.
  • Similar project that attempts to use frei0r and MLT instead of extending ffmpeg directly.
  • FFmpeg filter guide.
  • awesome-ffmpeg - A curated list of awesome ffmpeg resources with a focus on JavaScript.

License

MIT © Travis Fischer

Note: this is the 2017 version of this assignment.

In this assignment you will practice putting together a simple image classification pipeline, based on the k-Nearest Neighbor or the SVM/Softmax classifier. The goals of this assignment are as follows:

  • understand the basic Image Classification pipeline and the data-driven approach (train/predict stages)
  • understand the train/val/test splits and the use of validation data for hyperparameter tuning.
  • develop proficiency in writing efficient vectorized code with numpy
  • implement and apply a k-Nearest Neighbor (kNN) classifier
  • implement and apply a Multiclass Support Vector Machine (SVM) classifier
  • implement and apply a Softmax classifier
  • implement and apply a Two-layer neural network classifier
  • understand the differences and tradeoffs between these classifiers
  • get a basic understanding of performance improvements from using higher-level representations than raw pixels (e.g. color histograms, Histogram of Gradient (HOG) features)


Setup


You can work on the assignment in one of two ways: locally on your own machine, or on a virtual machine on Google Cloud.

Working remotely on Google Cloud (Recommended)

Note: after following these instructions, make sure you go to Download data below (you can skip the Working locally section).

As part of this course, you can use Google Cloud for your assignments. We recommend this route for anyone who is having trouble with installation set-up, or if you would like to use better CPU/GPU resources than you may have locally. Please see the set-up tutorial here for more details. :)

Working locally

Get the code as a zip file here. As for the dependencies:

Installing Python 3.5+: To use python3, make sure to install version 3.5 or 3.6 on your local machine. If you are on Mac OS X, you can do this using Homebrew with brew install python3. You can find instructions for Ubuntu here.

Virtual environment: If you decide to work locally, we recommend using a virtual environment for the project. If you choose not to use a virtual environment, it is up to you to make sure that all dependencies for the code are installed globally on your machine. To set up a virtual environment, run the following:
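The commands were lost in formatting; a typical setup might look like the following. The .env name matches the note about re-activation, and it is assumed the starter code ships a requirements.txt listing its dependencies:

```shell
cd assignment1
sudo pip install virtualenv        # may already be installed
virtualenv -p python3 .env         # create a virtual environment named .env
source .env/bin/activate           # activate the virtual environment
pip install -r requirements.txt    # install dependencies (assumes the starter
                                   # code includes a requirements.txt)
# ... work on the assignment ...
deactivate                         # exit the virtual environment when done
```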

Note that every time you want to work on the assignment, you should run source .env/bin/activate (from within your assignment1 folder) to re-activate the virtual environment, and deactivate again whenever you are done.

Download data:

Once you have the starter code (regardless of which method you choose above), you will need to download the CIFAR-10 dataset.Run the following from the assignment1 directory:
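The download command was stripped from this copy; the script name and path below are assumptions based on past offerings of this assignment, so check your starter code for the actual dataset script:

```shell
# Run from the assignment1 directory.
cd cs231n/datasets
./get_datasets.sh    # fetches and unpacks CIFAR-10
cd ../..
```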

Start IPython:

After you have the CIFAR-10 data, you should start the IPython notebook server from the assignment1 directory with the jupyter notebook command. (See the Google Cloud Tutorial for any additional steps you may need for setting this up, if you are working remotely.)

If you are unfamiliar with IPython, you can also refer to our IPython tutorial.

Some Notes

NOTE 1: This year, the assignment1 code has been tested to be compatible with python versions 2.7, 3.5, 3.6 (it may work with other versions of 3.x, but we won't be officially supporting them). You will need to make sure that during your virtualenv setup that the correct version of python is used. You can confirm your python version by (1) activating your virtualenv and (2) running which python.

NOTE 2: If you are working in a virtual environment on OSX, you may potentially encounter errors with matplotlib due to the issues described here. In our testing, it seems that this issue is no longer present with the most recent version of matplotlib, but if you do end up running into this issue you may have to use the start_ipython_osx.sh script from the assignment1 directory (instead of jupyter notebook above) to launch your IPython notebook server. Note that you may have to modify some variables within the script to match your version of python/installation directory. The script assumes that your virtual environment is named .env.

Submitting your work:

Whether you work on the assignment locally or using Google Cloud, once you are done working run the collectSubmission.sh script; this will produce a file called assignment1.zip. Please submit this file on Canvas.

Q1: k-Nearest Neighbor classifier (20 points)

The IPython Notebook knn.ipynb will walk you through implementing the kNN classifier.

Q2: Training a Support Vector Machine (25 points)

The IPython Notebook svm.ipynb will walk you through implementing the SVM classifier.

Q3: Implement a Softmax classifier (20 points)

The IPython Notebook softmax.ipynb will walk you through implementing the Softmax classifier.

Q4: Two-Layer Neural Network (25 points)

The IPython Notebook two_layer_net.ipynb will walk you through the implementation of a two-layer neural network classifier.

Q5: Higher Level Representations: Image Features (10 points)

The IPython Notebook features.ipynb will walk you through this exercise, in which you will examine the improvements gained by using higher-level representations as opposed to using raw pixel values.

Q6: Cool Bonus: Do something extra! (+10 points)

Implement, investigate or analyze something extra surrounding the topics in this assignment, and using the code you developed. For example, is there some other interesting question we could have asked? Is there any insightful visualization you can plot? Or anything fun to look at? Or maybe you can experiment with a spin on the loss function? If you try out something cool we'll give you up to 10 extra points and may feature your results in the lecture.




