

Dynamic recordings: record a DMX source using the five-pin XLR ports, or Art-Net or sACN data via the network port. Start and stop recording manually (perfect for simple, non-looping sequences), or configure it precisely using remote triggers via the S-PLAY's web-browser home page (for more complex or seamlessly looping shows).

Effects: create basic pixel effects, with BPM, colour and speed configurable for a range of pixel types over a user-defined time period.

Static scenes: individually add or bulk-assign DMX values to channels, or simply record them from a DMX source.

The S-PLAY's internal and removable storage options allow complete flexibility to accommodate incredibly complex shows. Save your recordings using the S-PLAY's impressive 8GB internal memory (over a week's worth of 32-universe playback!), or expand your storage capacity with a micro SD card via the handy slot on the front of the unit. ENTTEC recommends Class 10 cards (minimum write speed of 10MB/s or higher) for best performance and has successfully tested cards of up to 64GB capacity.

With easy access to your entire cue library within the web browser, simply drop your dynamic recordings, static scenes, effects and gradients onto the timeline to create a playlist, then drag each cue around to set start and end times, or transitions such as fading in and out. Group multiple cues together in a sequence to be stored and played back at will. The S-PLAY's engine also allows multiple playlists to be triggered at the same time with varying priority levels, or only one playlist to be active per group.

With support for the OSC, RS232, UDP, DMX, Art-Net and sACN protocols, plus digital contact-closure inputs, you can trigger the S-PLAY from a number of external devices. Use the four dry contact inputs (GPIO) with sources as varied as a switch on the wall or a BMS dry contact relay.
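To make the protocol support above concrete, here is a minimal sketch of building an Art-Net ArtDmx packet in Python and sending it over UDP. The target IP address and channel values are placeholder assumptions, not taken from any S-PLAY documentation; a real deployment should follow the Art-Net specification.

```python
import socket
import struct

ARTNET_PORT = 6454  # standard Art-Net UDP port

def build_artdmx(universe: int, dmx: bytes) -> bytes:
    """Build a minimal ArtDmx packet (OpDmx opcode, protocol version 14)."""
    if not 1 <= len(dmx) <= 512:
        raise ValueError("DMX payload must be 1-512 bytes")
    header = b"Art-Net\x00"                 # fixed 8-byte ID string
    opcode = struct.pack("<H", 0x5000)      # OpDmx, little-endian
    proto = struct.pack(">H", 14)           # protocol version, big-endian
    seq_phys = bytes([0, 0])                # sequence disabled, physical port 0
    uni = struct.pack("<H", universe)       # 15-bit port address, little-endian
    length = struct.pack(">H", len(dmx))    # data length, big-endian
    return header + opcode + proto + seq_phys + uni + length + dmx

if __name__ == "__main__":
    # Hypothetical example: first three channels at full on universe 0.
    packet = build_artdmx(0, bytes([255, 255, 255] + [0] * 509))
    # To actually transmit (placeholder IP for a node on the local network):
    # socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("192.168.0.10", ARTNET_PORT))
```

The same socket approach works for plain UDP triggers; only the payload format differs.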

You can even remotely access your S-PLAY from anywhere in the world via the internet or a 4G router! With countless applications, from theatrical and entertainment events to architectural design and smart-home integrations, the S-PLAY is the dream companion for any automation project. You can also create your own custom web interfaces, allowing you to present your end users with as much or as little control as they need. Plug-and-play show-control technology has never looked so good or been this easy to master or extend! With the multi-unit sync feature, you can link and control additional S-PLAYs from a single interface, making it a breeze to expand your universe count.
Smart player machine generator
With a built-in effects generator and intuitive onboard controls to record, edit, schedule and play scenes, you don't have to be an experienced lighting designer or systems integrator to conjure up amazing shows using our smart light-show controller, but your audience will think you are. In simple terms, think of it as a scene and animated-effect creation engine, a 32-universe DMX recorder and playback engine, and a host for control interfaces that can send multiple AV-specific protocols, all rolled into one.

The emotion-detection player has its own requirements and usage steps. OpenCV version: 3.4.3 (the full OpenCV module is required; the Fisherface module is a must). Chrome browser is needed (the eel library is specifically designed for Chrome). Download all the files into a folder and type the command 'python capture.py' in a terminal. A window will open in the Chrome browser with the player's interface. Select emotion mode from the bottom-right corner. Your face will be scanned at the end of the currently playing song; you can manually move the song controller near the end to start the function sooner. When an emotion is detected, you can see its name in the open terminal.
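The step after detection (queueing a track that matches the emotion) can be sketched with the stdlib modules the project lists (glob, os, random). The folder layout and function name below are hypothetical illustrations, not taken from the project's actual code.

```python
import glob
import os
import random

def pick_song(emotion: str, library_root: str = "songs") -> str:
    """Pick a random track from a folder named after the detected emotion.

    Assumes a hypothetical layout like songs/happy/*.mp3, songs/sad/*.mp3.
    """
    pattern = os.path.join(library_root, emotion, "*.mp3")
    tracks = glob.glob(pattern)
    if not tracks:
        raise FileNotFoundError(f"no tracks found for emotion {emotion!r}")
    return random.choice(tracks)
```

A front end like the project's eel interface could then pass the chosen path back to the browser for playback.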
Smart player machine download
Please make sure you have the following on your machine. Python version: 3.6.5 (try to install all required Python modules; important modules: glob, os, numpy, random, argparse, time). Note: I installed Python using Anaconda.
Smart player machine code
This is a project that uses machine learning to detect emotions from users' facial expressions. The interface is built with HTML, CSS and JS, and the main code is in Python. The code was developed on Ubuntu Linux with eel, OpenCV and Python installed; for running it on Windows or Mac, certain path changes are required.
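The usual fix for the path changes mentioned above is to build paths with os.path.join or pathlib instead of hard-coding Linux-style separators, so the same code runs on Windows and Mac. A small illustration (the file names here are hypothetical):

```python
import os
from pathlib import Path

# Fragile: a hard-coded separator assumes Linux-style paths.
linux_only = "web/index.html"

# Portable: the standard library inserts the right separator per platform.
portable = os.path.join("web", "index.html")

# pathlib equivalent, with "/" overloaded to join path components.
portable_path = Path("web") / "index.html"
```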
