A cam running ML but without a maintainer is called blindly maintained. This may happen when ML was ported to the cam some time ago and the former maintainer is no longer working on it.
Such camera models still get updates when new builds are published, but there may be nobody feeling responsible for basic testing and quality control. Bugs may get introduced and remain unnoticed until a user stumbles across one and files a bug report.
A bootable card is an essential part of ML's startup process. During installation some information is written into a hidden section of your installation medium.
If a card is formatted outside the cam, or formatted in a cam not actively running ML, this information is lost. Boot ability is not affected by deleting files or directories.
To make additional cards bootable, it is recommended to redo the ML installation for each card.
Some information is written to and stored in the cam during ML installation. It is only deleted by a proper uninstallation.
A cam with the bootflag set will check whether a bootable card is present in the cam's card slot(s). This is an essential part of ML's startup process.
Canon introduced Basic Scripting support in the EOS camera line-up with DiGiC 8. It is available for all EOS cams hosting DiGiC 8 and X processors. The same functionality has long been part of PowerShot cameras (there for older DiGiC generations, too).
It makes a dev's life a lot easier by allowing the cam's bootflag to be set with a few simple and portable lines of code. And other things, too. If you ever come across a statement like “Canon is making ML development harder by introducing locked-down cameras”: this is proof of the very opposite!
Our friends and frequent contributors from the Canon Hack Development Kit (CHDK) enhance PowerShot digital cameras with features not implemented by Canon. Similar, in some aspects, to what the ML project does with EOS.
There are some major differences in how PowerShot and EOS cameras have to be programmed, which is why there are two different project teams.
It may confuse some people that a few EOS M cameras are handled not by ML but by CHDK. And there are two PowerShot cameras located not in CHDK's realm but in ML's. Those cameras run code contrary to their names!
The EOS M3, M5, M6, M10 and M100 run PowerShot firmware and are therefore handled by CHDK.
The PowerShot SX70 and SX740 run EOS firmware and can be ported to ML.
Developer lingo. Changes to ML code require a review process. The first part is a so-called pull request. Designated devs inspect the code changes. If those devs think the code is valid and it makes sense to integrate it into ML, it will be “committed”. After changes are committed, new builds are generated within 24 hours or less.
Compression is a method to minimize the data space required. For example: a string like “Abba Abba Abba Abba ” (= 20 characters) can be “compressed” into the info “4*Abba ” (= 7 characters). Of course, real-life computer compression methods work quite differently!
There are two different kinds of compressions:
Lossless: Data output after compression/decompression is exactly the same as data input. No information is lost! The process is reversible each and every time!
This kind of compression is mandatory for data backup (for example). This method is also used for Canon's CR2 data format. You may be aware that CR2 files from your camera differ in size. But all CR2 files contain data for the same number of pixels (for a given camera, of course).
Lossy: During compression a decision is made about which data will be stored and which will not. For example: during JPEG compression a computer algorithm determines which parts of the picture are most likely to be ignored by human perception. Those parts don't make it into the final JPEG. This may be a small or a large part depending on the scene. Consequently, if you redo lossy compression with a single JPEG very often, you will eventually see degradation! But for a single conversion it may be really hard to spot the difference. BTW: all your DVD and Blu-ray media are compressed. You may not even have noticed. It works because human perception is easily tricked into ignoring specific details, especially in video.
ML/Canon camera-specific part: Canon implemented a compression method called LJ92 in your camera. Because compression is a very performance-intensive task, this function is hard-wired and therefore fast. The lossless part is used for CR2 and (optionally) by ML's raw/MLV recording. Lossy compression is used for Canon's JPEG pictures and movie recording in H.264/MOV. (You may say lossy compression is the very reason why Canon's video modes generate sub-par output compared to other cam manufacturers, but you are wrong. The reason is explained in skipping.) ML is able to use lossy compression for raw/MLV recording, too.
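The toy run-length idea from the “Abba” example above can be sketched in a few lines. This is a hypothetical illustration in Python only; real codecs like LJ92, JPEG or H.264 work very differently.

```python
# Toy run-length "compressor": count how often `unit` repeats at the
# start of `text` and emit "<count>*<unit>". Lossless for repeated input.
def rle_compress(text, unit):
    count = 0
    while text.startswith(unit, count * len(unit)):
        count += 1
    return f"{count}*{unit}"

print(rle_compress("Abba Abba Abba Abba ", "Abba "))  # 4*Abba
```

The 20-character input shrinks to 7 characters, and the original can be reconstructed exactly, which is what makes this kind of compression lossless.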
An ML version developed by a dev on their own system/repository.
Such builds do not go through the review process used for nightly builds.
A developer (dev) in the ML microcosmos is a person doing ML programming, porting, and maintaining.
Digital Imaging Integrated Circuit
Canon's designation for the camera's main processing unit. There are several generations, mostly based on the ARM architecture. At the moment Magic Lantern supports EOS cameras hosting DiGiC 4 and DiGiC 5, but not all of them. Canon's current EOS line-up features mostly DiGiC 8 and X (X = 10 in Roman numerals). Some cameras with DiGiC 6 and 7 are listed, too. DiGiC 4 is still in use for entry-level DSLRs like the 2000D/Rebel T7 and 4000D/Rebel T100.
Dual-ISO creates a picture or video frame where alternating sensor pixel lines use different ISO settings. It enhances dynamic range at the cost of detail in highlights and dark areas. Resolution in such areas will be reduced. For Dual-ISO pics/movies, postprocessing is mandatory.
The differences from “normal” HDR, which uses different exposures for consecutive pics or video frames:
Dual-ISO is well suited for fast-moving objects because it uses a single pic/frame. Resolution in highlights and dark areas is reduced, and artefacts/moiré may become visible.
Links to proper documentation and pp tools.
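The alternating-line idea can be sketched as follows. This is a toy illustration in Python with made-up ISO values, not ML's actual Dual-ISO implementation.

```python
# Toy model: in a Dual-ISO frame, even sensor rows use one ISO and
# odd rows use another, trading vertical resolution for dynamic range.
ISO_LOW, ISO_HIGH = 100, 1600  # example values, freely configurable in ML

def row_iso(row_index):
    """Return the ISO used by a given sensor row (even = low, odd = high)."""
    return ISO_LOW if row_index % 2 == 0 else ISO_HIGH

pattern = [row_iso(r) for r in range(6)]
print(pattern)  # [100, 1600, 100, 1600, 100, 1600]
```

Postprocessing then interpolates across neighbouring rows, which is why resolution suffers where only one of the two ISO settings captured usable data.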
Maybe you will be surprised to find your forum post here because you created it elsewhere. What happened? An administrator or moderator judged your question/request to be redundant because the same question was asked and answered before, maybe even as an F.A.Q. item. Most of the time no other indication of what you did wrong will be given besides this magical replacement.
Instead of going ballistic about it, you should thoroughly check the F.A.Q. and/or use the forum search (or an external search engine) to look for an answer. Only if you are absolutely sure you have been mistreated may you want to post a reply explaining your trouble and giving more details.
And don't reply to other people's Duplicate Questions! Just don't!
A Windows utility programmed by user Pelican. It is used to make a card bootable or prepare a card for Canon Basic Scripting.
It was in widespread use by the Canon community some time ago because back then it was the only option to prepare SDXC cards for ML. The modern ML installation method handles those cards, too.
EOScard is still used for Canon Basic Scripting and sometimes to prepare cards for analyzing bricked cams.
The related app for macOS/OS X is MacBoot.
Linux users may use make_bootable.sh.
Expose-to-the-right (ETTR) is a method to enhance detail in dark areas of a picture. Done manually, it means intentionally overexposing images (exposure indicator to the right of center). ML offers Auto-ETTR, where the camera adjusts exposure according to your settings. Of course, the ETTR technique requires exposure correction in postprocessing.
Auto-ETTR requires liveview and works best in mode M.
Some people have the misconception that ETTR intentionally overexposes beyond the sensor's dynamic range, resulting in wide areas being “blown out”. Auto-ETTR indeed has an (adjustable) margin for overexposure, but this is meant for scenes like the sun in a landscape. In most cases you wouldn't want a bright-daylight exposure chosen so that even the sun itself stays properly exposed (not blown out).
Link to Auto-ETTR documentation and credits for ETTR technique creators.
More or less the same as your computer's OS (Windows, macOS, Linux, …), but made for an “embedded system”. Generally speaking, it is not designed to be a platform for additional user-installed software; it is designed to run on specific hardware for a specific purpose. You will find firmware inside many devices you never thought about. Of course your camera runs firmware provided by Canon. But your digital medical thermometer has its own firmware, and your USB stick, too. Canon's firmware allows additional programs to run alongside all the tasks Canon designed. This is the way ML runs, and it is therefore not called firmware but a firmware add-on. See also Firmware update/upgrade/downgrade
Software running on an embedded system without replacing the original firmware.
Full Resolution Silent Pic → Silent Pic
A sensor's ability to capture all pixels at the same time. Canon cameras (and most other consumer cameras) don't have it! See Rolling Shutter
Canon's way of storing video streams to card (MOV files). May also be called “native” recording.
When talking about H.264 here, we are referring to Canon's implementation of the broader H.264 standard and the kind of data manipulation that happens on the way from sensor to final MOV file.
Because said implementation is done in hardware, ML's ability to change anything is limited. See hardwired.
Some functions in Canon cameras are run by dedicated hardware. Dedicated hardware may have a single task which can be performed very fast. There may be only limited ways (or none) to manipulate the inner workings of said hardware.
Such functions are called *hardwired*.
Example: H.264 video encoding. It works only with the given specs and cannot be changed by ML devs.
High Dynamic Range: a method to enhance the cam's limited dynamic range. HDR video in general uses the same technique as HDR photo: the camera shoots several frames with different exposure settings. During postprocessing, all frames belonging together are merged into a single frame.
Consequently, the frame rate of the merged HDR footage = camera frame rate / number of shots with different exposures.
In HDR video, exposure duration (“shutter speed”) and aperture cannot be varied like in HDR photo. The only remaining option to manipulate exposure is ISO.
Things to consider:
Because ML-supported cams have quite a limited maximum frame rate, the number of frames to merge is kept to the minimum of 2. Thus ML HDR video output has half the frame rate used by the cam during recording.
There is software able to restore the original frame rate (as well as possible). For example: Twixtor (payware!).
As in HDR photo mode, scenes with fast movement (= different frame contents) are problematic because merging such frames is prone to generating disturbing artefacts.
Do not confuse HDR video mode with Dual-ISO video! Both serve the same purpose (enhancing dynamic range) but use different techniques.
HDR video uses different exposures in consecutive frames. The resulting frame rate is half of the cam's frame rate. Not well suited for fast-moving scenes.
Dual-ISO does not alter the frame rate because each single frame contains 2 different ISO settings in alternating pixel lines. Vertical resolution is affected, especially in high- and low-exposure areas.
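The frame-rate formula above can be checked with a quick worked example. The function name and numbers below are illustrative only; ML always merges 2 exposures.

```python
# Worked example of: merged HDR frame rate =
#   camera frame rate / number of different exposures.
def merged_fps(camera_fps, num_exposures=2):
    return camera_fps / num_exposures

# Frames (0,1), (2,3), ... are merged into one output frame each.
print(merged_fps(50))  # 25.0 -> a cam recording at 50 fps yields 25 fps HDR video
print(merged_fps(60))  # 30.0
```

This is why ML HDR footage recorded at the cam's normal frame rate plays back at half that rate unless frame-interpolation software is used.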
Lua scripting is the easiest approach to add your own automation tasks to ML.
- Script for shooting a whole set of total-eclipse photos with all the critical phases an astronomer has to capture.
- Script for focus stacking for landscape or architecture, with a real-time calculator and user interface.
The main difference from programming ML (autoexec.bin) and modules: a scripting language doesn't need to be compiled. It runs line by line (simplified) and consists of readable text. And the only tool needed to make such a file is a text editor.
A maintainer in the ML microcosmos is a developer who feels responsible for a particular camera model. Maintaining a cam consists of tasks like programming, testing, quality control, and porting ML to new firmware versions.
Cameras without a maintainer are called blindly supported.
ML needs 3 things to start on a supported cam:
1) During power-up, the bootflag forces the cam to check whether the card is bootable.
→ 2a) If the card is not bootable, the cam starts up with plain Canon firmware. → ✔
→ 2b) If a bootable card is detected, the cam tries to locate ML's “autoexec.bin” file.
→ → 3a) If autoexec.bin is not found, the cam stalls and remains in this state until the battery is removed → ✘
→ → 3b) If autoexec.bin is found, the camera loads autoexec.bin and starts up with Canon firmware + ML features → ✔
By pressing SET during power-up, the cam will start with plain Canon firmware.
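The decision flow above can be sketched as follows. This is a hypothetical Python illustration with made-up names; the real logic lives in the camera's bootloader and firmware.

```python
# Sketch of the ML startup decision flow (illustration only).
def boot_outcome(bootflag_set, card_bootable, autoexec_present, set_pressed=False):
    if not bootflag_set or not card_bootable:
        return "plain Canon firmware"           # step 2a
    if set_pressed:
        return "plain Canon firmware"           # SET held during power-up
    if not autoexec_present:
        return "stalled until battery removed"  # step 3a
    return "Canon firmware + ML features"       # step 3b

print(boot_outcome(True, True, True))  # Canon firmware + ML features
```

Note how the only failure state (3a) is a bootable card without autoexec.bin, which is why deleting ML files from a still-bootable card is a bad idea.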
Magic Lantern Video format.
ML's way of storing raw data streams to storage cards. It is structured to hold essential recording info like shutter speed, aperture, time code, …
MLV requires postprocessing and conversion because it is not widely supported. Adobe and Blackmagic applications (and others) know nothing about MLV files.
MLV is the successor of the outdated format recorded by RAW_REC.mo.
There are two varieties:
Provided by module mlv_rec.mo
Full feature set for best flexibility, reliability and recovery/repair.
Supports audio recording by module mlv_snd.mo
Provided by module mlv_lite.mo
Current version is 1.1
As the name implies, it is derived from “full” MLV. It sacrifices some features for performance reasons.
Some parts of ML's feature set are not loaded by default. To use them you have to access the Modules tab/screen, activate them, and restart the camera. After power-up, the module's menu options are visible in ML's tabs/screens.
Some developers have created custom modules not included in zipped ML builds. You can add modules by placing the module file (*.mo) into the card directory ML\Modules.
There are two reasons to place ML features into modules:
- Placing all functions into one single piece of software bloats memory requirements and may exceed the camera's limit for available memory during startup.
- Modules are somewhat easier to develop, thus lowering the bar for making new features happen in ML.
Well-known ML features coded as modules are: RAW/MLV recording, Dual-ISO, Silent Pics, ETTR, …
In the ML microcosmos, PoC means a specific task/feature has been proven to be feasible.
For example: running ML on a new camera mandatorily requires custom code (like ML) to run along with Canon's firmware. If there is any custom code running on it - no matter if it is as small and insignificant as “Hello, World!” shown on the display - Proof of Concept is established.
It does not mean there is a port in progress or anyone is actually working on it! It simply means that a person with a lot of spare time and some skills should be able to port ML to this camera.
Developer lingo: if a developer wants to add their own changes to ML code, they have to generate a “pull request”. A pull request will be approved by other devs (or not) after review. Approved pull requests get committed.
QEMU (Quick Emulator) is a kind of virtual machine able to emulate a camera running Canon firmware on a PC.
Such an emulator allows developers to test their own software in a safe environment without potential hazard to the cam.
Making a ROM dump work in QEMU is a very early step in porting a camera.
An ML feature allowing movie recording as a raw data stream to the memory card.
Pro: Higher image quality compared to Canon's H.264 implementation.
Cons: Very high data rates. Example: H.264 in 1080p25 in IPB: Around 5.5 MByte/s. MLV in 720p25 uncompressed: Around 38 MByte/s.
Requires postprocessing because of ML's own raw video format → MLV
High processor load may conflict with the use of ML overlays like zebras, focus peaking, histograms, …
Preview in cam will lag (processor load too high).
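The quoted data rates can be sanity-checked with simple arithmetic. The figures below assume 720p at 25 fps and 14-bit raw data; the exact rate in practice varies with the active sensor area.

```python
# Back-of-the-envelope check of the uncompressed MLV data rate quoted above.
width, height = 1280, 720
bits_per_pixel = 14          # Canon raw data is 14-bit
fps = 25

bytes_per_frame = width * height * bits_per_pixel / 8
mbyte_per_s = bytes_per_frame * fps / 2**20  # MiB per second
print(round(mbyte_per_s, 1))  # 38.5, matching the "around 38 MByte/s" above
```

Compare that to roughly 5.5 MByte/s for Canon's H.264 at 1080p25 IPB, and the need for fast cards (and lossless compression) becomes obvious.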
A piece of software able to run on your camera and store internal memory content to a card.
Such a ROM dump may enable a developer to run the camera's software inside a software emulator on a PC. The existence of a particular ROM dumper doesn't say anything about porting progress, or even whether a developer is actually working on that camera.
Symptom: the EOS M can focus via half-pressing the shutter, but full-press won't work. Using the touchscreen may lock up the cam completely, requiring battery removal.
Cause: most likely a so-called “race condition” at the very beginning of the cam's startup process. ATM ML devs cannot fix that.
Cure: None, at least in foreseeable future.
* Twist the lens slightly out of the mount. Power up the cam. Twist the lens back into the mount.
* Use a fast card!
Disclaimer: This entry contains no practical joke.
A serial protocol interface/connector inside your cam. ML devs suppose it was implemented by Canon for diagnostics and for low-level maintenance/repair.
By accessing it, ML devs can read a lot of internal low-level data hidden from the Canon menu. And in the case of very-hard-to-hack cameras, there may be a way to write data, too (for example: enabling the bootflag).
Physically accessing the UART in older cams required opening the cam and maybe even soldering wires to the camera's circuit board. Later cams have a UART connector on the board for easier access.
On some of Canon's DiGiC 7, 8 and X cameras this UART is accessible without opening the camera, by just removing a reusable rubber cover.
Video Anti-Aliasing Filter
Yet Another ML Menu Option. You may get this response after posting a feature request asking for an additional menu option. ML staff are reluctant to add any new menu options because they feel strongly there are already too many, and they want to avoid spreading the disease.