intuitive-revelations:

I’m pretty stunned right now. About an hour ago I saw a YouTube suggestion for this video by Ben Marriott, in which he demonstrates using a program called EbSynth to impose the style of an image onto a video. I’ve recently been really interested in using machine learning and similar tech for editing and restoration purposes (recently even attempting to create my own deepfake video), so when I saw it, I immediately thought, “hey, could this be used for colourisation?”

So I downloaded it and, following a tutorial, quickly plugged in a low-quality clip from the end of An Unearthly Child along with a single colourised screencap by BabelColour, and left it to render for a few minutes (a rough sketch of the prep step is below).

The result is… actually really impressive. In total, it produced eleven and a half seconds of relatively high-quality colourised footage from a single colourised frame. There are some artefacts, naturally, but these could easily be tidied up with some basic post-processing, more keyframes, or just a higher-quality source video. The screencap was even somewhat distorted compared to the clip, as can be seen by comparing the before and after gifs above, yet the software seems to have adapted to this.

Look at it this way: say it could produce five seconds of colourised footage per keyframe (which admittedly might be a bit optimistic for more dynamic scenes). You could then theoretically colourise the entire first episode of An Unearthly Child from about 280 hand-coloured frames, which at 25 fps is only about 11 seconds of hand-coloured footage (the arithmetic is sketched below).

So yeah, I think I might be testing this out some more…
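For anyone wanting to try the same thing: EbSynth works on image sequences rather than video files, so the clip has to be split into individual frames first. Here’s a minimal sketch of that prep step in Python, assuming ffmpeg is installed; the file and folder names are placeholders, not from my actual test.

```python
# Minimal prep sketch: split a clip into the frame sequence EbSynth expects.
# Assumes ffmpeg is on the PATH; file/folder names below are hypothetical.
import subprocess
from pathlib import Path

clip = "unearthly_child_clip.mp4"   # hypothetical source clip
frames_dir = Path("video")          # EbSynth reads frames from a folder
frames_dir.mkdir(exist_ok=True)

# Dump every frame as a zero-padded PNG: video/00001.png, video/00002.png, ...
subprocess.run(
    ["ffmpeg", "-i", clip, str(frames_dir / "%05d.png")],
    check=True,
)
# The colourised screencap then goes in a keyframes folder, named to match
# the frame it corresponds to, and EbSynth propagates its style outward.
```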
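And to make the keyframe arithmetic concrete, here’s a quick back-of-the-envelope sketch. The roughly 23-minute episode runtime, the 25 fps PAL frame rate, and the 5-seconds-per-keyframe coverage figure are the assumptions; everything else follows from them.

```python
# Back-of-the-envelope keyframe budget for colourising one episode.
# Assumptions: ~23-minute episode, 25 fps (PAL), and that EbSynth can
# carry one colourised keyframe across ~5 seconds of footage.

EPISODE_MINUTES = 23      # approximate runtime of An Unearthly Child ep. 1
FPS = 25                  # PAL broadcast frame rate
SECONDS_PER_KEYFRAME = 5  # optimistic coverage per hand-coloured frame

episode_seconds = EPISODE_MINUTES * 60
keyframes_needed = episode_seconds / SECONDS_PER_KEYFRAME
hand_coloured_seconds = keyframes_needed / FPS

print(f"Keyframes to hand-colour: {keyframes_needed:.0f}")   # ~276
print(f"Equivalent footage: {hand_coloured_seconds:.1f} s")  # ~11 s
```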
#bruh #classic who