The Witcher season 3 is set to enter full production imminently, if a new report and a main cast member's social media post are to be believed.
Filming on the next instalment of the hit Netflix show, whose second season only aired in December 2021, will begin any day now in Slovenia, according to The Witcher fansite Redanian Intelligence. And, given that The Witcher's lead actor – Henry Cavill – is on location in the European nation, Redanian Intelligence's report appears to carry a fair amount of weight.
We've known for some time that principal photography on The Witcher season 3 was due to start in early 2022. Showrunner Lauren Schmidt Hissrich exclusively told TechRadar – in December 2021 – that the scripts for the show's third season were almost complete. Meanwhile, Hissrich and other prominent crew members were seen scouting new filming locations in January.
Now, though, Redanian Intelligence claims that the bulk of the TV series' cast and crew have made their way to Northern Slovenia, with Cavill (Geralt of Rivia) and co-star Freya Allen (Ciri) spotted by the local population. Cavill has since taken to Instagram to reveal he's been reunited with Hector – Geralt's new horse – and stunt performer Laszlo Juhasz. It seems, then, that filming will start very, very soon.
Redanian Intelligence's report goes on to suggest that season 3 will be shot in numerous locations, including Kranjska Gora and Postojna. Meanwhile, The Fellowship of Fans Twitter fan account states that Predjama Castle, which is situated near Postojna, will be closed on Tuesday, April 5. Could it be locked down for the day so filming on The Witcher season 3 can take place? It's possible, but unverifiable at this stage.
Interestingly enough, The Witcher season 3's imminent start date comes almost a year to the day since principal photography wrapped on its second season. The TV adaptation's previous instalment finished shooting on April 2, 2021, so it would be a fitting full-circle moment if the third season starts filming around the same date.
If you missed it first time around, check out our spoiler-free review of The Witcher season 2. And, while you're here, read our exclusive interview piece with Hissrich, Cavill, and the rest of the cast on The Witcher season 2's development.
Analysis: Will The Witcher season 3 land on Netflix in 2022?
It's unlikely. Filming on The Witcher seasons 1 and 2 took a long time, so we'd expect principal photography on the show's third season to be another lengthy affair.
The Witcher season 1 took seven months to shoot, with filming beginning in October 2018 and running to May 2019. Owing to the ongoing pandemic, season 2 took even longer – production started in February 2020 and wasn't completed until, as we mentioned earlier, April 2021.
If The Witcher season 3 starts shooting in April 2022, all of its footage should be in the can by late 2022. Even then, though, there'll be a long post-production phase, with VFX, potential pick-ups and reshoots, episodic editing, and other work to complete before the third season is ready to go.
Our opinion? Don't expect The Witcher season 3 to debut on Netflix before early 2023 – and that's looking at things optimistically. We wouldn't be surprised if it doesn't launch until mid-2023, given the sheer amount of work required until it's ready to be released.
Still, we're prepared to wait, as long as the show's third season does justice to Andrzej Sapkowski's fantasy novel series. And, in particular, The Time of Contempt book, which season 3 will largely be based on.
We really enjoyed The Witcher season 2, but there are plenty of fans who weren't happy with the creative choices made throughout its eight-episode run. Hissrich, Cavill and company, then, have to earn that trust back with the show's next instalment.
Apple TV+ Friday Night Baseball curveball start could still lead to a home run
Apple knows. It read your messages on Twitter, saw the articles, and is probably even aware of the unkind things former Mets first baseman and longtime team announcer Keith Hernandez said about the Friday Night Baseball debut.
At least the evening had everything a baseball fan could hope for. In the first game (there are two per night for a total of 50 games this season), the Mets played the Washington Nationals and endured non-working stadium lights, a 9th-inning rain delay, and a bench-clearing near-brawl when Francisco Lindor was beaned by Nationals pitcher Steve Cishek.
That last moment should have been (and still mostly was) the story of that game but all people could talk about on social media was Apple TV+ and how it handled the game.
Apple announced the MLB partnership in March and, perhaps, set itself up with the first streamed game between two eagerly-watched teams with, at least in the Mets' case, intensely devoted fan bases.
Fans, industry watchers, and other play-by-play pros tore into the Apple TV+ team hired to cover the game: Melanie Newman (play-by-play), Chris Young (analyst), Hannah Keyser (analyst), and Brooke Fletcher (reporter). Hernandez quipped, during the next Mets game broadcast that he covered with Gary Cohen, that Mets fans had already had one horrible broadcast experience this season. Granted, this is the same slightly tech-phobic fellow who a week later recounted on air how he was almost taken in by a phone phishing scam.
Other comments complained that the play-by-play team didn't seem to know which plays to emphasize, talked over some of them, and sometimes drifted into topics unrelated to the game or to baseball at all.
The thing is, though, this may all be part of the plan.
Will Apple's changeup work?
Apple is purposely not doing things exactly as they've been done for decades of game broadcasts. It's intentionally widening the diversity and perspective of the typical game announcers, pulling together teams that offer new faces, new demographics, and fresh perspectives.
That can take some getting used to, but Apple – which appears to have grander baseball plans than just this Friday night slot (though we're guessing here) – not only wants the traditional baseball fan to enjoy these games but is also hoping to build the audience beyond the endemic fan base.
Perhaps that's why, despite the strong criticism, Apple is sticking with these game-calling teams. As it listens to and learns from the critical feedback, it will make adjustments, but all the while Apple will still try to manage a difficult balance: satisfying the old guard (some of whom had never tried Apple TV) while welcoming the new. It knows it can't afford to alienate existing fans, but as a tech company, it can't help but innovate America's favorite pastime (by the way, Apple's Friday Night Baseball is also streaming in Canada, Mexico, Australia, South Korea, and Japan – all hubs for baseball fanaticism).
The tech hurdles were real for those who grew up watching games on broadcast TV, where pressing a single number on a remote was enough to bring up the day's game. Apple TV and the original-content platform TV+ were a new frontier for them, and Apple didn't spend any time teaching longtime baseball fans how to access the game.
At least these Friday night games are free for now (no word on when that ends), and if you have an Apple ID you can log into TV+ through Apple TV or a variety of other third-party platforms to watch the games. And really, Mets fans had no choice because, aside from radio, there was no place else to watch that Nationals game (that broadcast blackout will carry through for all 50 Apple TV+ Friday baseball games).
Home run tech
Leaving aside the criticisms and tech frustration, there were some notable Apple touches. Yes, the company splashed its proprietary SF Pro font all over everything to give the proceedings a very Apple feel.
The company also employed some high-end camera tricks, like using the Megalodon camera rig that the NFL and CBS's golf coverage have been using to great effect.
Megalodon, which is not a new camera but a collection of technologies (a Sony a7R IV camera mounted on a DJI Ronin-S gimbal, a 6-inch field monitor, and a backpack to carry external batteries and a 1080p wireless transmitter), creates a recognizable cinematic effect. Players, usually walking on or off the field, are in sharp focus while the rest of the scenery is blurred – a look that instantly raises the drama. One could wonder, though, why Apple isn't using its own iPhone 13 Pro, which also shoots Cinematic video.
Apple is also employing a Phantom camera to capture super-high-frame-rate footage that can then be slowed down to show what's really happening when a pitcher throws a slider: the ball bends down and in, and the batter swings past it.
Plus, if you noticed that the overall game looked just a bit crisper, that could be because Apple is broadcasting in 1080p at 60fps. That's above what you'd get from a typical broadcast or cable game. Sadly, no one is delivering these games in 4K yet.
It's early days in Apple's mad scramble from home to first, as it attempts to promote the heck out of these games on Apple TV (the app), TV+, and even in Apple News. It might garner more eyeballs this way, but ultimately, it has to win over baseball fans. It failed to reach first this inning, but there are 50 more this season and, potentially, a long MLB partnership ahead of it to work out a run.
Canon EOS R3’s Eye Control AF could be the future of autofocus, but it needs work
When it comes to autofocus precision, Sony reigned supreme for years… until Canon caught up with the competition in 2020 with the launch of the EOS 1D X Mark III. That, however, is a professional sports and photojournalism camera, and it's to be expected that the AF system is top priority.
But Canon quickly followed it up with the same superb AF system in the EOS R5 and R6 bodies, opening it up to the prosumer and enthusiast markets as well. This re-established Canon as a major player in the mirrorless world, following the uninspiring original EOS R.
Canon didn’t stop there, adding another layer of autofocus prowess in the form of Eye Control AF in the EOS R3. We say ‘layer’ because using it is optional and it works in tandem with the existing AF system.
So if there’s already a tried-and-tested AF system that’s practically perfect, do we really need another? Perhaps not, especially since most pro photographers are set in their ways and adjusting to a new system could be rather frustrating, leading to missed photo opportunities.
However, if given the option of using a system that promises to be really quick and easy (just look at something and, bam, it's locked in), wouldn't you want to use it?
While Eye Control AF promises just that, Canon listed so many caveats regarding it during the camera’s launch announcement last year that it came across as a marketing gimmick.
So I just had to give it a try and was pleasantly surprised. It works… it’s just not quite perfect yet, as it’s very dependent on a lot of factors.
What exactly is Canon’s Eye Control AF?
Simply put, this AF system lets you select subjects just by looking through the viewfinder, where an array of eight low-power infrared LEDs tracks the movement of your eyeball. That information is then mapped to the frame, allowing the camera to automatically focus on whatever you’re looking at. It can be as precise as picking out an athlete’s or animal’s eye (if you’re close enough).
Eye Control AF isn’t exactly new. It’s a revision of the eye-controlled autofocus that Canon used in some of its 35mm SLRs in the 1990s. Back then, there were far fewer AF points on its film cameras, which made the system quite reliable, not to mention that it made AF point selection quicker and easier.
Now, however, on a mirrorless camera, the entire frame is available for autofocus and the system’s algorithms are heavily dependent on subject recognition. The latter is what makes the EOS R3’s Eye Control AF so novel.
The EOS R3 is the first Canon camera to come with motor vehicle detection (alongside people and animals), so you can set the AF to pick out a particular type of subject and the camera manages remarkably well to ignore anything else that might happen to be in the frame.
Using Eye Control AF in the real world
As we mentioned in our Canon EOS R3 review, Eye Control AF “feels virtually magical”. You just need to have it calibrated properly and be ready to recalibrate in a heartbeat if you find it’s not able to pick the subject you want (or are actually looking at). And that’s because the smallest change in external factors can affect subject selection, whether it’s the size of your iris changing or the ambient light. Even how you hold the camera can affect Eye Control AF, as the slightest change in angle will require recalibration.
If your calibration is spot on, Eye Control AF is more than capable of locking onto a subject and tracking it… provided you can keep up with it yourself. See where we’re going with the caveats?
Calibration, however, takes no time at all. You just have to select the Eye Control AF setting in the menu system, select Calibration, look through the EVF and press the M-Fn button. Follow the instructions you see through the viewfinder and, voila, you’re set. This process takes no more than 10-15 seconds for each calibration.
There are six custom calibrations that you can save, and you can even refine them by repeating the calibration process (as many times as you want) – the more you refine it, the better it will be. However, when you’re out in the field and the moment for the perfect shot is fleeting, using Eye Control AF could be hit or miss, particularly outdoors where lighting conditions are constantly changing.
I spent some of my time with the EOS R3 outdoors shooting birds as local sports had practically come to a standstill due to the ongoing pandemic. As long as the lighting conditions stayed static, Eye Control AF worked a charm. But when a heavy cloud passed across the sun, I lost that precision and had to recalibrate the AF system.
I even spent time shooting in low-light conditions (at a Christmas lights display in December 2021). And while Eye Control AF managed to work most of the time even with flickering LEDs, the change in lighting conditions did cause issues.
Canon was quick to let us know that Eye Control AF could be temperamental if you wear glasses or contact lenses… and that’s true enough. As someone who uses both, I had trouble finding the right focus point when wearing glasses, despite repeated recalibrations – although, to be fair, there were occasions when the camera got it just right. It’s better when you’re wearing contact lenses – definitely more hit than miss in that case. That said, the AF system performed best without either (thank goodness for EVF diopters!).
For those of us who look through the viewfinder with one eye closed (as I do), minuscule changes in eye size – which can happen without you even realizing it – can also affect AF point selection and recalibration is not going to help here. I found that keeping both eyes open is the only way around this issue but, naturally, that took some getting used to.
Long story short, Eye Control AF works – it does what it says on the tin, making AF point selection faster – but there are several factors that can affect it. The EOS R3 is being touted to sports photographers and photojournalists, for whom turnaround is so quick that stopping even for 20 seconds to recalibrate for a change in ambient conditions is not usually possible. Missed opportunities can mean lost money.
Here's the good thing about the EOS R3: the default AF reticle never disappears and is always spot on. So even if the circular bullseye of the Eye Control AF is off the mark, your shots are still going to be good. In fact, you can set up a custom AF zone to match the size of your potential subject – provided you know what you're shooting – and that should also make it easy to keep focus locked on. Which then begs the question: do we really need Eye Control AF?
I'd still argue that yes, we do. The system works and could make finding and tracking subjects easier, but it's not quite perfect yet and needs to be more reliable. Perhaps one way of improving it would be an algorithm that corrects for different eye conditions (varying eye size, spectacle wearers, etc).
Another option would be to perhaps add the ability to save more custom calibration settings so you're prepared for any eventuality. Software aside, perhaps a physical calibration button on the camera body could make on-spot recalibrations quicker than having to delve into the menu system (it might be possible to customize one of the existing control buttons, but I didn't get the opportunity to test that).
In its current form, Eye Control AF is a great option to have on board, albeit one that's still quite novel and requires a little patience to get to grips with. It needs to evolve a little more.