SMAF-NewsBot

Author: SMAdvancedForum-News


Description

Latest bot-read news headlines about the streaming and online video and audio delivery industry.
28 Episodes
The weak gravitational pull on a particle just half the mass of a grain of sand has been measured for the first time. This most precise measurement of its kind is a breakthrough towards the quantum realm and a potential Theory of Everything.

Of the universe's four fundamental forces - gravity, electromagnetism, and the strong and weak nuclear forces - gravity is the one we're most familiar with in everyday life, but it's actually the only one that we can't currently explain using the Standard Model of particle physics, our best system for describing the universe. Finding a way to jam it in there would achieve a Holy Grail of science - a Theory of Everything.

If a quantum theory of gravity exists, clues will be hiding on the tiniest scales, in gravitational interactions between atoms and particles. The problem is, those tiny interactions are washed out by the immense gravitational influence of the Earth. It would be like trying to record the sound of a bug's footsteps under an idling jet engine. If you were trying to measure electromagnetism between particles, you could set up a box that blocks all outside interference, but you can't do that with gravity.

But now scientists have developed a new type of experiment that can cancel out the Earth's pull to reveal gravitational interactions between small objects. The trick is to levitate a magnetic particle in a superconducting trap, isolate it all from external electromagnetism, heat and vibration, and swing a 2.4-kg (5.3-lb) weight on a wheel past it to see if the particle moves. And sure enough, the team measured a weak gravitational pull of just 30 attonewtons (aN) acting on this particle at points that corresponded to when the larger weight was closest to it. Weighing just 0.43 milligrams, the particle is easily the smallest mass for which gravity has been measured so far. The previous record was 90 milligrams - about the mass of a ladybug. Another recent study measured the difference in the passage of time, due to differences in gravity, across the small distance of just 1 mm.

This minuscule measurement inches the world closer to the quantum realm. If gravity can be measured on objects that tiny, scientists might finally be able to start incorporating this strange force into our models of the universe and build a proper Theory of Everything.

"For a century, scientists have tried and failed to understand how gravity and quantum mechanics work together," said Tim Fuchs, lead author of the study. "Now we have successfully measured gravitational signals at the smallest mass ever recorded, it means we are one step closer to finally realizing how it works in tandem. From here we will start scaling the source down using this technique until we reach the quantum world on both sides. By understanding quantum gravity, we could solve some of the mysteries of our universe - like how it began, what happens inside black holes, or uniting all forces into one big theory."

The research was published in the journal Science Advances. Source: University of Southampton
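To put those numbers in perspective, Newton's law F = G·m1·m2/r² gives a quick consistency check. Below is a minimal sketch in Python, assuming a naive point-mass geometry (an assumption for illustration only; the real experiment swings the source mass past the trapped particle on a wheel, so the actual separations and geometry differ):

```python
# Sanity check with Newton's law, F = G * m1 * m2 / r^2.
# Point-mass geometry is a simplification, not the experiment's setup.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
m_source = 2.4         # kg, the swinging weight
m_particle = 0.43e-6   # kg (0.43 milligrams)

# Newtonian pull at an assumed 1 m separation (illustrative, not from the study)
F_1m = G * m_source * m_particle / 1.0**2
print(f"force at 1 m: {F_1m / 1e-18:.0f} aN")        # ~69 aN, same order as measured

# Separation a point-mass model would need to produce the measured 30 aN
r = (G * m_source * m_particle / 30e-18) ** 0.5
print(f"separation implied by 30 aN: {r:.2f} m")     # ~1.5 m
```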
February 27, 2024 | 6 min read

Schrödinger's Pendulum Experiment Will Search for the Quantum Limit
Physicists seek the dividing line between the quantum world and the classical one
By Tim Folger, edited by Clara Moskowitz

There's a rift in reality - an invisible border that separates two utterly different realms. On one side lies our everyday world, where things obey commonsense rules: objects never occupy more than one place at a time, and they exist even when we're not looking at them. The other side is the dreamscape of quantum mechanics, where nothing is fixed, uncertainty reigns and a single atom or molecule can be in multiple places simultaneously, at least while no one is watching. Does that mean reality has one set of laws for the macrocosm and another for the micro? Most physicists instinctively dislike the idea of a bifurcated universe.

Sougato Bose, a theorist at University College London (UCL), certainly does. "My view is that quantum mechanics hasn't been seen [at large scales] because we have not yet been able to isolate things well enough," he says, meaning that researchers haven't found a way to shield big objects from their environment so that their quantum properties become apparent. Like most physicists, Bose believes that quantum mechanics applies to all things great and small. He and three colleagues - two in the U.K. and one in India - hope to put that view to a stringent test within the next year or two with an intriguing experiment that ultimately aims to determine whether or not large objects obey the strange rules of quantum theory.

The experiment, described in a recent issue of Physical Review Letters, harks back to a conundrum vividly framed nearly a century ago by Erwin Schrödinger, one of the founders of quantum mechanics. What would happen, Schrödinger asked, to a cat trapped in a closed box with a vial of poison that has a 50-50 chance of shattering and killing the cat? According to quantum mechanics the cat is at once alive and dead, existing in both states until someone opens the box and looks inside. That's because it's only when an observer makes a measurement of the system - opens the box and checks - that the two possibilities have to collapse into one, according to quantum theory. The story is meant to illustrate how applying these quantum rules to big things - basically, anything visible to the naked eye - leads to absurdities.

So if quantum mechanics is true - and it has been a phenomenally successful theory for predicting the behavior of particles - why do we never see cats that are both dead and alive? Do the laws of quantum mechanics break down at a certain level? Some physicists see that as a possibility. But most would argue that the apparent absence of quantum effects in our own experience of the world arises because the countless interactions of atoms with the surrounding environment blur the true nature of things. As a result we perceive a kind of dumbed-down, nonquantum version of reality. If that's the case, then a carefully designed experiment that isolates an object from nearly everything in its environment should allow physicists to glimpse the actual quantum behavior of that object, even if it's relatively large.
That's the goal of the experiment proposed by Bose, Debarshi Das, also of UCL, Hendrik Ulbricht of the University of Southampton in England and Dipankar Home of the Bose Institute in India. "There are two possible outcomes," Home says. "One is that quantum mechanics is valid [at all scales. The other is that] there is a region where quantum mechanics does not hold." Most of the hardware needed for the experiment is already in place and fits on a tabletop in Ulbricht's lab. (He's the lone experimenta...
Imagine something as tall as New York's Chrysler Building, but spinning. China's Mingyang Smart Energy has announced plans for a colossal 22-megawatt offshore wind turbine, and standing in its presence will be an unprecedented human experience.

The feats of engineering in offshore wind are becoming almost comical in scale, for a simple reason: the amount of energy you can extract from a turbine depends mostly on its swept area. The bigger that swept circle gets, the more energy you can harvest - but also, the greater the bonus becomes for adding more length. Put it this way: if your turbine has a 20-meter (65.6-ft) diameter, and you add one further meter (3.3 ft) to that diameter, you gain around 32 square meters (about 350 sq ft) of additional swept area. But if your turbine starts with a 50-m (164-ft) diameter, adding one extra meter of diameter brings in about 79 extra square meters (850 sq ft) of swept area, since that extra blade length is sweeping a bigger circle.

What's more, these huge offshore turbines are extremely expensive to install, and the economics of deployment and grid connection tend to favor fewer, larger turbines over many smaller ones. Thus, the sheer size of these things is getting absolutely nutty. The H260-18MW turbine currently under construction by CSSC uses 128-m (420-ft) blades for a ridiculous 260-m (853-ft) diameter and a 53,000-sq-m (570,490-sq-ft) swept area. That's 9.9 NFL football fields or 42.4 Olympic swimming pools when converted to standard journalistic units - ignoring the small area left unswept by its hub. MingYang's own typhoon-proof MYSE 18.X-28X will use 140-m (459-ft) blades for a swept area of 66,052 sq m (711,000 sq ft, 12.3 NFL fields, 52.8 Olympic swimming pools) - again, minus the hub area.

The new turbine proposed for 2025 by MingYang, according to Bloomberg, will have a peak output of 22 MW and a rotor diameter over 310 m (1,017 ft), corresponding to a swept area of at least 75,477 sq m (812,425 sq ft, 14.1 NFL football fields, 60 Olympic swimming pools), minus hub. Add on a little clearance to make sure the blade tips stay out of the water, and you'll probably be looking at something taller than New York's 319-m (1,047-ft), 77-story Chrysler Building, or the 324-m (1,063-ft) Eiffel Tower in Paris - but spinning.

I don't have an imagination capable of picturing just how awe-inspiring a machine like this would be at close range. Indeed, these will be some of the largest moving parts ever built. Can you think of anything else with visible moving parts this big? Nothing in the mining mega-machine category comes close, and while the 27-km (16.6-mile) circumference of the Large Hadron Collider holds the title of the world's largest machine overall, it's hidden underground, and particle acceleration isn't exactly a spectator sport. So taking a boat ride past these mammoth offshore wind turbines will be pretty much an unprecedented human experience. It'll be breathtaking. Sign me up! Source: Bloomberg
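The swept-area arithmetic above is easy to reproduce. Here is a short sketch using the diameters quoted in the article (hub area ignored, as the article does):

```python
import math

def swept_area(diameter_m: float) -> float:
    """Disc area swept by the rotor, ignoring the hub."""
    return math.pi * (diameter_m / 2.0) ** 2

# Marginal gain from one extra metre of diameter at the sizes quoted above
for d in (20, 50, 260, 290, 310):
    gain = swept_area(d + 1) - swept_area(d)
    print(f"d = {d:3d} m: area = {swept_area(d):8.0f} m^2, "
          f"+1 m of diameter adds {gain:.0f} m^2")
```

The marginal gain grows linearly with diameter (it works out to pi*(2d+1)/4), which is why each extra meter of blade is worth more on a bigger rotor.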
When British prime minister Rishi Sunak appeared in front of the hastily assembled press on September 20, the letter-crammed slogan on his lectern caused the country to squint: "Long-term decisions for a brighter future," it read. We now know, of course, there was little in the speech that followed that brought hope. Certainly not concerning the technological fight for our future climate.

Not that long ago, the UK seemed a rather brighter beacon in the industrial transition toward reversing the global climate breakdown. The countrywide goals were laid out. COP26 at least offered a forum and a spotlight. London has made strides in establishing itself as a hub for green tech startups. On the narrow but viable path toward net zero, leaders were at least taking the right steps.

Then came the nadir of the last few weeks. Last week, with the government's already infamous U-turn on its green pledges, the nation joined a consternated chorus of global leaders lamenting the prime minister's short-sighted choice. Sunak has pushed back the British net-zero transition timeline by at least five years. The first and most galling concern is, obviously, the consequences for the future of our species on this planet. The next biggest issue, currently being voiced by leaders across industries and especially within the climate tech and climate finance sectors, is the message it sends to those of us in the trenches actually trying to build technology to change the world and enable a sustainable future. That message is loud and clear: The U.K. government isn't willing to be consistent when it comes to climate crisis response policy, which, aside from capital and the support of nascent tech markets, is one of the most critical things anyone in our sector can hope for.

I care about this because, as Americans who chose to build a biocatalyst engineering company here in the U.K., we're acutely aware of the impact such a reversal of policy has on every stage of our sector's existence. All major technological innovation ultimately comes from government support at the very beginning. We wouldn't have affordable solar panels, microchips, mobile phones or the internet without government funding, government subsidies, government encouragement and government infrastructure. You can't scale technology that is going to make a considerable impact without upfront capital to match.

In January of this year, Chancellor Jeremy Hunt unveiled a long-term vision to grow the economy, saying, "I want the world's tech entrepreneurs, life science innovators and clean energy companies to come to the U.K. because it offers the best possible place to make their vision happen." Unless his long-term vision was only meant to last until the end of the summer, for entrepreneurs, innovators and businesses to thrive and unlock the economic potential that comes from creating new industries, we need a consistent approach from the government.

We have enormous fiscal potential. Within the U.K. climate tech community, we are working to create high-paying jobs and value for investors across just about every asset class. And collectively - hell, individually - our solutions could genuinely change the world. Of course, this is the common cause we should be united around. Our company is trying to move industry away from a reliance on fossil fuels to make chemicals, among other things.
But the impact that these political spasms and contradictions in our climate commitments have on, say, an EV battery business - which has just seen the market demand for its output slide down the road by five years - isn't hard to appreciate. If giants like Ford are feeling the frustration, imagine the mood at a small green tech startup. So what's the play? How can the government support those striving against an increasingly insecure (and insincere) backdrop? The response of any tech company impacted by last week's news surely has to be this. First, we need a consistent macroeconomic policy. This has a ...
Humans are facing an existential crisis in climate change. We are also facing a crisis of collective action. As a species, we have every reason to slow the rise of global temperatures, but taking steps to cut carbon emissions is generally not in the short-term interest of individuals, companies, or countries.

Where does that leave IT organizations? IT systems all around the world consume ever-increasing amounts of electric power, making them a significant contributor to rising carbon emissions. Many people in the industry are acutely aware of IT's climate impact and want to see it reduced, but minimizing IT's carbon footprint will entail a cost that many small businesses and multinational corporations are reluctant to bear. Curious about what might incentivize a shift to greener tech, I spoke to IT leaders who are pushing back on climate change. I found people working at every level of organizational leadership - from the top down to the bottom up - and pursuing a variety of strategies to reduce the carbon footprint of company products and business models.

Data drives climate change - and solutions

When asked what drives IT's carbon emissions, most respondents pointed to data. In particular, the rising popularity of data lakes and the data centers that store them is a huge contributor to the problem. Given the primacy of data for modern businesses, companies that want to reduce their carbon footprint will have to make hard choices. "Companies would have to stop collecting a lot of (poor) data and storing it," says Chrissy Kidd, SEO manager at Splunk, which helps users sort through massive machine-generated data sets. "They won't do this because they're married to the idea that they are 'data driven' organizations, when most of them are not. We're also living in a data ecosystem, where everything is based on collecting and storing data, even when only an estimated 10% of that data gets 'used' beyond simple storage. Until we have less data to store, seemingly forever, IT companies will continue to emit more carbon than not," she said.

The explosion of storage and its emissions in recent years was driven not only by data's usefulness (real or perceived), but by a fundamental shift in underlying economic factors. "In older models, storage was one of the most expensive components of a system, so we were very selective in what data was stored and how," says George Burns III, a senior consultant for cloud operations at SPR, a technology modernization firm. Burns notes that today, "the opposite is true, in that storage is often the least expensive component of a system, which has led many organizations to adopt a 'store everything forever' mentality."

The most straightforward way to reduce data center emissions is to power those data centers with clean energy. This can turn out to be a quick win for companies looking to burnish their green credentials. As the cost of renewables continues to drop, it is also becoming a relatively inexpensive fix. "Customers of corporate colocation data centers are increasingly seeking more sustainable energy supplies, which thanks to recent progress they will be able to access more and more," says Chris Pennington, director of energy and sustainability at Iron Mountain. "Operators in our industry, Iron Mountain amongst them, have proven that renewables are a reliable and cost-effective energy source by activating innovative procurement solutions, and it is making clean energy more accessible to all."
Solving IT's data problem with data

A slew of companies are now trying to solve the data problem with data - that is, by using data analytics and other IT techniques to reduce the amount of stored data. For instance, Moogsoft, the developer of an AIOps incident management platform, uses machine learning algorithms to try to reduce the amount of data at rest and in motion on customer infrastructure. While this functionality has always been part of Moogsoft's pitch, company CTO Richard Whitehead says he's seen...
Flocks of seagulls have one of the most ruthlessly efficient hunting methods in nature. Network designers could learn a lot from them.

In the fabric of life, Mother Nature is a starkly efficient seamstress, able to weave together seemingly random strands of human, animal, and plant behavior into orderly patterns, loose threads forming a tight-knit tapestry that can withstand the eternal test of time. And as humankind has evolved into Renaissance beings seeking perfection in the disciplines of art and science, we've returned to our roots, extracting inspiration from nature's designs. We have engineered materials that are sturdier than ever, modeled after the oozing networks of the humble slime mold. And locomotive robots, propelled by the squishing trudge motions of an earthworm. And symmetric algorithms, which mimic the way shapes like snowflakes and sunflowers bloom spontaneously.

Now, a team of researchers from the United Kingdom, China, and Austria is looking to use the habits of seagulls to build better cloud computing systems. In a paper published in Internet of Things and Cyber-Physical Systems, a journal from KeAi, which was founded in a partnership between Elsevier and China Science Publishing & Media, the researchers argue that using a "seagull optimization algorithm" - a so-called meta-heuristic algorithm that mimics the hunting and migration behavior of seagulls - can make cloud computing more energy efficient, cutting its power consumption by 5.5% and lightening its network traffic by 70%.

Their study attacks the problem of connecting cloud computing's virtual machines to physical ones, as near-infinite streams of data must all flow through physical supercomputers that process each workload. They often converge in sprawling edge-cloud data centers run by industry giants, which are strategically scattered across the globe, because being geographically close to internet consumers who are using the cloud lets the digital signals buzz back and forth at a faster rate. These data centers are huge energy gobblers. In 2017, they ate up 416 billion kilowatt-hours of power, or about 2% of the world's total electricity use.

However, the researchers say that the way seagulls behave when they're on the hunt for food or prey is one of the most ruthlessly efficient examples of an entity zeroing in on its target, with minimal excess energy expenditure. They suggest using an algorithm that imitates the habits of food-hunting seagulls to determine where to place virtual machines - which, in cloud computing, are like nodes that route multiple data workflows to the same physical supercomputers - within a network of server communications. The idea is to make the network's traffic as efficient (and thus as light) as possible.

So, if the supercomputers are like seagull food - hapless freshwater fish or shrimp that exist in limited numbers, and swim only in certain parts of the ocean - then the virtual machines are like the stalking birds of prey, eyeing their precious objective (which is a connection to a nearby physical machine), and mapping a path to it with the blistering urgency of a hungry carnivore. Seagulls do this while avoiding collisions with other seagulls, and their migration flight path over time naturally gravitates toward that of the fittest bird among them.
When they lock onto a target and dive in for the kill, they vary their angle and speed deftly, resulting in a sort of downward spiral—calculated by study authors as a formula of cosines—that can predictably describe a striking seagull at any given time, relative to other seagulls in the area. It’s not the first time that the biological genius of seagulls has been noted. In 2019, a paper in Knowledge-Based Systems outlined how the seagull-optimization algorithm could mathematically model solutions for stronger const...
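The update rules behind the seagull optimization algorithm are compact enough to sketch. Below is a minimal Python/NumPy rendering of the commonly published form of the equations from that 2019 Knowledge-Based Systems paper: a migration step that avoids collisions and drifts toward the fittest bird, followed by a cosine-based spiral attack. The flock size, iteration count, fc = 2 control parameter, and u = v = 1 spiral coefficients are conventional defaults, not values taken from the cloud-placement study described here:

```python
import numpy as np

def seagull_optimize(f, dim, lo, hi, n_birds=30, iters=200, fc=2.0, seed=0):
    """Minimise f over the box [lo, hi]^dim with the seagull optimization
    algorithm (after Dhiman & Kumar, Knowledge-Based Systems, 2019)."""
    rng = np.random.default_rng(seed)
    flock = rng.uniform(lo, hi, size=(n_birds, dim))  # seagull positions
    best = min(flock, key=f).copy()                   # fittest bird so far

    for t in range(iters):
        A = fc - t * (fc / iters)  # decays 2 -> 0: explore first, exploit later
        for i in range(n_birds):
            # Migration: avoid collisions (C), drift toward the best bird (M)
            C = A * flock[i]
            M = 2.0 * A**2 * rng.random() * (best - flock[i])
            D = np.abs(C + M)
            # Attack: spiral dive (r = u * e^(k*v), with u = v = 1) around the best
            k = rng.uniform(0.0, 2.0 * np.pi)
            r = np.exp(k)
            x, y, z = r * np.cos(k), r * np.sin(k), r * k
            flock[i] = np.clip(D * x * y * z + best, lo, hi)
        challenger = min(flock, key=f)
        if f(challenger) < f(best):
            best = challenger.copy()
    return best, f(best)

# Toy run on the sphere function, whose minimum is at the origin
pos, val = seagull_optimize(lambda p: float(np.sum(p * p)), dim=5, lo=-10, hi=10)
print(pos, val)
```

In the VM-placement setting described above, f would score a candidate assignment of virtual machines to physical hosts by its network traffic and power cost, rather than a simple test function.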
Mark Bjornsgaard is getting used to the attention. "Ten days ago, I wasn't," says the founder of Deep Green, referring to the unprecedented publicity his firm achieved by installing one of its digital boilers to heat a swimming pool in Exmouth, Devon. Something about using the waste heat generated from servers - a resource that apparently few people knew existed - seemed to capture the imagination of the British public. Since then, Bjornsgaard has been inundated with requests from other swimming pools for more information about its digital boilers, as well as from local government leaders eager to learn how these pint-sized data centres might be used in their own public heating projects. "From a business point of view, it's been absolutely amazing, because we are now walking into conversations which we didn't think we'd have for a couple of years," says Bjornsgaard.

It's also a sign of the times. Amid unprecedented energy insecurity, thanks in large part to Russia's invasion of Ukraine, anxiety about rising electricity prices is at an all-time high across Europe. As well as sparking a conversation about national reliance on natural gas, attention has drifted toward how energy might be saved or put to better use - not least in a cloud computing infrastructure that generates more carbon emissions than the aviation industry, thanks in part to its elaborate and power-hungry cooling systems. All that electricity and heat energy could, critics argue, be put to much better use outside the data centre and for the benefit of society at large.

Going by the law of physics that energy is never lost but merely changes form, it is conceivable that the many gigawatts of electricity used to power data centres could, if left uncooled, be converted into heat to warm thousands of homes and businesses. That potential has seen Norway and Germany pass laws mandating that data centre operators have a plan - any plan - for heat reuse on their premises, while similar projects have seen the cloud warm greenhouses, fish farms, homes and offices. "And that's good," says Bjornsgaard, who professes a great belief in practical green energy solutions. When push comes to shove, he argues, "we can't muck about for the next ten years with space lasers and cracking hydrogen and trying all these elaborate [projects] that we love to engineer."

The mechanics of heat reuse

Servers get very hot, very quickly - so much so that your common variety rack would probably catch fire were it not artificially cooled. This inevitably compromises the overall electrical efficiency of the data centre, calculated using a ratio called Power Usage Effectiveness (PUE). The more power that can be used for computation rather than cooling, the better. Even with the most efficient cooling systems, however, the average PUE score for most data centres remained at an uncomfortably high 1.57 throughout 2021 - meaning that more than a third of the power running through these facilities went to cooling and other overhead. Reusing waste heat would, in theory, drag those numbers down, make those data centres more sustainable and contribute heat and warmth to local communities in the bargain. Such has been the case in Dublin, where AWS is participating in a community heating scheme, and in Stockholm, where one data centre operator has set the goal of using its waste heat energy to heat 10% of the city by 2035.
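The PUE arithmetic is worth making explicit: PUE is total facility power divided by the power reaching the IT equipment, so the share lost to cooling and other overhead is 1 - 1/PUE. A one-liner confirms that the 1.57 average implies roughly 36% overhead:

```python
def overhead_share(pue: float) -> float:
    """PUE = total facility power / IT power, so the fraction of facility
    power that never reaches the servers is 1 - 1/PUE."""
    return 1.0 - 1.0 / pue

print(f"PUE 1.57 -> {overhead_share(1.57):.0%} overhead")  # ~36%
```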
The scale of these schemes, however, is also a reminder that many of these projects usually only come about through massive capital investment - not least because most of them reuse heat from air-cooled data centres using expensive heat pumps. An alternative lies in immersion cooling for edge high-performance compute units, wherein the server is surrounded by a liquid - usually water, or oil - and then wrapped in copper piping to build a heat exchanger. This also describes Deep Green's first, crude prototype digital boiler, which consisted of little...
Initiative Establishes New Industry Record For Speed, Power and TCO Efficiency For TLS-Encrypted Traffic, Paving Way For Next-Gen Live Streaming

Los Angeles, CA - February 22, 2023 - Varnish Software, a leader in web caching, video streaming and content delivery software solutions, in collaboration with Intel and Supermicro, today announced new, record-setting content delivery performance milestones, having achieved greater than 1.3 Tbps throughput on a single edge server consuming approximately 1,120 watts, resulting in 1.17 Gbps per watt. The combined solution makes it possible for content delivery services and the live event industry to support massive live streaming events in an economical and sustainable way.

"Achieving over 1 Tbps in a single edge server is a major leap forward for the industry, and critical for delivering the next generation of video and digital experiences," said Frank Miller, CTO, Varnish Software. "The need to deliver more throughput with less energy and at the lowest cost is growing exponentially. With commercially available software and off-the-shelf server hardware from Supermicro - built on 4th Gen Intel Xeon Scalable processors - we have entered a new era of CDN cache performance. Varnish Software's unique architecture, features and capabilities were essential in reaching the new benchmarks, which include asynchronous direct I/O, NUMA awareness and software-based TLS."

The benchmarks were accomplished using Varnish Enterprise 6.0 deployed on a Supermicro 2U CloudDC server powered by 4th Gen Intel Xeon Scalable processors, without requiring the use of specialized, added-cost TLS offload cards. Supermicro's CloudDC server line is an optimized platform targeting private and public clouds, offered in 1U and 2U form factors in single- or dual-processor configurations. These servers are balanced among processor, memory, storage, expansion, and networking for the best efficiency. For expansion, each of these servers offers a variety of PCI-Express (PCIe) Gen 5 x8 and x16 slots for the latest PCIe cards. CloudDC is a well-rounded server line that offers the best cost-to-performance ratio.

"We deliver first-to-market innovations and IT solutions that are environmentally friendly and fit every organization's objectives and budget," said Michael McNerney, vice president, Marketing and Network Security, Supermicro. "The collaboration with Intel and Varnish Software is an example of how we are working closely with best-in-class technology partners to deliver the latest generation of cutting-edge solutions, specifically in the video streaming and CDN space."

Importantly, the throughput and energy efficiencies achieved with this benchmark can be applied to a broad range of servers depending on customer requirements. Varnish looks forward to working with key partners Intel and Supermicro on solutions that support a wide range of video and content delivery workloads leveraging cost-effective system footprints and energy efficiency. Parties interested in learning more can contact Varnish directly or schedule a meeting with Varnish at MWC in Barcelona, February 27 - March 2, 2023.

Varnish Software is the leading caching, streaming, and content delivery software stack.
Our software helps content providers of any size deliver lightning-fast, reliable, and high-quality web and streaming experiences for huge audiences. With over 10 million deployments, our technology is relied on by millions of websites worldwide across every industry, including Hulu, Emirates, and Tesla. Varnish Software has offices in Los Angeles, New York, London, Tokyo, Singapore, Stockholm, Oslo, Karlstad, Düsseldorf, and Paris.
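The headline efficiency figure can be checked from the release's own numbers. A tiny sketch (1.3 Tbps is the rounded "greater than" figure, so the quoted 1.17 Gbps per watt implies throughput slightly above it):

```python
throughput_gbps = 1300.0  # "greater than 1.3 Tbps" on one edge server
power_watts = 1120.0      # approximate draw reported for the benchmark run
print(f"{throughput_gbps / power_watts:.2f} Gbps per watt")  # ~1.16
```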
Green streaming is a concept that refers to industry and consumer efforts to reduce the energy impact of streaming by balancing quality and energy efficiency. Today, the energy consumption of streaming infrastructures is poorly understood, despite growing pressure from consumers and regulators that all industries make sustainability a business imperative. STL Partners spoke to Dom Robinson, founder of Greening of Streaming, an organisation focused on driving industry collaboration to reduce the carbon impact of streaming, to understand where the industry is now and how it can drive change in the future.

How was Greening of Streaming born?

Conversations about green streaming and the carbon impact of video began to emerge in 2018. These discussions prompted me to write an article on 'greening of streaming' which attracted interest from across the global internet streaming industry and led to the creation of the Greening of Streaming members' association. Today we capture ~70% of internet traffic in Europe and North America through our members, including Intel, Varnish, Akamai, and Lumen.

What is the energy impact of streaming; is streaming sustainable?

Streaming infrastructures are technically complex and involve many different partners, which means it is difficult to establish how much energy is being used and who 'owns' the energy consumption at any given stage. The diagram below shows a simplified version of the stages of the streaming process. Each of these stages involves core access and termination, switches, amps, routers and more - many thousands of components that all consume energy. You can find a more detailed version of the diagram produced by Greening of Streaming here. Given that the volume and resolution of streamed content will continue to grow, streaming businesses, telcos, content delivery networks, and other partners in the value chain need to act now. Increasingly, streaming is the tail wagging the dog on this issue, with streaming businesses placing requirements on the content delivery infrastructure, including telcos, to disclose their energy impact and take steps to reduce consumption.

Figure 1: Streaming infrastructure (simplified). Source: Greening of Streaming, STL Partners

But streaming must be more energy efficient than playing a CD or watching TV using a traditional cable box, right?

For consumers, streamed content might seem more energy efficient than satellite TV or other traditional ways to consume media, but it isn't; it just makes the energy footprint less visible. Rather than paying the energy cost of a satellite or cable connection yourself, the cost has been shifted onto the streaming infrastructure, including your internet provider and streaming service.

What can telcos do to reduce the energy consumption of streaming?

Understanding how energy is used by streaming infrastructure during a live event is a great place for the industry to start getting to grips with the energy consumption of streaming. This is because live video requires all players in the value chain to communicate at one point in time, generating the fullest picture of energy consumption. It will then be relatively easier for ecosystem players to apply the lessons learnt through live streaming to the on-demand context.
This diagram shows the relationship between traffic and network energy consumption during a live streamed event:

Figure 2: The relationship between traffic, network capacity, and power consumption during live-streamed events. Source: Adapted by STL Partners from Schien, Shabajee and Priest

Importantly, it shows that high traffic during a live streamed event does not increase the load or energy consumption of the network. Instead, it shows that the capacity of the network dictates energy consumption, no matter how many people are tuning in to watch. This is a really significant finding, as it shows that energy consumption remains elevated for the entire time that the network has increased capacity (t...
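The capacity-not-traffic point can be illustrated with a toy model. The sketch below is only an illustration of the shape of the argument, not the Schien, Shabajee and Priest model itself; the idle share and watts-per-Gbps constants are invented placeholders:

```python
def network_power_watts(capacity_gbps: float, traffic_gbps: float,
                        idle_share: float = 0.9,
                        watts_per_gbps: float = 1.0) -> float:
    """Toy model: provisioned capacity sets a large fixed draw, and actual
    traffic only modulates the small remainder. All constants here are
    illustrative placeholders, not measured values."""
    p_max = capacity_gbps * watts_per_gbps
    p_idle = idle_share * p_max
    utilisation = min(traffic_gbps / capacity_gbps, 1.0)
    return p_idle + (p_max - p_idle) * utilisation

for traffic in (10, 200, 900):  # a 90x swing in viewers...
    print(f"{traffic:4d} Gbps -> {network_power_watts(1000, traffic):.0f} W")
# ...moves power by only ~10%, because capacity, not traffic, dominates
```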
Microsoft joins Greening of Streaming

Not-for-profit organisation focuses on developing joined-up engineering strategies in the streaming industry to reduce energy waste in the delivery infrastructure
By Jenny Priestley. Published: February 17, 2023

Microsoft has joined the likes of DAZN, V-Nova, Akamai and Intel as a member of not-for-profit organisation Greening of Streaming. The organisation focuses on developing joined-up engineering strategies in the streaming industry to reduce energy waste in the delivery infrastructure.

Simon Crownshaw, worldwide lead for media and entertainment at Microsoft, said: "We hope that by helping to facilitate deep insight into the energy use of our infrastructure we can not only improve our own energy efficiency, but help the industry as a whole learn more about operating streaming infrastructure at scale, and doing so in a cost efficient and sustainable way."

Dom Robinson, founder of Greening of Streaming, added: "Microsoft is undeniably a powerhouse and a thought leader in our sector. Having their participation in our working groups will provide a critical resource for various efforts we are undertaking. Microsoft spans many disciplines in the sector and that joined-up thinking will be critical to transforming how the streaming industry as a whole can ensure a minimal energy footprint, while maximising the user experience."
While news leaked out in both CSI Magazine's recent coverage and in Faultline's Sustainability in Streaming webinar, Greening of Streaming was still getting things lined up to formally open the doors and invite input from across the industry to help the group explore ideas and options in creating an industry-wide 'accord' around Low Energy Sustainable Streaming. At today's IBC Accelerator Kickstart, Dom Robinson, founder of www.greeningofstreaming.org, will open the door to contributions to LESS, seeking the first from the IBC Accelerators, and hoping to seed the challenge of industry-wide engagement in forming an industry accord around action toward Energy Efficiency!

The so-called LESS Accord aims to dig deep into the heart of the broadcast and streaming industry and ask a taboo question of an historically 'quality obsessed' industry: "What if the 'default' streaming encoding profile was Energy Optimised with 'acceptable' quality for general viewing rather than, as it is today, Quality Optimised (and typically over-provisioned) with no energy consideration?"

The fundamental idea is that in many cases consumers cannot tell the difference between various streaming and broadcast service qualities, and increasingly the industry relies on computer-aided techniques to differentiate quality that humans cannot perceive. The idea behind the LESS Accord is to 'give permission' to ask out loud what many engineers in the industry already instinctively, privately think, and to explore how we might be able to deliver services that fulfil consumers' expectations without simply overselling imperceptible quality/value propositions and creating inappropriate, expensive, unsustainable and unnecessary energy demands for no benefit to the viewer.

"It is doubtless a challenging question, and one that will ruffle a few feathers. And it may prove impossible..." notes Robinson "...however, we have got to rigorously explore the possibilities together, or else 'energy saving' strategies may simply end up being used to 'greenwash' over ever-spiralling energy consumption as we take up the savings by further overdelivery of quality we get no value from, in the same way that 'offsetting' has become widely open to abuse in many sectors."

The project will span 18 months. In the first stage, starting today at IBC, GoS is inviting all stakeholders in the industry to contribute ideas (tried or simply educated guesses) about what LESS might look like, what bandwidths should be targeted, what decoding and caching strategies would work and how the end-to-end implementation might be put together. Even outliers and radical ideas need to be considered!

In June, at Greening of Streaming's collaborative event with the Media Tech Sustainability Summit (https://www.mediatechsustainabilitysummit.com/), members will shortlist the contributed ideas that they feel can best be put to the test in production, and invite their authors to present them to a public audience, including invited policy makers, regulators and politicians. Over the summer this year the GoS members will then design tests to evaluate the energy efficiency of the best ideas, and these will be planned to run, as far as possible, in real-world production environments. These ideas will be presented at IBC to the broad industry community for a check and balance, and then from October '23 to March '24 testing will be run in earnest by Greening of Streaming members.
At the end of the test cycle, results will be collated and academic partners from Working Group 9 will be invited to help analyse the outputs. The final data-driven results will then be discussed at a large event planned in Burbank to engage 'Golden Eyes' from Hollywood studios and gather their final subjective opinion, based around the notion that 'you normally mastered these for the big screen, but based on our tests, here is what the content would look like when energy optimised for mass distribution on the small(er) screen - what ar...
www.greeningofstreaming.org - a not-for-profit members organisation focussing the streaming industry on developing joined-up engineering strategies to reduce energy waste in the delivery infrastructure - today welcomes Cognizant as a new member!

Media companies have transformed from broadcasters - delivering a one-to-many experience at scale - to creating millions of individual and personalised experiences. This is a natural driver for energy-intensive technologies and processes across the media value chain, and it challenges the committed sustainability goals which are key for media brands to thrive. Cognizant's Media and Entertainment practice invests in technology innovation that delivers commercially viable sustainability improvements - using technology to increase efficiency in ways that reduce cost and emissions.

"We believe that meaningful change in our industry will come from within, and we look forward to working with Greening of Streaming to drive sustainability across the entire value chain," said Manju Kygonahally, Head of Communications, Media and Technology at Cognizant.

Dom Robinson, founder of Greening of Streaming, commented: "Cognizant are hugely influential in helping many in our sector think about their strategy, and we look forward to active input from Cognizant as we drive our various Working Groups forward. We are both very excited about the long-term possibilities to help develop and drive energy-related best practice through the streaming industry."
Amazon's lack of transparency is hurting itself and investors.

After years of kneeling at the altar of long-term thinking and bold experimentation, Amazon (AMZN 1.29%) finally seems to be getting religion on the bottom line. The company is cutting costs like never before, taking an ax to formerly promising new ventures like Amazon Care and announcing its first major round of layoffs, dismissing 10,000 corporate employees. A sizable portion of those layoffs are coming from the company's Alexa and Devices division, where, according to Business Insider, the company is losing $10 billion a year on the voice-activated technology.

The revelation that Alexa has become a financial quagmire begs the question of what else Amazon is wasting money on. Its international business seems like one target for savings. The division, which is mostly made up of e-commerce operations outside of North America, loses money in most years, and has posted an operating loss of $5.5 billion through the first three quarters of this year.

Another division that seems deserving of more scrutiny is Prime Video, which Amazon doesn't directly account for on its financial statements. With the exception of a la carte spending on streaming rentals and sales, Amazon doesn't directly monetize Prime Video, using it instead as a perk to attract Prime members. The company spent $13 billion on video and music content in 2021, and video spending was estimated to increase to $15 billion in 2022, including sports. That's more than all of its peers, including even Netflix (NFLX 0.42%), which was expected to spend $13.6 billion on an amortized basis this year. Since video is only loosely connected to online shopping, investors should be asking if that $15 billion is money well spent.

Prime-o-nomics

Unlike peers like Netflix or Disney (DIS 0.28%), Amazon keeps its subscriber metrics close to the vest. Last April's shareholder letter said the company now has 200 million Prime members globally. Amazon charges different prices for Prime around the world, but if you assume every one of those members paid $139 a year, that would bring in $27.8 billion in annual membership fees. That's close to the $34.1 billion it brought in with subscriber fees over the last four quarters, which comes from Prime and other subscription businesses like Kindle Unlimited, Audible, and Amazon Music. Based on those numbers, $30 billion in Prime revenue seems to be a fair estimate.

That means Amazon is spending half of its Prime membership revenue on Prime Video, leaving only $15 billion to be allocated toward other Prime benefits like two-day shipping and free returns, the biggest driver of membership. Amazon also spent $82.4 billion on shipping costs over the last year, and at least some of that is allocated to Prime.

Of course, the argument for spending on Prime Video is that signing up new Prime members and incentivizing existing ones encourages them to do more online shopping on Amazon. Explaining the company's differentiated approach to Hollywood, founder Jeff Bezos once said: "When we win a Golden Globe, it helps us sell more shoes." That line of thinking also explains why the company is shelling out $1 billion annually on Thursday Night Football, which it said led to a record number of Prime sign-ups in a three-hour period. However, there are likely diminishing marginal returns to spending on Prime Video at this point. Anyone who is a frequent online shopper has probably already joined Prime, which has been around since 2005.
The relationship between video streaming and online shopping is also tangential. If Amazon has $15 billion to spend to improve Prime benefits, would it not be better off spending it on faster delivery, better customer service, or lower prices? There's nothing magical about the relationship between Prime Video and online shopping. The streaming reality Amazon is ramping up video spending at a time when nearly all of its competitors are tightenin...
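For anyone retracing the article's back-of-envelope math, here is a small sketch using only the figures stated above (the flat $139 fee applied to every member worldwide is the article's own simplifying assumption):

```python
members = 200_000_000   # Prime members, per Amazon's April shareholder letter
us_fee = 139            # USD per year, assumed for every member worldwide
video_spend = 15e9      # estimated 2022 video + sports content spend
prime_revenue = 30e9    # the article's rounded estimate of Prime revenue

print(f"fee revenue if all pay $139: ${members * us_fee / 1e9:.1f}B")  # $27.8B
print(f"video spend share of Prime revenue: {video_spend / prime_revenue:.0%}")  # 50%
```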
In 1984 - the book, not the year - the means by which the evil totalitarian regime "Big Brother" retains its power is something called "doublethink." It's the practice of holding contradictory beliefs in tandem: "war is peace," "freedom is slavery," "ignorance is strength," "2 + 2 = 5," to use the book's examples. It worked because when our minds - our sense of logic, our morality - become compromised, they're easier to control.

Considering the events of the last several months, you could also interpret doublethink to mean things like "the metaverse is the future," "people will pay millions of dollars for shitty art," or "this crypto billionaire definitely has my best interests in mind." It's a trite reference, but it's sort of the only one that makes sense. Somehow, somewhere along the way, the American public was duped into believing that these things could be true despite being, well, not.

On November 11, the 30-year-old CEO of the cryptocurrency exchange FTX, Sam Bankman-Fried, resigned after his firm filed for bankruptcy. Prior to its implosion, Bankman-Fried (colloquially referred to as SBF) was regarded as a boy genius in the crypto world, not only because of his billionaire status but because he was widely considered to be "one of the good ones," someone who advocated for more government regulation of crypto and was a leader in the effective altruism space. Effective altruism (EA) is part philosophical movement, part subculture, but in general aims to create evidence-backed means of doing the most good for the most people. (Disclosure: This August, Bankman-Fried's philanthropic family foundation, Building a Stronger Future, awarded Vox's Future Perfect a grant for a 2023 reporting project. That project is now on pause.)

Instead, Bankman-Fried did the opposite: He tanked the savings of more than a million people and may have committed fraud. In a conversation with Vox's Kelsey Piper, he essentially admitted that the do-gooder persona was all an act ("fuck regulators," he wrote, and said that he "had to be" good at talking about ethics because of "this dumb game we woke westerners play where we say all the right shibboleths and so everyone likes us").

In terms of corporate wrongdoing, the SBF disaster is arguably on par with Enron and Bernie Madoff. Here was a dude who marketed himself as a benevolent billionaire and convinced others to invest their money with him simply because he was worth $26 billion (at his peak). He partnered with celebrities like Tom Brady and Larry David to make crypto - a wildly risky investment that rests on shaky technology - seem like the only way forward. Both Brady and David, among several other famous people, are now being accused in a class-action suit of defrauding investors amid FTX's collapse.

But there have been other examples of technological doublethink in recent history. Over the past year, Mark Zuckerberg has campaigned so hard for the mainstreaming of the "metaverse" that he changed the name of one of the world's most powerful companies to reflect his ambitions. His metaverse, though, called Horizon, would end up looking like a less-fun version of The Sims, a game that came out in the year 2000 (but even The Sims had legs). The strategy has not, as of publication time, paid off. The company has lost $800 billion in market value.
What's ironic, though, is that anyone with eyeballs and a brain could have simply told Zuckerberg that Horizon is terrible. Not only is it ugly and functionally useless, it's also expensive (VR headsets cost hundreds of dollars at minimum). People did, to be sure, tell him that - since its rollout, the platform has been widely mocked in the media and online - it's just that Zuckerberg hasn't listened. There's this thing in tech where entrepreneurs tell themselves that their job is to innovate. They are the builders, ...
BHITS 2022

2022-11-14 00:31
Film, TV and online/VoD productions cause high CO2 emissions and consume many resources. A significant part of these CO2 emissions can be avoided by ...
Well, Meta sure is in a bit of a mess. The company formerly known as Facebook rang in its one-year anniversary last week but had very little reason to celebrate. Instead, an unfortunate Q3 earnings report showed that, since its inception last October, the company has lost a gargantuan amount of money in its quest to create "the metaverse" - a hypothetical new realm where it wants all of us to live. How did we end up here, exactly?

It all started twelve months ago, when, in the heat of a whistleblower scandal, it looked like Congress might actually crack down on Facebook. Leaked documents - what came to be known as the Facebook Papers - had revealed the company's harmful impact on young people, its ineptitude with misinformation, and its algorithmic toxicity. As a result, regulation talk was afoot. U.S. Representatives were threatening antitrust action and activists were demanding a breakup. Things were looking pretty bad.

It was then that a thunderclap of inspiration must've struck over a Menlo Park boardroom somewhere: if things were getting too hot to handle in the real world, why couldn't Facebook simply invent a new world? Yesss…a new world - this could be the pivot of a lifetime! And hey, the company had changed the rules of the game before - it could definitely innovate its way out of this. Thus, after a meeting I'm sure resembled some watered-down version of that "change the conversation" scene from Mad Men, The Facebook Company became "Meta Platforms" and something called the "metaverse" was born.

What was the metaverse? Zuck and his cohort envisioned a bold digitization of our world - supported by hardware and infrastructure that hadn't been built yet. It would be fueled by investments in the most emergent and exciting technologies, from virtual reality to augmented reality to holograms to cryptocurrency. As the leader of a push to transform the digital economy, Meta could be a pioneer - an explorer going where no tech firm had gone before. Sure, in a lot of cases, the tech wasn't quite there yet to actually build this world, but, in the meantime, such shortcomings could be obscured via advertising and animation and hyperbolic rhetoric. All of this could be used to sorta…paint the picture of what the metaverse would look like someday…maybe. Anyway, what did it matter? The point was this: the company had to do something big to make people look at it differently - and this was it.

Yes, Facebook's transformation into Meta always had to be two things at once: a desperate optics shift and a genuine redirect in business strategy. Maybe the company had always envisioned broader investments in AR/VR but crisis forced it to accelerate? We don't really know. What we do know is that the company's massive pivot to a place called the "metaverse" seems to have only caused it more headaches over the past year: namely, billions spent on dubious investments, plummeting profits, worried investors, and a slew of hackneyed digital products that people don't actually want to use. In a word, Meta's "first year" has been terrible. Will things get better? That's unclear. Zuck certainly thinks so, though others have their doubts. We decided to take a look back at the past twelve months to highlight key events involving "the metaverse" - an imagined place that Meta has promised to build but that, as far as we can tell, still doesn't exist yet.

OCTOBER 2021

Our story begins in the dark days of early October, 2021, when the company known as Facebook is besieged on all sides.
Frances Haugen, a former employee turned traitorous whistleblower, has leaked extensive documentation of the company's sins to the press. The Facebook Papers, first reported by the Wall Street Journal, expose a raft of concerns: antitrust issues, privacy issues, psychological health issues - the list seems endless. Meanwhile, a host of longstanding problems (legal complaints, public distrust, congressional inquiries) compound to place ever greater pressure on what was o...