The Xiaomi Sound Pocket and the Xiaomi Sound Outdoor are the two new Bluetooth speakers that Xiaomi is introducing to the global market.
Specifications of the Xiaomi Sound Outdoor speaker:
The Xiaomi Sound Outdoor speaker is available in black, blue, and red. With a size that fits in most luggage, rubber feet, and a carrying strap, it is designed to be both rugged and easy to carry. Despite its compact size, the speaker's integrated tweeter, two passive radiators, and subwoofer allow it to produce powerful sound at a 30 W output.
With an IP67 rating for dust and water protection, the speaker is also built to withstand harsh conditions. It has a 2600 mAh battery that lasts up to 12 hours at 50% volume, control buttons for convenient operation, and Bluetooth 5.4 for seamless device connections. It can be paired with another speaker for stereo sound, or linked with up to 100 units to create a bigger sound system.
Conversely, the Xiaomi Sound Pocket provides a more portable option without sacrificing performance. This compact, black speaker is ideal for personal use, with an IP67-rated waterproof and dustproof shell and a 5 W output. Lightweight and portable, it can play for up to 10 hours at 40% volume.
Although the company has not yet disclosed precise release dates for North America and Europe, we anticipate an announcement shortly.
In the early Universe, particle physics was supreme.
ESSENTIAL NOTES: With particle accelerators such as the Large Hadron Collider, scientists can recreate the conditions of the early Universe, providing insights into everything from the Big Bang to the birth of atoms. These experiments shed light on how the early Universe developed into the complex cosmos of galaxies and stars we see today. This fusion of theoretical and experimental physics has significantly improved our understanding of the Universe's earliest epochs.
According to the Big Bang hypothesis, the Universe was considerably hotter some 14 billion years ago. But how can we know for sure what the Universe was like all those years ago? A time machine would help, but that technology does not exist. So scientists do the next best thing: they use particle accelerators to simulate the conditions of the early Universe in the lab. In this way, data from particle physics experiments can provide a window into the Universe's earliest stages.
Nevertheless, it is vital to understand both the capabilities and the limits of this approach. The Big Bang hypothesis describes several epochs, each with a distinct temperature and energy, and these eras are not fully understood. The Universe's earliest moments, in particular, remain mysterious, and what we know about them is based on conjecture. But within a tiny fraction of a second, the early Universe reached conditions that can be tested with current technology.
Replicate the early Universe
In the Large Hadron Collider (LHC), the most powerful particle accelerator currently in operation, protons are accelerated to nearly the speed of light and collide head-on. The kinetic energy of the protons is transformed into heat that can reach temperatures 100,000 times hotter than the Sun's center, conditions the Universe last experienced less than a trillionth of a second after it began.
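As a rough sanity check on that temperature claim, we can convert the quark-gluon plasma crossover scale (commonly quoted at around 170 MeV, a standard figure not taken from this article) to kelvin via the Boltzmann constant and compare it with the Sun's core temperature:

```python
# Back-of-the-envelope check: convert the quark-gluon plasma
# crossover energy (~170 MeV) to kelvin and compare with the
# temperature at the Sun's core (~15.7 million K).
K_B_EV_PER_K = 8.617e-5          # Boltzmann constant in eV/K

qgp_energy_ev = 170e6            # ~170 MeV, typical QGP crossover scale
qgp_temperature_k = qgp_energy_ev / K_B_EV_PER_K   # roughly 2e12 K

sun_core_k = 1.57e7              # temperature at the Sun's center, in K

ratio = qgp_temperature_k / sun_core_k
print(f"QGP temperature: {qgp_temperature_k:.2e} K")
print(f"Ratio to solar core: {ratio:.0f}x")  # on the order of 100,000
```

The result, a factor of roughly 10^5, is consistent with the "100,000 times hotter" figure in the text.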
Other studies have looked at what happened to matter when the Universe cooled enough to leave the realm of particle physics and enter the era of nuclear physics. The composition of the Universe and its governing principles were set before it was a few minutes old. Although it would take hundreds of thousands of years for the Universe to cool enough to form atomic hydrogen and helium, the hydrogen and helium nuclei that would make up the first stars already existed three minutes after the Universe began. For hundreds of millions of years after atoms formed, gravity dominated, eventually giving birth to the earliest stars, a point at which nuclear physics became essential once more.
So which eras of the early Universe can particle accelerators probe? Let's start the story in an era about which there is still much to learn. Cosmologists think that at a very early epoch, about 10^-36 to 10^-32 seconds after the Universe began, it went through a phase of expansion at speeds faster than light. We call this the inflationary period. Although there is a lot of indirect evidence to support it, inflation has not been proven to have happened; as of this writing, it remains a theoretical concept.
At the end of inflation, the Universe was hot and dense, very different from what it is today. It was much too hot for atoms to exist. The same was true of protons and of quarks, the particles that reside within protons and neutrons. It is believed that neither mass nor electric charge yet existed, meaning the entire Universe was pervaded by massless, highly energetic particles.
From experimental discoveries to theoretical physics
Scientists do not know exactly what occurred in the Universe before around 10^-13 seconds. One reason is that we lack the technology to concentrate energy in a way that allows us to probe those earlier times. At the LHC, however, pairs of protons traveling at nearly the speed of light collide head-on, and the maximum energy produced in one of those collisions recreates temperatures the Universe last experienced about 10^-13 seconds after it began.
That capability gives us a far better grasp of how the Universe evolved. At around 10^-12 seconds, the Higgs field, an energy field that pervades the Universe, came into being. Particles acquired their mass through interactions with this field. Electric charge appeared at the same moment. Instead of only massless energy, the Universe now contained particles with mass, known as quarks and leptons. Today, quarks are found only inside protons and neutrons, and the electron is the most familiar lepton. In 2012, the Higgs boson, a vibration of the Higgs field, was discovered. (Disclosure: The writer was involved in that discovery.)
At that early time, however, quarks were not confined to protons and neutrons; they were free to move about. The early Universe was too hot for protons and neutrons to exist, so a proton would effectively melt and release its constituent quarks, much as an ice cube placed on hot pavement melts and lets the water flow freely.
As time went on, the Universe expanded and cooled further. By the time it was one millionth of a second old (10^-6 s), quarks were no longer free to move about: the strong interaction bound them together into protons and neutrons. There were also electrons, along with a strange particle called the neutrino. Neutrinos are subatomic particles with very low mass that interact only very weakly with matter. Today they are produced by nuclear processes and have little impact on the cosmos. At 10^-6 seconds, however, the Universe was so dense that neutrinos interacted frequently with the protons, neutrons, and electrons that dominated it.
By the time the Universe was one second old, it had become dilute enough that neutrinos essentially stopped interacting with other forms of matter. In fact, these primordial neutrinos, which last interacted with matter a mere second after the Universe began, are still abundant in the modern Universe. Within the next ten years, very sensitive instruments should be able to detect them.
Over the following few minutes, the expanding Universe cooled to the point where protons and neutrons could begin to bind together into atomic nuclei. Had the Universe's density remained high, the nuclei of all known elements might have formed. But because the density dropped so quickly, only the simplest nuclei could develop. By the time the Universe was three minutes old, hydrogen nuclei (single protons) and helium nuclei (two protons and two neutrons) had formed, along with rare isotopes of hydrogen (deuterium and tritium).
By three minutes, the Universe was about 75% hydrogen and 25% helium by mass. (Since helium nuclei weigh four times as much as hydrogen nuclei, counting by number of nuclei the ratio would be around 92% hydrogen and 8% helium.) Trace amounts of a few other nuclei were present, but most other elements would not exist until they were synthesized in the cores of stars.
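The conversion from mass fractions to number fractions is simple arithmetic; here is a quick sketch, using the usual approximations of 1 u for a hydrogen nucleus and 4 u for helium-4:

```python
# Convert primordial mass fractions (75% H, 25% He) into
# number fractions, using approximate nuclear masses of
# 1 u for hydrogen and 4 u for helium-4.
h_mass_fraction = 0.75
he_mass_fraction = 0.25

# Relative number of nuclei per unit mass = mass fraction / nuclear mass
h_count = h_mass_fraction / 1.0
he_count = he_mass_fraction / 4.0

total = h_count + he_count
h_number_fraction = h_count / total     # ~0.923
he_number_fraction = he_count / total   # ~0.077

print(f"By number: {h_number_fraction:.0%} H, {he_number_fraction:.0%} He")
```

This reproduces the roughly 92% hydrogen, 8% helium figure quoted above.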
This is the tale of how particle accelerators shape our understanding of the early Universe. The story was by no means finished, of course: the first atoms formed around 380,000 years after the beginning, when the Universe cooled enough for hydrogen and helium nuclei to capture electrons, and gravity then gradually gathered those atoms into hot clumps that eventually became stars and galaxies.
We now have a fairly advanced understanding of the Universe's nature, from shortly after its birth through its first few minutes. More crucially, that understanding rests on sophisticated and thorough measurements rather than theoretical conjecture: by using enormous "atom smashers," scientists can truly reproduce the conditions of the early Universe and see how things work.
The first accounts of the quest to discover the origins of our universe may be found in some of humankind’s oldest literature. Astronomical observations along with research carried out within massive particle accelerators are starting to provide a pretty clear picture of how it all started.
Total Solar Eclipse 2024: A solar eclipse occurs when the Moon passes between the Earth and the Sun, blocking the Sun’s light completely or partially.
An astronomical extravaganza is likely to enthrall skywatchers on April 8, when a total solar eclipse will transform day into darkness across North America. Total eclipses are stunning and darken the sky, but they can only be seen from a few sites. This is why seeing an eclipse is frequently described as a once-in-a-lifetime experience.
What is a solar eclipse?
A solar eclipse is an astronomical phenomenon in which the Moon passes between the Earth and the Sun, blocking the Sun's light completely or partially.
When the Moon totally eclipses the Sun, it casts a shadow on Earth, creating a "path of totality." This path is a narrow band that travels across Earth's surface. People standing inside this band can see a total solar eclipse if the weather cooperates. Along the path of totality, where the Moon completely covers the Sun, the sky will darken as if it were dawn or dusk.
Skywatchers outside the path of totality will only observe a partial eclipse. For them, the sky will appear slightly darker during the eclipse, depending on how much of the Sun the Moon blocks from their location.
Total Solar Eclipse Date and Timing:
The total solar eclipse of 2024 will take place on April 8. Totality, the complete darkening of the sky, will be visible along a roughly 185-kilometer-wide path running through Mexico, the United States, and Canada. In the United States, it will be visible in up to 18 states. Skywatchers in India, however, will not be able to see it.
In Indian Standard Time (IST), the eclipse will begin at 9:12 p.m. on April 8, reach totality at 10:08 p.m., and end at 2:22 a.m. on April 9, 2024. Totality will first occur on the Pacific coast of Mexico at 11:07 a.m. PDT, and the eclipse will exit Maine at about 1:30 p.m. PDT.
Duration of the Total Solar Eclipse
Although totality will only last around four minutes at any given location, the entire eclipse will take roughly two and a half hours. According to NASA, the period of complete darkness may last up to 4 minutes and 27 seconds at its longest.
That is almost double the duration of totality during "The Great American Eclipse" of August 21, 2017. For most locations along the centerline (the path of totality), totality will last between 3.5 and 4 minutes.
How Can I Watch the Eclipse Safely?
The Sun's surface is so bright that staring at even a small fraction of it can damage individual retinal cells. During the eclipse's partial stages, skywatchers everywhere are urged to wear protective eyewear, such as certified eclipse glasses. Without it, you risk burning your retinas, causing irreversible damage or even blindness.
How Can I Watch a Total Solar Eclipse Online?
If you are unable to see the solar eclipse in person, you can watch NASA's live feed. The space agency will broadcast live on April 8 from 5:00 p.m. GMT (10:30 p.m. IST) until 8:00 p.m. GMT (1:30 a.m. IST).
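Those GMT-to-IST conversions are a fixed +5:30 offset. If you want to double-check the broadcast times for your own time zone, Python's standard zoneinfo module handles it; the datetime below is the broadcast start time given in the text:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# NASA broadcast start: April 8, 2024, 5:00 p.m. GMT
start_gmt = datetime(2024, 4, 8, 17, 0, tzinfo=ZoneInfo("UTC"))

# Convert to Indian Standard Time (UTC+5:30)
start_ist = start_gmt.astimezone(ZoneInfo("Asia/Kolkata"))
print(start_ist.strftime("%I:%M %p %Z"))  # 10:30 PM IST

# Any other zone works the same way, e.g. U.S. Eastern:
start_et = start_gmt.astimezone(ZoneInfo("America/New_York"))
print(start_et.strftime("%I:%M %p %Z"))   # 01:00 PM EDT
```

Note that zoneinfo (Python 3.9+) accounts for daylight saving automatically, which is why the Eastern time comes out as EDT rather than EST in April.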
In addition, NASA will be conducting expert chats and offering telescopic views of the eclipse from several locations along the eclipse path during the program.
Additionally, you can watch the live broadcast hosted by the McDonald Observatory in Texas. The skywatching website timeanddate.com will also broadcast live coverage of the total solar eclipse on its YouTube page starting April 8 at 4:30 p.m. GMT (10:00 p.m. IST).
Samsung began rolling out the One UI 6.1 update to the Galaxy S23 series in late March, and since then, several users have reported touchscreen issues. Specifically, the touchscreen on these devices occasionally fails to register touch input properly, requiring several taps.
Undoubtedly, this is quite annoying, and Samsung has finally acknowledged the issue, but there’s a catch. According to the Korean company, this is primarily Google’s responsibility, as the problem appears to be caused by the Google app’s Discover feed, which can be displayed to the left of your home screen.
Samsung said Google is aware of the issue and is working to resolve it permanently. Until then, Samsung is providing a temporary solution to concerned users: erase the Google app’s data and restart their device. Once it is completed, the touchscreen should properly register all touch events.
To clear the Google app's data, navigate to Settings > Apps > Google > Storage and tap Clear data. You may need to log back into your Google account afterward.
Google is most likely to resolve the issue with an update to the Google app, so make sure you have automatic updates turned on in the Play Store. If you're eager, you can manually update all apps in the Play Store by tapping your profile icon, selecting Manage apps and device, and then tapping the Update all button.
When Apple introduced the iPhone 15 series in 2023, it elicited mixed reactions due to overheating issues that afflicted nearly every handset. The company initially attributed the problem to a software bug and released an update to fix it.
However, complaints have surfaced of iPhone 15 series customers experiencing high temperatures once more. Is this just another problem, or is there a fundamental weakness in the design that has shown as summer temperatures rise? Let’s find out.
iPhone 15 overheating: Is it a software fault or a design flaw?
Initially, the iPhone 15 series' overheating was attributed to software flaws that caused the phones to run unusually hot after setup. However, as summer temperatures rise, reports of overheating iPhones have resurfaced. While it is normal for phones to get warm in hot weather, the fact that iPhones become unusable as temperatures climb is aggravating, to say the least.
The upgraded processor in the iPhone 15 series could be one cause of the heating problem. According to Notebookcheck, Apple expected these devices to run warmer than their predecessors. Although this information is unconfirmed, it offers one plausible explanation. Furthermore, Apple's reported consideration of a graphene cooling solution for the iPhone 16 series suggests the heating issue is more serious than previously thought.
How can I keep my iPhone 15 cool?
If the heating problem stems from the hardware's cooling design, there may be no single fix. However, you can take several steps to keep your iPhone within safe operating temperatures.
1. Turn off Bluetooth and other radios.
If you feel your iPhone heating up while on the go, the first step is to turn off Bluetooth, Wi-Fi, and location services. While these functions are useful, they consume a significant amount of power, adding to the device's heat buildup.
2. Remove the case.
Almost everyone uses a case to protect their phone, as repairs can be costly. However, because these cases form an enclosure around the device, they trap any heat it generates. Remove the case in warmer environments to improve airflow and heat dissipation.
3. Turn on Airplane mode.
If your iPhone has already overheated and you need it to cool down immediately, turn on Airplane Mode. This disables all wireless signals and associated processes, freeing up system resources and helping the device cool down.
4. Update to the latest version.
If the preceding steps do not work, a software bug could be to blame for the overheating. In that case, updating to the most recent iOS version is recommended, as Apple frequently patches these bugs.
Lenovo has just teased the 2024 YOGA Pro laptop series in China. The series includes models such as the YOGA Pro 14s High-Performance and the YOGA Pro 16s Supreme Edition. Both laptops feature impressive hardware, particularly the latest Intel Core Ultra CPUs, which cut power usage by up to 25% while still delivering excellent performance.
They also incorporate Intel AI Boost technology, which is designed to accelerate AI-dependent operations such as image and video editing, potentially providing up to 70% faster performance in generative AI tasks. These laptops include six speakers supporting Dolby Atmos and a PURESIGHT display. They are also claimed to be optimized for professional creative software like Photoshop and Premiere Pro.
Lenovo Yoga Pro 16s Supreme Edition
The YOGA Pro 16s Supreme Edition has a 16-inch, 3200 x 2000, 165Hz Mini LED touch display with a maximum brightness of 1000 nits. It comes with two processor options, the Intel Core Ultra 7 155H and Ultra 9 185H, along with 16GB, 32GB, or 64GB of LPDDR5x RAM and up to four M.2 NVMe SSDs.
The laptop will be available with three GPU options, NVIDIA's RTX 4050, 4060, and 4070, which, paired with the high-resolution, high-refresh-rate display panel, make it more than capable of running recent AAA games. The Supreme Edition includes a 5MP webcam with face recognition for security and video calls. It has an 84Wh battery and, depending on the configuration, a 100W or 170W power adapter.
Lenovo Yoga Pro 14s High-Performance
The high-performance variant includes Core Ultra 5 125H, Ultra 7 155H, and Ultra 9 185H CPU options, as well as RTX 4050 / 4060 GPUs. It includes four PCIe 4.0 slots for SSDs and two RAM options: 16GB or 32GB of LPDDR5x. Customers will have three display options: a 14.5-inch 2.5K IPS panel, a 2.8K OLED panel, and a 3K IPS touchscreen.
It boasts a 1080p webcam that also supports face recognition. The onboard battery has a capacity of 73Wh and comes with a 100W or 140W power adapter, depending on the model. Despite the muscular specifications, it will be relatively lightweight, weighing between 1.49kg and 1.59kg depending on the model.
Both of these laptops will be released on April 18 in China. We expect them to enter the worldwide market under a different name. Stay tuned for future updates.
LinkedIn revealed to TechCrunch on Wednesday that it is testing a new short-form video stream similar to TikTok. With this new test, LinkedIn joins a long list of major apps that have launched their own short-form video streams in response to TikTok’s popularity, including Instagram, YouTube, Snapchat, and Netflix.
LinkedIn’s Latest Experiment: TikTok-Style Video Feed for Professional Content
Austin Null, a strategy director at the influencer agency McKinney, discovered the feed first. Null shared a brief demonstration of the new feed on LinkedIn; it can be found in the app's navigation bar under a new "Video" option. Tapping the Video button takes you to a vertical feed of short videos that you can swipe through. You can respond to a video by liking it, leaving a comment, or sharing it. The company has not provided details on how the feed selects which videos to show users.
The new feature is comparable to the vertical short-form video feeds found in other apps, but whereas those feeds contain a wide range of content from comedy to food videos, LinkedIn's is clearly geared toward careers and professionalism. While it has always been possible to upload videos to LinkedIn, the new dedicated feed is intended to increase engagement and discovery on the network by surfacing bite-sized videos that users can quickly skim through.
According to Microsoft-owned LinkedIn, videos are becoming one of its users’ preferred formats for learning from professionals and experts, thus the company is exploring a new way for users to discover relevant videos. Because the function is still in early stages of development, most users will not be able to use it right now.
The new feature arrives as many TikTok creators have built significant followings by sharing advice and experiences on topics such as career growth, job searches, and professional development. LinkedIn's new feed would give creators a new platform to share their video content, potentially reaching a larger audience. LinkedIn may eventually monetize the feed to incentivize creators to upload their videos to the app.
Although the feature opens up new options for creators, some users may not see the new feed as a welcome addition to the app, as they may already feel overwhelmed by the short-form video feeds saturating popular apps.
Apple researchers have published a new study claiming that their ReALM language model outperforms OpenAI's GPT-4 at "reference resolution."
Apple’s ReALM Language Model Beats GPT-4 in Reference Resolution Benchmark
Apple researchers submitted a preprint on Friday for their ReALM large language model, claiming that it can "substantially outperform" OpenAI's GPT-4 in specific benchmarks. ReALM is designed to understand and handle a variety of contexts. In theory, this would let users point to something on the screen or in the background and ask the language model about it.
Reference resolution is a language problem that involves determining what a specific expression refers to. For example, when we speak, we use pronouns like "they" and "that." What these words refer to may be obvious to humans, who understand context, but a chatbot like ChatGPT may not always understand what you mean.
Chatbots would benefit greatly from being able to grasp precisely what is being said. According to Apple, the ability for users to refer to something on a screen using “that” or “it” or another word and have a chatbot comprehend it precisely is critical to delivering a genuinely hands-free screen experience.
This is Apple’s third AI paper in recent months, and while it is still too early to forecast anything, these papers can be viewed as an early preview of capabilities that the firm intends to incorporate in its software offerings such as iOS and macOS.
According to the study, the researchers intend to use ReALM to recognize three types of entities: onscreen entities, conversational entities, and background entities. Onscreen entities are things that appear on the user's screen. Conversational entities are those relevant to the conversation. For example, if you ask a chatbot "what workouts am I supposed to do today?", it should be able to determine from past conversations that you are on a three-day workout program and what your daily routine is.
Background entities are objects that do not fit within the first two categories but are nonetheless relevant. For example, there could be a podcast playing in the background or a notification that just went off. Apple wants ReALM to recognize when a user refers to these as well.
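To make those three entity categories concrete, here is a toy sketch of how a reference like "that" might be resolved against a set of candidates. The class, salience scores, and rule-based scoring are all invented for illustration; ReALM itself is a fine-tuned language model, not a rule system like this.

```python
from dataclasses import dataclass

# Toy illustration of reference resolution over the three entity
# types described in the paper. The scoring below is invented for
# this sketch and is not how ReALM actually works.

@dataclass
class Entity:
    name: str
    kind: str        # "onscreen", "conversational", or "background"
    salience: float  # how prominent/recent the entity is (0..1)

def resolve_reference(pronoun: str, candidates: list[Entity]) -> Entity:
    """Pick the most salient candidate, preferring onscreen entities
    for deictic words like 'that', since the user is likely referring
    to something visible."""
    def score(e: Entity) -> float:
        bonus = 0.2 if (pronoun == "that" and e.kind == "onscreen") else 0.0
        return e.salience + bonus
    return max(candidates, key=score)

candidates = [
    Entity("phone number on screen", "onscreen", 0.7),
    Entity("three-day workout plan", "conversational", 0.6),
    Entity("podcast playing now", "background", 0.5),
]
resolved = resolve_reference("that", candidates)
print(resolved.name)  # phone number on screen
```

The point of the sketch is the data model: every candidate, whether onscreen, conversational, or background, competes in the same resolution step.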
“We show significant improvements over an existing system with comparable capability across several sorts of references, with our smallest model achieving absolute benefits of more than 5% for on-screen references. We also benchmark against GPT-3.5 and GPT-4, with our smallest model performing similarly to GPT-4 and our larger models significantly outperforming it,” said the researchers in their report.
Keep in mind, however, that GPT-3.5 accepts only text, so the researchers' input was limited to the prompt. With GPT-4, they also included a screenshot for the task, which significantly improved performance.
"Please keep in mind that our ChatGPT prompt and prompt+image formulation are, to the best of our knowledge, innovative in their own right," the researchers note. They suggest that a more sophisticated approach, such as sampling semantically similar utterances up to the prompt length, could improve results, but leave this for future work.
So, while ReALM outperforms GPT-4 in this particular benchmark, it would be far from correct to say the former is a better model than the latter. ReALM simply beat GPT-4 on a benchmark it was explicitly built to excel at. It is also unclear when or how Apple intends to integrate ReALM into its devices.
WhatsApp has made it much easier for you to operate the app with one hand on your Android phone.
WhatsApp is now rolling out a significant upgrade to the app's user interface on Android handsets. The Meta-owned messaging service has announced that it has relocated the four navigation tabs from the top of the screen to the bottom. The redesigned interface has been available in the beta version of the app for several months and is now rolling out to all users. The new design will also make it easier to switch between tabs while holding the phone in one hand.
WhatsApp announced the new bottom navigation bar for Android via a post on X (formerly Twitter). The service shared screenshots of the previous UI, with its four tabs (communities, chats, status, and calls) above the chat list. With the latest update, users will notice that the tabs and icons have moved to the bottom, and their arrangement may have been modified.
This is one of the most significant updates to the WhatsApp for Android layout in a long time, and it brings the app’s primary navigation tabs within reach of your thumb, which is especially useful if you only have one hand free. However, if you want to search for something, you’ll still have to go to the top of the screen. WhatsApp for iOS likewise displays tabs at the bottom of the screen, with a fifth one for the settings menu.
While the bottom navigation tabs may finally be available to all users following months of testing on the beta channel, WhatsApp is working on more improvements that may be added to the app in the future. The messaging service is reportedly working on the ability to make international payments through the app using the NPCI's Unified Payments Interface (UPI) in India.
The messaging service is also planning to introduce new AI-powered capabilities to its app. WhatsApp is working on a feature that would allow users to submit text prompts to create stickers using artificial intelligence (AI) in the app. Similarly, WhatsApp was recently spotted implementing AI-powered image editing tools, as well as a feature that lets users ask Meta AI questions via the WhatsApp search bar. These features have yet to roll out to WhatsApp users on either the stable or beta versions.
Fresh week, fresh crypto horoscope for April 1st–7th.
This week will be marked by two transits.
Mercury retrograde in Aries begins on Monday, April 1st;
Venus enters Aries on Friday, April 5th.
For several months, we have dedicated space to the crypto horoscope authored by Stefania Stimolo, an astrology and blockchain specialist. It is a weekly section with the horoscope for each zodiac sign that appears every Sunday on The Cryptonomist.
With this entertainment column, we intended to take a deeper, tongue-in-cheek look at the matter, in keeping with our tagline "We Tell the Future".
The Crypto Horoscope
We call it a crypto horoscope because it is written in the industry's lingo.
Words like NFT, metaverse, and over-the-counter describe actions and events, while trading terms like bullish, bull run, bear market, and dump define each zodiac sign's attitude for the week.
Obviously, the renowned to-the-moon symbol must be present to express the tone of that sign!
In general, you may experience a period of "hard fork," also known as an "inner split," or pass your lightning torch to the next zodiac sign, indicating that the Sun is moving into that sign.
Alternatively, you should consider specific cases where the planet is in discord with the zodiac sign, known as “verification.” Furthermore, with each new change of guard of the Sun across the constellations of the zodiac, the roadmap of each sign will take a new stride.
Obviously, no investment advice is offered; like any other horoscope, it is purely for entertainment. It is worth noting that many newcomers to the industry have mastered the specialized crypto vocabulary thanks to the horoscope on The Cryptonomist.
“Do not Trust, Verify”
Astrology is not a precise science, but it does attempt to foretell the future in its own manner. So why not include the traditional blockchain motto “Don’t Trust, Verify” here as well?
In reality, the author wishes to provide her interpretation of the planetary transits that occur during the week, detailing the reactions of each zodiac sign in accordance with the “logic” of conventional astrology.
Astrology enthusiasts can keep up to date simply by following the weekly transits that are described and that affect us in some way, whether it's Mercury retrograde or the Full Moon.
Others, on the other hand, might go to the dedicated website, which is updated every Sunday, and read the horoscope for their zodiac sign, rising sign, or, why not, the horoscope of friends and loved ones.
So, don’t waste time and click here to see your weekly horoscope!