Week 20.09–27.09
Asgardia always likes to keep an eye on new technology as it could help further our goal of creating habitable platforms in space. So today’s theme is Technology Thursday! Let’s take a look at some of the interesting technological developments that could improve our lives:
When it comes to machine learning, a tremendous amount of data is necessary to train a new algorithm. Not only that, but the information needs to be flawlessly labelled for the machine to learn how to sort similar inputs by itself. Therefore, if you lack access to a giant dataset, the resulting algorithm will be far from accurate and might not be useful at all.
To tackle this, researchers from Nvidia, a tech company that produces computer chips and video cards, joined forces with several hospitals to design an AI that can generate realistic brain scans of nonexistent patients, complete with a variety of scary tumours.
The goal is to let future AIs be trained even when researchers don’t have enough real data to instruct their algorithms. The researchers outlined their work in a paper uploaded to arXiv.
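As a loose illustration of the general idea, a generative model fitted to a small real dataset can emit unlimited synthetic samples with similar statistics, padding out the training set. The sketch below uses a trivial Gaussian fit in place of Nvidia's actual GAN pipeline; all names and numbers are invented for the example.

```python
import random
import statistics

random.seed(0)

# Pretend we only have 20 real, labelled measurements -- far too few to train on.
real_data = [random.gauss(5.0, 2.0) for _ in range(20)]

# A generative model fitted to the real data can emit unlimited synthetic
# samples with similar statistics -- the role a GAN plays for brain scans,
# done here with a trivial Gaussian fit for illustration.
mu = statistics.mean(real_data)
sigma = statistics.stdev(real_data)
synthetic_data = [random.gauss(mu, sigma) for _ in range(1000)]

# The augmented training set mixes real and synthetic samples.
training_set = real_data + synthetic_data
print(len(training_set))  # 1020
```

A real pipeline would of course validate that the synthetic samples are realistic enough not to mislead the downstream model, which is exactly why Nvidia partnered with hospitals for ground-truth scans.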
Furthermore, researchers from the University of British Columbia (UBC) have developed a new device that’s the size of a band-aid, costs under $100, and can see inside the human body.
So what is this mysterious device? It’s essentially an ultrasound.
Their work has recently been published in the journal Microsystems & Nanoengineering, and this new research could help bring ultrasound technology to remote locations.
Moreover, the U.S. Department of Defense (DoD) just signed a deal to the tune of approximately $10 million to start field testing and rolling out the Molar Mic.
The Molar Mic, which comes from Sonitus Technologies, a wireless communication tech development company, is much like a typical Bluetooth headset but has one big difference: the mouthpiece, outfitted with a waterproof microphone, is custom-built to fit the user’s teeth, hence the name Molar.
The device sends the audio to a radio transmitter, which is on a loop around the user’s neck. This transmitter sends the audio to a second radio unit worn somewhere else on the body, which then carries it off to the intended recipient. Once the device has been fitted to the wearer’s teeth, they can speak normally.
However, receiving the audio is a bit more difficult. The mouthpiece translates the incoming audio into vibrations on the teeth. These vibrations travel via the bones in the jaw and skull to the inner ear, which automatically converts them back into sounds. The outcome is audio that sounds like it’s coming from within the person’s own head.
What’s more, augmented reality (AR) could be the way of the future. However, to date, there are still some obstacles to overcome. Now, two innovations — 5G networks and edge computing — could be the answer to those problems, at least according to what AT&T said during its Spark technology conference.
If you haven’t heard about 5G, it’s the next generation of high-speed cellular networks, which many tech-savvy people have been excited about for years.
However, edge computing is less well-known but stands to be revolutionary.
So what is edge computing?
Essentially, edge computing uses networks of smaller data centers, known as cloudlets, to process some information physically closer to its source. A cloudlet then sends the rest of the data on to a bigger data center if necessary. The result is lower latency and an enhanced AR experience.
Here’s another way edge computing technology is being used. In a blog post published by Microsoft, the company describes a pilot project starting at two Shell gas stations, one in Thailand and one in Singapore. They have built a new AI system that scans for signs of dangerous behaviour, like smoking at the gas pump, and then warns staff, giving them a chance to stop people from smoking before any potential explosions occur.
So how does it work so fast? The key to this quick turnaround is edge computing, which processes data near its site of origin instead of in the cloud. For example, footage analysis happens onsite at the gas stations using Microsoft’s Azure IoT Edge platform; only the frames that raise red flags move on to the cloud for advanced processing.
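The pattern described above — analyse data where it is produced and forward only the flagged items — can be sketched generically. The function and field names below are invented for illustration and are not part of any real Azure IoT Edge API.

```python
def looks_dangerous(frame: dict) -> bool:
    # Stand-in for the on-site model (e.g. a smoke/cigarette detector).
    # "smoke_score" is a hypothetical confidence value, not a real API field.
    return frame.get("smoke_score", 0.0) > 0.8


def process_on_edge(frames: list[dict]) -> list[dict]:
    """Return only the frames that need advanced processing in the cloud.

    Everything else is handled (and discarded) locally, which is what
    keeps latency and bandwidth low in an edge-computing setup.
    """
    return [f for f in frames if looks_dangerous(f)]


frames = [
    {"id": 1, "smoke_score": 0.1},
    {"id": 2, "smoke_score": 0.95},  # e.g. someone lighting a cigarette
    {"id": 3, "smoke_score": 0.4},
]
to_cloud = process_on_edge(frames)
print([f["id"] for f in to_cloud])  # [2]
```

In the real deployment, the "dangerous" check is a trained vision model running on local hardware, and the forwarded frames trigger both cloud analysis and an alert to station staff.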
Lastly, modern medicine and technology are causing the average human lifespan to increase, and people are now living longer, healthier lives. Of course, this is good news. However, the downside is that the elderly population is growing significantly and requires an entire industry of costly caregivers. Plus, it’s highly likely we’ll need even more caregivers in the future.
Therefore, care robots could help shoulder much of that work. Researchers believe robots could help seniors with everything from staying active to remembering to take their medications.
Currently, researchers in Europe and Japan are working to ensure that those robots don’t offend the people they’re supposed to take care of. In fact, they’re making what they say are the world’s first robots with a sense of cultural norms.
This technology could prove useful to Asgardia, as a way to care for the elderly population of our space nation.
If you’re interested in technology, space, and science then join Asgardia today and connect with forward-looking people.