Technology

A New Way to Store Thermal Energy

Thermal Storage – Phase Change Material – PCM

Researchers at MIT have developed a new chemical composite that offers an alternative way of storing heat from the sun, or from any other source, during the day in a kind of thermal battery. The stored heat could then be released whenever it is needed, for instance for cooking or heating after dark.  A common approach to thermal storage is to use a phase change material (PCM), where input heat melts the material and its phase change from solid to liquid stores the energy.

When the PCM is cooled back below its melting point, it turns back into a solid, at which point the stored energy is released as heat.  Examples of these materials include waxes and fatty acids, used for low-temperature applications, and molten salts, used at high temperatures.

Current PCMs, however, require a great deal of insulation, and they pass back through the phase-change temperature uncontrollably, losing their stored heat relatively quickly.  The new system instead uses molecular switches that change shape in response to light.  When integrated into the PCM, the phase-change temperature of the hybrid material can be adjusted with light, allowing the thermal energy of the phase change to be retained well below the melting point of the original material.

Fatty Acids / Organic Compounds for Thermal Energy Storage

The new discovery, by MIT postdocs Grace Han and Huashan Li together with Professor Jeffrey Grossman, has been reported in the journal Nature Communications.  Grossman explained that the trouble with thermal energy is that it is hard to hold on to.  His team therefore created what are essentially add-ons for traditional phase change materials: small molecules that undergo a structural change when light shines on them.

He added that the trick was to find a way of integrating these molecules with conventional PCM materials so that the stored energy could be released as heat on demand. He also noted that there are many applications in which it would be useful to store thermal energy in a way that lets you trigger its release whenever needed.

The researchers achieved this by combining the fatty acids with an organic compound that responds to a pulse of light.  With this approach, the light-sensitive component alters the thermal properties of the other component, which stores and releases the energy. The hybrid material melts when it is heated, and after being exposed to ultraviolet light it stays melted even when cooled back down.

Solidification & Super-Cooling, a Method for Thermal Storage

When triggered by another pulse of light, the material re-solidifies and gives back the thermal energy of the phase change.  Grossman, who is the Morton and Claire Goulder and Family Professor in Environmental Systems and a professor of materials science and engineering, said that integrating a light-activated molecule into the traditional picture of latent heat adds a new kind of control knob for properties such as melting, solidification and super-cooling. According to Han, the system could make use of any source of heat, not just solar.

Waste heat is widely available, from industrial processes to solar heat to the heat given off by vehicles, and it is usually thrown away.  Capturing some of that waste would offer a way of recycling the heat for useful applications. Han explained that, in effect, they are imposing a new energy barrier so that the stored heat is not released immediately.

In its chemically stored form, she added, the energy can remain for a long time until the optical trigger is activated.  In the initial small-scale lab version, the stored heat remained stable for around 10 hours, whereas a device of the same size storing heat directly would dissipate it within a few minutes.

Thermal Energy Density – Significant

There is no fundamental reason why this could not be tuned to go higher. Grossman said that in this preliminary proof-of-concept demonstration, the temperature change, or super-cooling, achieved for the thermal storage material reached up to 10 degrees C (18 F), and the team expects to go higher still.  Han notes that in this version the energy density is already quite significant, even though a conventional phase-change material is used.

The material can store around 200 joules per gram, which she says is very good for any organic phase-change material.  People have already shown interest in using it for cooking in rural India, she says, and such materials could also be used for drying agricultural crops or for space heating.  Grossman commented that the aim of this work was to demonstrate a proof of concept, but that the team believes there is plenty of potential for using light-activated materials to hijack the thermal storage properties of phase change materials.
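
To put that figure in perspective, here is a rough back-of-the-envelope illustration (only the 200 joules per gram comes from the study; the rest is simple arithmetic): one kilogram of the composite would hold

$$1000\ \mathrm{g} \times 200\ \mathrm{J/g} = 200\ \mathrm{kJ} \approx 0.056\ \mathrm{kWh},$$

which is roughly the energy needed to bring half a litre of room-temperature water to the boil ($0.5\ \mathrm{kg} \times 4.18\ \mathrm{kJ/(kg\,K)} \times 80\ \mathrm{K} \approx 167\ \mathrm{kJ}$).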

Storing Thermal Energy – Scope of Prospective Applications

Junqiao Wu, professor of materials science and engineering at the University of California, Berkeley, who was not involved in the study, called this highly creative research, in which the key is that the scientists combine a thermally driven phase-change material with a photo-switching molecule to build an energy barrier that stabilizes the stored thermal energy.  He added that he considers the work significant because it provides a practical way of storing thermal energy, something that has been challenging in the past.

The MIT team's design for storing thermal energy has a wide range of prospective applications, and the technology could ultimately offer a solution for people in developing countries who lack a traditional power grid.  Besides absorbing heat from the sun, the system could harness waste heat from industrial processes, absorbing the energy released as heat by heavy machinery and releasing it later, for instance to heat a living space. The work was supported by the Tata Center for Technology and Design within the MIT Energy Initiative.

Apps

Improving App Security On Google Play

Google will require Play Store apps to target the latest version of Android to improve app security

Google wants to end Android's fragmentation, including within its applications. The company has established a new set of rules that will force developers to update and optimize their applications for the latest version of the operating system in the name of security.

“Improving the App security and performance of Google Play”
Those are Google's two objectives in implementing these new measures, according to the headline of its official note. Specifically, three policy changes aimed at strengthening these two areas will be applied. The first is that, starting in the second half of 2018, the Play Store will require new apps and their updates to target a recent Android API level.

For new apps this applies from August 2018, and for updates to existing apps from November 2018. In practice, apps will have to be optimized for the latest version of Android, which is currently Oreo. Apps that avoided updating their target API so as not to adjust to newer performance and security requirements will now be forced to do so. The requirement starts with targeting Android 8.1 Oreo, and each passing year will push it up to the next version.
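
The announcement does not tie the policy to any particular tooling, but as a rough, hypothetical illustration of what a target-API requirement means for an existing build, the sketch below reads the declared targetSdkVersion out of an APK using the aapt utility from the Android build-tools (assumed to be installed and on the PATH); the threshold value is an example only, not Google's official cut-off:

```python
import re
import subprocess
import sys

# Example threshold only; the real requirement is whatever Google Play currently mandates.
REQUIRED_TARGET_SDK = 26

def target_sdk_of(apk_path: str) -> int:
    """Return the targetSdkVersion declared in an APK, via 'aapt dump badging'."""
    output = subprocess.run(
        ["aapt", "dump", "badging", apk_path],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"targetSdkVersion:'(\d+)'", output)
    if not match:
        raise ValueError(f"no targetSdkVersion found in {apk_path}")
    return int(match.group(1))

if __name__ == "__main__":
    apk = sys.argv[1]
    sdk = target_sdk_of(apk)
    verdict = "meets" if sdk >= REQUIRED_TARGET_SDK else "falls below"
    print(f"{apk}: targetSdkVersion={sdk} ({verdict} the example threshold of {REQUIRED_TARGET_SDK})")
```
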
The second measure is that, as of August 2019, Play will require apps that ship native libraries to include 64-bit versions alongside the 32-bit ones, enabling better performance and optimization on 64-bit devices. The third and final measure is that, from early 2018, the Play Store will begin adding security metadata to each APK to verify its authenticity. This measure seeks to prevent fake apps from passing themselves off as others and to avoid security failures. It will be handled on Google's side, and developers do not need to do anything.

Breaking Android’s fragmentation. Android's fragmentation does not exist only at the level of system versions. Measures such as Project Treble help to end it, but application developers also need to act. Google's Play Store measures move in that direction: thanks to them, apps such as Facebook or Snapchat should start adopting the energy-saving and optimization features they had avoided implementing by targeting APIs from older versions of Android.

Especially noteworthy is the measure aimed at achieving greater app security in the Play Store. Beyond the implementation of Play Protect, it is another barrier against counterfeit apps and security holes in the official store, something that will benefit the average user who does not pay much attention to what they install.

 

Technology

Scientists Capture Colliding Organic Nanoparticles on Video for First Time

Extraordinary View of nanoparticles in Motion

A Northwestern University research team in Evanston is among the first to capture organic nanoparticles colliding and fusing together on video.  This extraordinary view of chemistry in motion will help Northwestern nanoscientists develop new drug delivery techniques, and it shows researchers around the world how an emerging imaging technique opens a new window on a very tiny world.

It is an unusual example of nanoparticles in motion, with dynamics reminiscent of two bubbles coming together and merging into one.  At first they join with a membrane between them, and then they fuse into one larger bubble.

Professor Nathan C. Gianneschi, who led the interdisciplinary research and works at the intersection of nanotechnology and biomedicine, said that although he had an image in mind, he was amazed when he first saw these fusing nanoparticles in black and white.

According to him, it was like a window opening onto a world you have always known about but can now finally see, much like viewing Jupiter’s moons through a telescope for the first time; nothing compares to actually seeing it.

Technique – Observing the Transformation / Characterizing the Dynamics of Nanoparticles

Gianneschi is the Jacob and Rosaline Cohn Professor in the department of chemistry in the Weinberg College of Arts and Sciences, as well as in the departments of materials science and engineering and biomedical engineering in the McCormick School of Engineering.

The research, which includes videos of several nanoparticle fusion events, was published in the Journal of the American Chemical Society on November 17. The research team used liquid-cell transmission electron microscopy to directly image how polymer-based nanoparticles, or micelles, which Gianneschi has been developing in his lab for the treatment of cancer and heart attack, change over time. The technique allowed the scientists to directly observe the transformation and to characterize the dynamics of the particles. Lucas R. Parent, first author of the paper and a National Institutes of Health Postdoctoral Fellow in Gianneschi's research group, commented that they can visualize at the molecular level how the polymeric matter rearranges when the particles fuse into one object.

First Study – Dynamic Phenomena in Organic Nanoparticle Systems

This is the first of several studies in which researchers will use this system to view all kinds of dynamic phenomena in organic material systems on the nanoscale.

In the Northwestern research, organic particles in water bounce off one another, and some collide and merge, undergoing a physical transformation.  The researchers captured the action by shining an electron beam through the sample; the particles are tiny, with the largest only around 200 nanometers in diameter.

Gianneschi, a member of Northwestern’s International Institute for Nanotechnology, commented that they observed classical fusion behaviour on the nanoscale, and that capturing the fundamental growth and evolution processes of these particles in motion greatly supports their work on synthetic materials and their interactions with biological systems.

Technology

Robotics Researchers Track Autonomous Underground Mining Vehicles

New Tech to Track Underground Autonomous Mining Vehicles

Working underground can sometimes prove fatal, given the unstable, perilous terrain and the lack of oxygen in mines. Autonomous vehicles were therefore developed to scope out the terrain.  But the terrain again proves to be a problem, because tracking an underground autonomous mining vehicle is difficult.

Problems with tracking underground autonomous mining vehicles:

Navigating underground is a difficult task: there is no light, a lot of dust, and camera blur. GPS navigation is also ruled out because there is no signal or network underground, so it is a problem not only for the vehicle itself to navigate but also for those on the surface trying to keep track of the mining vehicles below.

Autonomous mining vehicles often have to traverse rocky, dust-filled ground, navigate narrow mazes and cope with a generally harsh environment. With the poor lighting in mining tunnels and the ever-present dust, cameras often become useless in such conditions.

Problems with current methods of tracking autonomous mining vehicles:

Current methods of tracking underground autonomous mining vehicles are expensive, involving major modifications to infrastructure and costly sensing equipment.

For any mining company, keeping track of its expensive autonomous underground vehicles is very important; but add costly tracking methods and the need for major modifications to existing infrastructure, and the whole operation becomes that much less profitable.

As mentioned earlier, the Global Positioning System (GPS) cannot be used because of the rock overhead, and wireless sensor networks also prove useless in such terrain because of interference from the surrounding rock. All in all, tracking underground autonomous mining vehicles is neither an easy nor a cheap task. That is, until now.

New tech to track underground autonomous mining vehicles:

Since sensors are expensive and major adjustments to the infrastructure are out of the question, researchers have found a cheaper alternative for tracking vehicles through such terrain using cameras.

Mathematics and biologically inspired algorithms were the winning combination for tracking underground autonomous mining vehicles. A camera-mounted vehicle was able to track the autonomous vehicle through underground tunnels to within a few metres.

Earlier experiments with the camera-mounted vehicle were unsuccessful. Researchers had to add artificial intelligence to the camera system so that it could recognise the terrain it was navigating and disregard images that were blurry, dust-filled or affected by the poor lighting underground; a rough sketch of this kind of frame filtering appears below.
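
The researchers' own filtering code is not described here, so the sketch below is only a hedged illustration of the general idea using OpenCV: frames that are too dark or too blurred (low variance of the Laplacian) are rejected before they reach the tracking algorithm. The thresholds and function names are assumptions for illustration, not values from the study:

```python
import cv2
import numpy as np

# Illustrative thresholds only; real values would be tuned on the actual tunnel footage.
BLUR_THRESHOLD = 100.0    # variance of Laplacian below this -> frame considered blurry
BRIGHTNESS_FLOOR = 30.0   # mean grey level below this -> frame considered too dark

def frame_is_usable(frame: np.ndarray) -> bool:
    """Return True if a BGR frame is sharp and bright enough to pass on to the tracker."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(grey, cv2.CV_64F).var()  # low variance = few sharp edges
    brightness = float(grey.mean())
    return sharpness >= BLUR_THRESHOLD and brightness >= BRIGHTNESS_FLOOR

def usable_frames(video_path: str):
    """Yield only the frames of a video that pass the blur and brightness checks."""
    capture = cv2.VideoCapture(video_path)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if frame_is_usable(frame):
            yield frame
    capture.release()
```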

The researchers have now tested the tracking system on two occasions in Australian mines and have a third expedition lined up. So far the intelligent camera has proved highly useful, and the researchers hope it will keep the same track record in the third mining job as well.

If an underground autonomous vehicle can be tracked to within a few metres, then keeping track of the mining asset later on is no problem.

Technology

Data Analytics is Great, But There’s Still Room for Improvement

From its inception, the concept of data analytics, or business intelligence, has advanced at a rapid rate. But as the capabilities have increased, accessibility has not always kept up. Data analytics is more in demand than ever before, but its impact has yet to reach its full potential.

To understand why and when this game-changing capability will reach full maturity, explore this quick overview of the past and future:

Stage One – 1990s

The first forms of data analytics relied on huge stacks of data. This data existed in databases built for transactions rather than analytics, which made them expensive to establish and difficult to manage. Only the largest and most forward-thinking companies were making the effort.

The concept gained little traction at the time mostly because it took a huge effort to deliver what we would now consider fairly shallow insights. Users were constrained to a very limited number of queries – usually those predetermined by an analyst – and data was segmented and siloed rigidly. The potential was there, but the technology was not yet advanced enough to make analytics broadly or deeply valuable.

Stage Two – 2000s

The second stage is characterized by the rise of self-service data analytics. Instead of relying on database managers, users were supposed to be empowered to access the right data on their own terms. The problem was that governance broke down and data quickly became a disorganized disaster.

Many of the problems found in stage one were only exacerbated in stage two. It ultimately took more time and effort to manage data, and the resulting insights were even less reliable. The takeaway for all involved was that data had to have structure and standards in place before it could become intuitive and accessible.

Stage Three – Present

The current trend in data analytics is platforming. With the rise of cheap, cloud-based databases, it became possible to integrate real-time data from multiple disparate locations in one place. The time spent looking for data was minimized, and users were empowered to run as many queries as they wanted.

With something like an embeddable analytics dashboard, the dream of true self-serve analytics has been realized. The user is in complete control and can dive deeply into data without needing specialized tools or training. Not only that, she can also embed the dashboard in a website, a customer portal or wherever she needs it most.

Analytics is more accessible to all, from the end user to the IT managers maintaining the back end.

Stage Four – Future

Everyone is excited about what data analytics is already delivering. But higher hopes hang on the future when data collection, storage, and analysis all kick into a higher gear. With the rise of connected sensors and other digital touch-points it’s possible to collect ever increasing amounts of data. As storage and processing capabilities advance, it will become possible to conduct analytics on a much grander scale than we do already.

That increase in power will be matched by improvements in the user experience. As analytics becomes a standard tool for professionals in all industries and at all levels the means of accessing data-driven insights will improve dramatically. Expect a near future where every question is immediately followed by an accurate answer.

As exciting as the future of data analytics is, there is the real risk of being left behind. The companies that are implementing and experimenting with this technology now are the ones in the best position for the future. Those that are not will find it harder to implement analytics into workflows and company culture no matter how intuitive future capabilities are. Success down the road depends on preparation and action now.

 
