Search results for: analog circuits
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 472

82 A Low Cost Education Proposal Using Strain Gauges and Arduino to Develop a Balance

Authors: Thais Cavalheri Santos, Pedro Jose Gabriel Ferreira, Alexandre Daliberto Frugoli, Lucio Leonardo, Pedro Americo Frugoli

Abstract:

This paper presents a low-cost education proposal for use in engineering courses. Providing quality yet affordable engineering education in the universities of a developing country that needs an ever-larger number of engineers is a difficult problem to solve. In Brazil, the political and economic scenario requires academic managers able to reduce costs without compromising the quality of education. Within this context, a method for teaching physics principles through the construction of an electronic balance is proposed. First, a method to develop and construct a load cell is presented, through which students can understand the physical principle of strain gauges and the bridge circuit. The load cell structure was made of aluminum 6351-T6, with dimensions of 80 mm x 13 mm x 13 mm, and for its instrumentation a complete Wheatstone bridge was assembled with 350-ohm strain gauges. Additionally, the process involves the use of a software tool to document the prototypes (circuit design), signal conditioning, a microcontroller, C-language programming, and the development of the prototype. The project also uses an open-source I/O board (the Arduino microcontroller). The circuit is designed in the Fritzing software, and the controller is programmed in the open-source Arduino IDE. A load cell was chosen because strain gauges are accurate and have many industrial applications. A prototype was developed for this study, and it confirmed the affordability of this educational idea. Furthermore, the goal of this proposal is to motivate students to understand the many high-technology applications of load cells and microcontrollers.
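
The principle the students measure can be sketched in a few lines. This is a generic full-bridge relation, not the paper's own code; the gauge factor and excitation voltage below are illustrative assumptions (only the 350-ohm gauges are stated in the abstract).

```python
# Sketch of the full Wheatstone bridge relation behind a strain-gauge load cell.
GAUGE_FACTOR = 2.0   # typical metallic strain gauge (assumption, not from the paper)
V_EXCITATION = 5.0   # bridge excitation, e.g. the Arduino 5 V rail (assumption)

def full_bridge_output(strain):
    """Output of a full bridge with four active gauges: V_out = V_exc * GF * strain."""
    return V_EXCITATION * GAUGE_FACTOR * strain

def strain_from_output(v_out):
    """Invert the relation to recover strain from the measured bridge voltage."""
    return v_out / (V_EXCITATION * GAUGE_FACTOR)
```

Under these assumptions a strain of 10⁻³ produces only about 10 mV, which is why the abstract's signal-conditioning stage is needed before the Arduino's analog input.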

Keywords: Arduino, load cell, low-cost education, strain gauge

Procedia PDF Downloads 278
81 Record Peak Current Density in AlN/GaN Double-Barrier Resonant Tunneling Diodes on Free-Standing GaN Substrates by Modulating Barrier Thickness

Authors: Fang Liu, Jia Jia Yao, Guan Lin Wu, Ren Jie Liu, Zhuang Guo

Abstract:

Leveraging plasma-assisted molecular beam epitaxy (PA-MBE) on c-plane free-standing GaN substrates, this work demonstrates high-performance AlN/GaN double-barrier resonant tunneling diodes (RTDs) featuring stable and repeatable negative differential resistance (NDR) characteristics at room temperature. By scaling down the AlN barrier thickness and the lateral mesa size of the collector, a record peak current density of 1551 kA/cm² is achieved, accompanied by a peak-to-valley current ratio (PVCR) of 1.24. This is attributed to the reduced resonant tunneling time through the thinner AlN barriers and to the suppression of the incoherent valley current, since a smaller collector mesa contains fewer dislocations. Statistical analysis of the NDR performance of RTD devices with different AlN barrier thicknesses reveals that, as the AlN barrier thickness decreases from 1.5 nm to 1.25 nm, the average peak current density increases from 145.7 kA/cm² to 1215.1 kA/cm², while the average PVCR decreases from 1.45 to 1.1 and the peak voltage drops from 6.89 V to 5.49 V. The peak current density obtained in this work is the highest reported for nitride-based RTDs to date, while simultaneously maintaining a high PVCR. This illustrates that ultra-scaling the RTD, both in the vertical quantum-well structure and in the lateral collector size, is a valuable approach for developing nitride-based RTDs with excellent NDR characteristics, revealing their great potential for applications in high-frequency oscillation sources and high-speed switching circuits.

Keywords: GaN resonant tunneling diode, peak current density, peak-to-valley current ratio, negative differential resistance

Procedia PDF Downloads 26
80 Low-Complex, High-Fidelity Two-Grades Cyclo-Olefin Copolymer (COC) Based Thermal Bonding Technique for Sealing a Thermoplastic Microfluidic Biosensor

Authors: Jorge Prada, Christina Cordes, Carsten Harms, Walter Lang

Abstract:

The development of microfluidic-based biosensors over the last years has shown increasing use of thermoplastic polymers as the constitutive material. Their low-cost production, high replication fidelity, biocompatibility, and optical-mechanical properties are sought after for the implementation of disposable yet functional lab-on-chip solutions. Among the thermoplastic materials in use, Cyclo-Olefin Copolymer (COC) stands out due to its optical transparency, which makes it a frequent choice of manufacturing material for fluorescence-based biosensors. Several processing techniques for completing a closed COC microfluidic biosensor have been discussed in the literature. The reported techniques differ, however, in their implementation and therefore add differing degrees of complexity to a mass-production process. This work introduces and reports results on a purely thermal bonding process between COC substrates, produced by hot embossing, and COC foils containing screen-printed circuits. The proposed procedure takes advantage of the transition temperature difference between two COC grades of foil to accomplish the sealing of the microfluidic channels. Patterned heat injection into the COC foil through the COC substrate is applied, resulting in consistent channel geometry uniformity. Measurements of bond strength and bursting pressure are shown, suggesting that this purely thermal bonding process yields a technique that can be easily adapted into the thermoplastic microfluidic chip production workflow while enabling a low-cost as well as high-quality COC biosensor manufacturing process.

Keywords: biosensor, cyclo-olefin copolymer, hot embossing, thermal bonding, thermoplastics

Procedia PDF Downloads 219
79 Holographic Art as an Approach to Enhance Visual Communication in Egyptian Community: Experimental Study

Authors: Diaa Ahmed Mohamed Ahmedien

Abstract:

Nowadays, it cannot be denied that the most important trends in interactive art have appeared as a result of significant developments in the modern sciences, and holographic art is no exception: it is considered one of the major contemporary interactive trends in the visual arts. The holographic technique emerged from modern physics in the late 1940s, when Dennis Gabor invented it to improve the quality of electron microscope images; it later reached the art world through Margaret Benyon's exhibitions, and over more than 70 years it has passed through many refinements of its technical and visual quality. As a modest extension of these efforts, this research invited a sample of ordinary people in the Egyptian community to join a holographic recording program and record objects or antiques they value, in order to examine their ability to interact with modern techniques in the visual communication arts. The research therefore tried to answer three main questions: 'Can analog holographic techniques unlock new theoretical and practical knowledge in interactive arts for the public in the Egyptian community?'; 'To what extent can holographic art become familiar to the public and enable them to produce interactive artistic samples?'; and 'Can a holographic interactive program for ordinary people enhance their understanding of visual communication and make them aware of interactive art trends?'
The first part of this research relied on experimental methods: it was conducted in the laser lab at Cairo University, using a 532 nm Nd:YAG laser and a holographic optical layout, with selected Egyptian participants who were asked to record an object they value after learning the recording methods. The second part consisted of discussion panels held to evaluate the results and how participants felt about their holographic artistic products, through surveys, questionnaires, notes, and critiques of the holographic artworks. The practical experiments and final discussions led us to conclude that this experimental research enabled most participants to pass through a paradigm shift in their visual and conceptual experience toward greater interaction with contemporary visual art trends. This underlines the role of a mature relationship between art, science, and technology in spreading interactive arts through our community, particularly among those who have never enrolled in a practical arts program before.

Keywords: Egyptian community, holographic art, laser art, visual art

Procedia PDF Downloads 457
78 Digitalization, Economic Growth and Financial Sector Development in Africa

Authors: Abdul Ganiyu Iddrisu

Abstract:

Digitization is the process of transforming analog material into digital form, especially for storage and use in a computer. The significant development of information and communication technology (ICT) over the past years has encouraged many researchers to investigate its contribution to promoting economic growth and reducing poverty. Yet compelling empirical evidence on the effects of digitization on economic growth remains weak, particularly in Africa, because the extant studies that explicitly evaluate the digitization-economic growth nexus are mostly reports and desk reviews. This points to an empirical knowledge gap in the literature. Hypothetically, digitization influences financial sector development, which in turn influences economic growth. Digitization has changed the financial sector and its operating environment: obstacles to access to financing, such as physical distance, minimum balance requirements, and low income flows, can be circumvented. Savings have increased, micro-savers have opened bank accounts, and banks are now able to price short-term loans. This has the potential to develop the financial sector; however, empirical evidence on the digitization-financial development nexus is scarce. On the other hand, a number of studies maintain that financial sector development greatly influences the growth of economies. We therefore argue that financial sector development is one of the transmission mechanisms through which digitization affects economic growth. Employing macro country-level data from African countries and using fixed effects, random effects, and Hausman-Taylor estimation approaches, this paper contributes to the literature by analysing economic growth in Africa, focusing on the role of digitization and financial sector development. First, we assess how digitization influences financial sector development in Africa.
From an economic policy perspective, it is important to identify the digitization determinants of financial sector development so that action can be taken to reduce the economic shocks associated with financial sector distortions. This nexus is rarely examined empirically in the literature. Secondly, we examine the effect on economic growth of domestic credit to the private sector and of stock market capitalization as a percentage of GDP, both used to proxy financial sector development. Digitization is represented by the volume of digital/ICT equipment imported, and GDP growth is used to proxy economic growth. Finally, we examine the effect of digitization on economic growth in the light of financial sector development. The following key results were found: first, digitalization propels financial sector development in Africa. Second, financial sector development enhances economic growth. Finally, contrary to our expectation, the results also indicate that digitalization conditioned on financial sector development tends to reduce economic growth in Africa. However, the net effects suggest that digitalization, overall, improves economic growth in Africa. We therefore conclude that digitalization in Africa not only develops the financial sector but also unconditionally contributes to the growth of the continent's economies.
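
The fixed-effects approach named above can be illustrated with a textbook one-way "within" estimator: demean the outcome and regressors within each country, then run pooled OLS. This is a generic sketch under illustrative variable names, not the authors' specification.

```python
import numpy as np

def within_estimator(y, X, groups):
    """One-way fixed-effects (within) estimator:
    demean y and X within each group (country), then run pooled OLS."""
    y = np.asarray(y, float).copy()
    X = np.asarray(X, float).copy()
    groups = np.asarray(groups)
    for g in np.unique(groups):
        m = groups == g
        y[m] -= y[m].mean()            # sweep out the country effect
        X[m] -= X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

Because the country effects are swept out by demeaning, a constant group-level shift (a country's baseline growth level) does not bias the estimated slope.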

Keywords: digitalization, economic growth, financial sector development, Africa

Procedia PDF Downloads 77
77 Tree Dress and the Internet of Living Things

Authors: Vibeke Sorensen, Nagaraju Thummanapalli, J. Stephen Lansing

Abstract:

Inspired by the indigenous people of Borneo, Indonesia and their traditional bark cloth, artist and professor Vibeke Sorensen executed a “digital unwrapping” of several trees in Southeast Asia using a digital panorama camera and digitally “stitched” the images together for printing onto sustainable silk and fashioning into the “Tree Dress”. This dress is a symbolic “un-wrapping” and “re-wrapping” of the tree’s bark onto a person as a second skin. The “digital bark” is directly responsive to the real tree through embedded and networked electronics that connect in real time to sensors at the physical site of the living tree. LEDs and circuits inserted into the dress display the continuous measurement of the O₂/CO₂, temperature, humidity, and light conditions at the tree. It is an “Internet of Living Things” (IOLT) textile that can be worn to track and interact with the tree. The computer system connecting the dress and the tree converts the gas-emission data at the site of the real tree into sound and music as sonification. This communicates not only the scientific data but also translates it into a poetic representation. The wearer of the garment can symbolically identify with the tree, or “become one” with it by adorning its “skin.” In this way, the wearer also becomes a human agent for the tree, bringing its actual condition to the direct perception of the wearer and of others who engage with it. This project is an attempt to bring greater awareness to issues of deforestation by providing direct access to living things separated by physical distance and, hopefully, to increase empathy for them by providing a way to sense individual trees and their daily existential condition through remote monitoring of data. Further extensions to this project and related issues of sustainability include the use of recycled and alternative plant materials such as bamboo and air plants, among others.
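
The sonification step can be illustrated with a minimal mapping from a gas reading to pitch. The ranges, the two-octave span, and the exponential mapping below are illustrative assumptions, not the artists' actual mapping.

```python
def sonify(value, lo, hi, f_lo=220.0, f_hi=880.0):
    """Map a sensor reading in [lo, hi] to a frequency between f_lo and f_hi Hz.
    An exponential (geometric) mapping gives perceptually even pitch steps."""
    t = (value - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)          # clamp out-of-range readings
    return f_lo * (f_hi / f_lo) ** t
```

With these assumed bounds, a reading halfway through its range lands one octave above the bottom of the span (440 Hz), and out-of-range spikes are clamped rather than producing inaudible pitches.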

Keywords: IOLT, sonification, sustainability, tree, wearable technology

Procedia PDF Downloads 110
76 Digitization and Economic Growth in Africa: The Role of Financial Sector Development

Authors: Abdul Ganiyu Iddrisu, Bei Chen

Abstract:

Digitization is the process of transforming analog material into digital form, especially for storage and use in a computer. The significant development of information and communication technology (ICT) over the past years has encouraged many researchers to investigate its contribution to promoting economic growth and reducing poverty. Yet compelling empirical evidence on the effects of digitization on economic growth remains weak, particularly in Africa, because the extant studies that explicitly evaluate the digitization-economic growth nexus are mostly reports and desk reviews. This points to an empirical knowledge gap in the literature. Hypothetically, digitization influences financial sector development, which in turn influences economic growth. Digitization has changed the financial sector and its operating environment: obstacles to access to financing, such as physical distance, minimum balance requirements, and low income flows, can be circumvented. Savings have increased, micro-savers have opened bank accounts, and banks are now able to price short-term loans. This has the potential to develop the financial sector. However, empirical evidence on the digitization-financial development nexus is scarce. On the other hand, a number of studies maintain that financial sector development greatly influences the growth of economies. We therefore argue that financial sector development is one of the transmission mechanisms through which digitization affects economic growth. Employing macro country-level data from African countries and using fixed effects, random effects, and Hausman-Taylor estimation approaches, this paper contributes to the literature by analysing economic growth in Africa, focusing on the role of digitization and financial sector development. First, we assess how digitization influences financial sector development in Africa.
From an economic policy perspective, it is important to identify the digitization determinants of financial sector development so that action can be taken to reduce the economic shocks associated with financial sector distortions. This nexus is rarely examined empirically in the literature. Secondly, we examine the effect on economic growth of domestic credit to the private sector and of stock market capitalization as a percentage of GDP, both used to proxy financial sector development. Digitization is represented by the volume of digital/ICT equipment imported, and GDP growth is used to proxy economic growth. Finally, we examine the effect of digitization on economic growth in the light of financial sector development. The following key results were found: first, digitalization propels financial sector development in Africa. Second, financial sector development enhances economic growth. Finally, contrary to our expectation, the results also indicate that digitalization conditioned on financial sector development tends to reduce economic growth in Africa. However, the net effects suggest that digitalization, overall, improves economic growth in Africa. We therefore conclude that digitalization in Africa not only develops the financial sector but also unconditionally contributes to the growth of the continent's economies.

Keywords: digitalization, financial sector development, Africa, economic growth

Procedia PDF Downloads 109
75 Modeling and Design of E-mode GaN High Electron Mobility Transistors

Authors: Samson Mil'shtein, Dhawal Asthana, Benjamin Sullivan

Abstract:

The wide energy gap of GaN is the major parameter justifying the design and fabrication of high-power electronic components from this material. However, the existence of a piezoelectric sheet charge at the AlGaN/GaN interface complicates the control of carrier injection into the intrinsic channel of GaN HEMTs (High Electron Mobility Transistors). As a result, most transistors created as R&D prototypes, and all of the designs used for mass production, are D-mode devices, which introduces challenges in the design of integrated circuits. This research presents the design and modeling of an E-mode GaN HEMT with a very low turn-on voltage. The proposed device includes two critical elements that allow the transistor to achieve zero conductance across the channel when Vg = 0 V. The first is an extremely thin, 2.5 nm intrinsic Ga₀.₇₄Al₀.₂₆N spacer layer; the added spacer does not create piezoelectric strain but rather elastically follows the variations of the crystal structure of the adjacent GaN channel. The second is a gate metal with a high work function: a metal gate with a work function greater than 5.3 eV (Ni in this research) positioned on top of n-type doped (Nd = 10¹⁷ cm⁻³) Ga₀.₇₄Al₀.₂₆N creates the necessary built-in potential, which controls the injection of electrons into the intrinsic channel as the gate voltage is increased. The 5 µm long transistor, with a 0.18 µm long gate and a 30 µm wide channel, operates at Vd = 10 V. At Vg = 1 V, the device reaches a maximum drain current of 0.6 mA, which indicates a high current density. The presented device is operational at frequencies greater than 10 GHz and exhibits a stable transconductance over the full range of operational gate voltages.

Keywords: compound semiconductors, device modeling, enhancement mode HEMT, gallium nitride

Procedia PDF Downloads 242
74 Molecular Dynamics Study of Ferrocene in Low and Room Temperatures

Authors: Feng Wang, Vladislav Vasilyev

Abstract:

Ferrocene (Fe(C₅H₅)₂, i.e., di-cyclopentadienyl iron, FeCp₂ or Fc) is a unique example of 'wrong but seminal' in the history of chemistry. It has significant applications in a number of areas, such as homogeneous catalysis, polymer chemistry, molecular sensing, and nonlinear optical materials. However, this 'molecular carousel' has been a 'notoriously difficult example' and subject to long debate over its conformation and properties. Ferrocene is a dynamic molecule; as a result, understanding its dynamical properties is essential to understanding its conformational properties. In the present study, molecular dynamics (MD) simulations are performed. In the simulation, we use five geometrical parameters to define the overall conformation of Fc, and everything else is treated as thermal noise. The five parameters are: d, the distance between the two Cp planes; α and δ, which define the relative positions of the Cp planes, where α is the angle of Cp tilt and δ the angle by which the two Cp planes rotate against each other like a carousel; and d₁ (the Fe-Cp1 distance) and d₂ (the Fe-Cp2 distance), which position the Fe atom between the two Cps. Our preliminary MD simulations show that the five parameters behave differently. The distances of Fe to the Cp planes are independent and practically identical, without correlation. The relative position of the two Cp rings, α, indicates that the two Cp planes are most likely not parallel; rather, they tilt by a small angle, α ≠ 0°. The mean plane dihedral angle δ ≠ 0°; moreover, δ is neither 0° nor 36°, indicating that under those conditions Fc is neither in a perfect eclipsed structure nor a perfect staggered structure. The simulations show that when the temperature is above 80 K, the conformers are virtually in free rotation. A very interesting result from the MD simulation concerns the five C-Fe bond distances within the same Cp ring: they are surprisingly not identical, but fall into three groups of 2, 2, and 1.
We describe the pentagon formed by the five carbon atoms as a 'swimming turtle' to capture the motion of the Cp rings of Fc, as shown in their dynamical animation video: Fe-C(1) and Fe-C(2), which are identical, are 'the turtle's back legs'; Fe-C(3) and Fe-C(4), also identical, are 'the turtle's front paws'; and Fe-C(5) is 'the turtle's head'. Such a 'swimming turtle' analogy may be able to explain the singly substituted derivatives of Fc. Again, the mean Fe-C distance obtained from the MD simulation is larger than the quantum-mechanically calculated Fe-C distances for eclipsed and staggered Fc, with a larger deviation with respect to the eclipsed Fc than to the staggered Fc. The same trend is obtained for the five Fe-C-H angles within the same Cp ring of Fc. The simulated mean IR spectrum at 7 K shows split spectral peaks at approximately 470 cm⁻¹ and 488 cm⁻¹, in excellent agreement with the quantum-mechanically calculated gas-phase IR spectrum of eclipsed Fc. As the temperature increases beyond 80 K, the clearly split IR spectrum becomes a single, very broad peak. Preliminary MD results will be presented.
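
Two of the five conformational parameters, the inter-ring distance d and the tilt angle α, can be computed directly from ring coordinates. The sketch below is a generic geometric illustration (plane normals from an SVD of the centered ring atoms), not the authors' analysis code.

```python
import numpy as np

def ring_plane_params(cp1, cp2):
    """Return (d, alpha) for two Cp rings given their 5x3 carbon coordinates:
    d = distance between ring centroids, alpha = tilt angle between ring planes (deg).
    Each plane normal is the smallest right singular vector of the centered ring."""
    cp1, cp2 = np.asarray(cp1, float), np.asarray(cp2, float)
    c1, c2 = cp1.mean(axis=0), cp2.mean(axis=0)
    n1 = np.linalg.svd(cp1 - c1)[2][-1]
    n2 = np.linalg.svd(cp2 - c2)[2][-1]
    d = np.linalg.norm(c2 - c1)
    alpha = np.degrees(np.arccos(np.clip(abs(n1 @ n2), 0.0, 1.0)))
    return d, alpha
```

The carousel angle δ could be obtained similarly by projecting corresponding carbons of both rings onto the mean plane and comparing their azimuths; it is omitted here for brevity.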

Keywords: ferrocene conformation, molecular dynamics simulation, conformer orientation, eclipsed and staggered ferrocene

Procedia PDF Downloads 190
73 Hybrid Precoder Design Based on Iterative Hard Thresholding Algorithm for Millimeter Wave Multiple-Input-Multiple-Output Systems

Authors: Ameni Mejri, Moufida Hajjaj, Salem Hasnaoui, Ridha Bouallegue

Abstract:

Recent technological advances have made millimeter wave (mmWave) communication possible. Due to the huge amount of spectrum available in mmWave frequency bands, this promising candidate is considered a key technology for the deployment of 5G cellular networks. In order to enhance system capacity and achieve spectral efficiency, very large antenna arrays are employed in mmWave systems to exploit array gain. However, it has been shown that conventional beamforming strategies are not suitable for mmWave hardware implementation; therefore, new approaches are required for mmWave cellular applications. Unlike traditional multiple-input-multiple-output (MIMO) systems, for which digital precoders alone accomplish precoding, MIMO at mmWave is different because of the limitations of fully digital precoding: it requires a large number of radio frequency (RF) chains, each with its signal mixers and analog-to-digital converters. As RF chain cost and power consumption increase, an alternative is needed. Although the hybrid precoding architecture, a combination of a baseband precoder and an RF precoder, has been regarded as the best solution, the optimal design of hybrid precoders remains open. According to the mapping from RF chains to antenna elements, there are two main categories of hybrid precoding architecture. The partially-connected structure, a hybrid precoding sub-array architecture, reduces hardware complexity by using fewer phase shifters, at the cost of some beamforming gain. In this paper, we treat hybrid precoder design in mmWave MIMO systems as a matrix factorization problem and adopt the alternating minimization principle to solve the design problem. Further, we present our proposed algorithm for the partially-connected structure, based on the iterative hard thresholding method.
Through simulation results, we show that our hybrid precoding algorithm provides significant performance gains over existing algorithms. We also show that the proposed approach reduces significantly the computational complexity. Furthermore, valuable design insights are provided when we use the proposed algorithm to make simulation comparisons between the hybrid precoding partially-connected structure and the fully-connected structure.

Keywords: alternating minimization, hybrid precoding, iterative hard thresholding, low-complexity, millimeter wave communication, partially-connected structure

Procedia PDF Downloads 297
72 Linearly Polarized Single Photon Emission from Nonpolar, Semipolar and Polar Quantum Dots in GaN/InGaN Nanowires

Authors: Snezana Lazic, Zarko Gacevic, Mark Holmes, Ekaterina Chernysheva, Marcus Müller, Peter Veit, Frank Bertram, Juergen Christen, Yasuhiko Arakawa, Enrique Calleja

Abstract:

The study reports how the pencil-like morphology of a homoepitaxially grown GaN nanowire can be exploited for the fabrication of a thin conformal InGaN nanoshell, hosting nonpolar, semipolar and polar single photon sources (SPSs). All three SPS types exhibit narrow emission lines (FWHM ≈ 0.35-2 meV) and high degrees of linear optical polarization (P > 70%) in the low-temperature micro-photoluminescence (µ-PL) experiments and are characterized by a pronounced antibunching in the photon correlation measurements (corrected g⁽²⁾(0) < 0.3). The quantum-dot-like exciton localization centers induced by compositional fluctuations within the InGaN nanoshell are identified as the driving mechanism for the single photon emission. As confirmed by the low-temperature transmission electron microscopy combined with cathodoluminescence (TEM-CL) study, the crystal region (i.e. non-polar m-, semi-polar r- and polar c-facets) hosting the single photon emitters strongly affects their emission wavelength, which ranges from ultra-violet for the non-polar to visible for the polar SPSs. The photon emission lifetime is also found to be facet-dependent and varies from sub-nanosecond time scales for the non- and semi-polar SPSs to a few nanoseconds for the polar ones. These differences are mainly attributed to facet-dependent indium content and electric field distribution across the hosting InGaN nanoshell. The hereby reported pencil-like InGaN nanoshell is the first single nanostructure able to host all three types of single photon emitters and is thus a promising building block for tunable quantum light devices integrated into future photonic and optoelectronic circuits.
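
The antibunching criterion can be illustrated by normalizing the zero-delay peak of a pulsed coincidence histogram against the side peaks. The bin layout and counts below are illustrative, not the measured data.

```python
def g2_zero(counts, zero_idx):
    """Normalize the zero-delay coincidence count by the mean of all side peaks.
    g2(0) < 0.5 is the conventional threshold for a single-photon emitter."""
    side = [c for i, c in enumerate(counts) if i != zero_idx]
    return counts[zero_idx] / (sum(side) / len(side))
```

A background-corrected value, such as the corrected g⁽²⁾(0) quoted in the abstract, would additionally subtract detector dark counts and uncorrelated background before this normalization.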

Keywords: GaN nanowire, InGaN nanoshell, linear polarization, nonpolar, semipolar, polar quantum dots, single-photon sources

Procedia PDF Downloads 366
71 The French Ekang Ethnographic Dictionary. The Quantum Approach

Authors: Henda Gnakate Biba, Ndassa Mouafon Issa

Abstract:

Dictionaries modeled on the Western pattern, designed for stress-accent languages, are not suitable for tonal languages and do not account for them phonologically; this is why this (prosodic and phonological) ethnographic dictionary was designed. It is a glossary that expresses the tones and the rhythm of words: it recreates exactly the speaking or singing of a tonal language and allows a non-speaker to pronounce the words as a native would. It is a dictionary adapted to tonal languages. It was built from ethnomusicological theorems and phonological processes, following Jean-Jacques Rousseau's 1776 hypothesis that 'to say and to sing were once the same thing'. Each word in the French dictionary is matched to its counterpart in the ekaη language, and each ekaη word is written on a musical staff. This ethnographic dictionary is an inventive, original, and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological, and linguistic conceptualization of languages, giving rise to interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, machine translation, and artificial intelligence. When you apply this theory to the text of a folk song in any tonal language, you piece together not only the exact melody, rhythm, and harmonies of that song, as if you knew it in advance, but also the exact speech of the language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as has one of the greatest cultural equations related to the composition and creation of tonal, polytonal, and random music.
The experimentation confirming this theorization produced a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, the author uses music reading and writing software to collect data extracted from his mother tongue, already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: you type a structured song text (chorus-verse) on your computer and request from the machine a melody in blues, jazz, world music, variety, etc. The software runs, giving you the option to choose harmonies, and you then select your melody.

Keywords: music, language, entanglement, science, research

Procedia PDF Downloads 38
70 Education-based, Graphical User Interface Design for Analyzing Phase Winding Inter-Turn Faults in Permanent Magnet Synchronous Motors

Authors: Emir Alaca, Hasbi Apaydin, Rohullah Rahmatullah, Necibe Fusun Oyman Serteller

Abstract:

In recent years, Permanent Magnet Synchronous Motors (PMSMs) have found extensive applications in various industrial sectors, including electric vehicles, wind turbines, and robotics, due to their high performance and low losses. Accurate mathematical modeling of PMSMs is crucial for advanced studies in electric machines. To enhance the effectiveness of graduate-level education, incorporating virtual or real experiments becomes essential to reinforce acquired knowledge. Virtual laboratories have gained popularity as cost-effective alternatives to physical testing, mitigating the risks associated with electrical machine experiments. This study presents a MATLAB-based Graphical User Interface (GUI) for PMSMs. The GUI allows users to observe variations in motor outputs corresponding to different input parameters. It enables users to explore healthy motor conditions and the effects of inter-turn short-circuit faults in one phase winding. Additionally, the interface includes menus through which users can access the motor's equivalent circuits and gain hands-on experience with the mathematical equations used in synchronous motor calculations. The primary objective of this paper is to enhance the learning experience of graduate and doctoral students by providing a GUI-based approach to laboratory studies. This interactive platform empowers students to examine and analyze motor outputs by manipulating input parameters, facilitating a deeper understanding of PMSM operation and control.
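The dq-axis model such a GUI is built around can be sketched in a few lines. The sketch below integrates only the healthy-motor voltage equations to a steady state; all parameter values are illustrative placeholders, not values from the paper, and the inter-turn fault model used in the GUI is more involved than this.

```python
import math

# Illustrative PMSM parameters (placeholders, not values from the paper)
R, Ld, Lq = 0.5, 8e-3, 8e-3       # stator resistance [ohm], d/q inductances [H]
lam_m, p = 0.1, 4                 # magnet flux linkage [Wb], pole pairs
we = 2 * math.pi * 50             # electrical angular speed [rad/s]
vd, vq = 0.0, we * lam_m + R * 5.0   # dq voltages; vq roughly targets a few amps

def step(id_, iq, dt=1e-5):
    """One forward-Euler step of the healthy dq-axis voltage equations."""
    did = (vd - R * id_ + we * Lq * iq) / Ld
    diq = (vq - R * iq - we * Ld * id_ - we * lam_m) / Lq
    return id_ + did * dt, iq + diq * dt

def torque(id_, iq):
    """Electromagnetic torque, including the reluctance term."""
    return 1.5 * p * (lam_m * iq + (Ld - Lq) * id_ * iq)

# Integrate to steady state (about 60 electrical time constants)
id_, iq = 0.0, 0.0
for _ in range(100_000):
    id_, iq = step(id_, iq)
```

A GUI of the kind described would re-run such an integration whenever the user changes an input parameter and plot the resulting currents and torque.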

Keywords: permanent magnet synchronous motor, mathematical modelling, education tools, winding inter-turn fault

Procedia PDF Downloads 27
69 Cardiac Biosignal and Adaptation in Confined Nuclear Submarine Patrol

Authors: B. Lefranc, C. Aufauvre-Poupon, C. Martin-Krumm, M. Trousselard

Abstract:

Isolated and confined environments (ICE) present several challenges which may adversely affect human psychology and physiology. Submariners on Sub-Surface Ballistic Nuclear (SSBN) missions exposed to these environmental constraints must be able to perform complex tasks as part of their normal duties, as well as during crisis periods when emergency actions are required or imminent. The operational and environmental constraints they face challenge human adaptability. The impact of such a constrained environment has yet to be explored. Establishing a knowledge framework is a determining factor, particularly in view of coming long space voyages. Keeping crews in optimal operational condition is a real challenge because the success of the mission depends on them. This study evaluated the impact of stress on the mental health and sensory degradation of submariners during an SSBN mission using cardiac biosignal (heart rate variability, HRV) clustering. This is a pragmatic exploratory study of a prospective cohort that included 19 submariner volunteers. HRV was recorded at baseline to classify the submariners by clustering according to their stress level, based on parasympathetic (Pa) activity. The impacts of a high Pa (HPa) versus a low Pa (LPa) level at baseline on emotional state and sensory perception (interoception and exteroception) were assessed via the cardiac biosignal during the patrol and at a recovery time one month after. At no time point was a significant difference in mental health found between groups. There were significant differences in interoceptive, exteroceptive and physiological functioning during the patrol and at recovery time. To sum up, compared to the LPa group, the HPa group maintains a higher level of psychosensory functioning during the patrol and at recovery but exhibits a decrease in Pa level.
The HPa group has less adaptable HRV characteristics, with less unpredictability and flexibility in its cardiac biosignals, while the LPa group increases them during the patrol and at recovery time. This dissociation between psychosensory and physiological adaptation suggests two treatment modalities for ICE environments. To the best of our knowledge, our results are the first to highlight the impact of physiological differences in the HRV profile on the adaptability of submariners. Further studies are needed to evaluate the negative emotional and cognitive effects of ICEs based on the cardiac profile. Artificial intelligence offers a promising means of maintaining a high level of operational conditions. These future perspectives will not only allow submariners to be better prepared but will also support the design of feasible countermeasures for analog environments that bring us closer to a trip to Mars.
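As a rough illustration of the kind of HRV-based grouping described (the study's actual clustering algorithm is not specified here), the sketch below computes RMSSD, a standard time-domain proxy for parasympathetic activity, from synthetic RR-interval series and performs a simple median split into low-Pa and high-Pa groups. All numbers are invented.

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences,
    a standard time-domain index of parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def split_by_median(values):
    """Median split into low and high groups -- a simple stand-in
    for the clustering used in the study."""
    median = sorted(values)[len(values) // 2]
    low = [v for v in values if v < median]
    high = [v for v in values if v >= median]
    return low, high

# Synthetic RR series: a 'flexible' and a 'rigid' heart rhythm (invented)
flexible = [800 + 40 * math.sin(0.3 * i) for i in range(120)]
rigid = [800 + 4 * math.sin(0.3 * i) for i in range(120)]
```

The flexible rhythm yields a much higher RMSSD than the rigid one, so the two series land in different groups under the median split.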

Keywords: adaptation, exteroception, HRV, ICE, interoception, SSBN

Procedia PDF Downloads 151
68 Microfabrication of Three-Dimensional SU-8 Structures Using Positive SPR Photoresist as a Sacrificial Layer for Integration of Microfluidic Components on Biosensors

Authors: Su Yin Chiam, Qing Xin Zhang, Jaehoon Chung

Abstract:

Complementary metal-oxide-semiconductor (CMOS) integrated circuits (ICs) have received increased attention in the biosensor community because CMOS technology provides cost-effective, high-performance signal processing at a mass-production level. In order to supply biological samples and reagents effectively to the sensing elements, there is increasing demand for seamless integration of microfluidic components on fabricated CMOS wafers by post-processing. Although PDMS microfluidic channels replicated from a separately prepared silicon mold can typically be aligned and bonded onto CMOS wafers, this remains challenging owing to the inherently limited alignment accuracy (> ± 10 μm) between the two layers. Here we present a new post-processing method to create three-dimensional microfluidic components using two photoresists of opposite polarity: an epoxy-based negative SU-8 photoresist and a positive SPR220-7 photoresist. The positive photoresist serves as a sacrificial layer, and the negative photoresist is utilized as a structural material to generate three-dimensional structures. Because both photoresists are patterned using standard photolithography, the dimensions of the structures can be effectively controlled; moreover, the alignment accuracy is dramatically improved (< ± 2 μm), so the method can be adopted as an alternative post-processing approach. To validate the proposed method, we applied this technique to build cell-trapping structures. The SU-8 photoresist was mainly used to generate structures, and the SPR photoresist was used as a sacrificial layer to generate a sub-channel in the SU-8, allowing fluid to pass through. The sub-channel generated by etching the sacrificial layer works as a cell-capturing site. The well-controlled dimensions enabled single-cell capture at each site, and the high-accuracy alignment placed cells exactly on the sensing units of CMOS biosensors.

Keywords: SU-8, microfluidic, MEMS, microfabrication

Procedia PDF Downloads 491
67 Realizing Teleportation Using Black-White Hole Capsule Constructed by Space-Time Microstrip Circuit Control

Authors: Mapatsakon Sarapat, Mongkol Ketwongsa, Somchat Sonasang, Preecha Yupapin

Abstract:

Preliminary tests have been designed and performed on a space-time control circuit using a two-level system circuit with a 4-5 cm diameter microstrip for realistic teleportation. The work begins by calculating the parameters for a circuit that uses alternating current (AC) at a specified frequency as the input signal. The method causes electrons to move along the circuit perimeter starting at the speed of light, which was found satisfactory on the basis of wave-particle duality. It is able to establish superluminal speed (faster than light) for the electron cloud in the middle of the circuit, creating a timeline and a propulsive force as well. The timeline is formed by the cancellation of time stretching and shrinking in the relativistic regime, in which absolute time has vanished. In fact, both black holes and white holes are created from time signals at the beginning, where the electrons travel close to the speed of light. They entangle together like a capsule until they reach the point where they collapse and cancel each other out, which is controlled by the frequency of the circuit. Therefore, we can apply this method to large-scale circuits such as potassium, from which the same method can be applied to form a system to teleport living things. In fact, the black hole is a hibernation-system environment that allows living things to live and travel to the teleportation destination, which can be controlled in position and time relative to the speed of light. When the capsule reaches its destination, the frequency is increased so that the black holes and white holes cancel each other out into a balanced environment. Therefore, life can safely teleport to the destination. The same system must exist at both the origin and the destination, which could form a network. Moreover, it can also be applied to space travel.
The designed system will be tested on a small scale using a microstrip circuit that can be created in the laboratory on a limited budget and used in both wired and wireless systems.

Keywords: quantum teleportation, black-white hole, time, timeline, relativistic electronics

Procedia PDF Downloads 51
66 The MHz Frequency Range EM Induction Device Development and Experimental Study for Low Conductive Objects Detection

Authors: D. Kakulia, L. Shoshiashvili, G. Sapharishvili

Abstract:

The results of this study relate to plastic mine detection research using electromagnetic induction, the development of appropriate equipment, and the evaluation of expected results. Electromagnetic induction sensing is effectively used in the detection of metal objects in the soil and in the discrimination of unexploded ordnance. Metal objects interact well with a low-frequency alternating magnetic field, and their electromagnetic response can be detected at the low-frequency range even when they are buried in the ground. Detection of plastic objects such as plastic mines by electromagnetic induction is associated with difficulties: the interaction of non-conducting or low-conductive objects with a low-frequency alternating magnetic field is very weak. At the high-frequency range, where wave processes already take place, the interaction increases, but interactions with other, distant objects also increase; a complex interference picture is formed, and extracting useful information becomes difficult. Sensing by electromagnetic induction in the intermediate MHz frequency range is therefore the subject of this research. The concept of detecting plastic mines in this range can be based on studying the electromagnetic response of a non-conductive cavity in a low-conductivity environment, or on detecting the small metal components of plastic mines, taking their construction into account. A detector node based on the Analog Devices AD8302 amplitude and phase detector has been developed for experimental studies. The node has two inputs. At one input, the node receives a sinusoidal signal from the generator, to which a transmitting coil is also connected. The receiver coil is attached to the second input of the node. An additional circuit provides the option of amplifying the signal from the receiver coil by 20 dB. The node has two outputs.
The voltages obtained at the outputs reflect the ratio of the amplitudes and the phase difference of the input harmonic signals. Experimental measurements were performed at different positions of the transmitter and receiver coils over the frequency range 1-20 MHz. A Tektronix AFG3052C arbitrary/function generator and an eight-channel high-resolution PicoScope 4824 oscilloscope were used in the experiments. Experimental measurements were also performed with a low-conductive test object. The measurement results and their comparative analysis show the capabilities of the simple detector node and the prospects for its further development in this direction. The experimental results are compared with, and analyzed against, the results of corresponding computer modeling based on the method of auxiliary sources (MAS). The experimental measurements are driven using the MATLAB environment. Acknowledgment: This work was supported by the Shota Rustaveli National Science Foundation (SRNSF) (Grant number: NFR 17_523).
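For reference, converting the detector node's two output voltages back into an amplitude ratio and a phase difference follows the AD8302's nominal transfer characteristics. The 30 mV/dB and 10 mV/degree slopes and the 900 mV centre point below are taken as assumptions from the device datasheet; a real node should be calibrated against known signals.

```python
# Nominal AD8302 transfer characteristics (datasheet values, assumed here):
# magnitude output slope 30 mV/dB with 900 mV at a 0 dB ratio; phase
# output slope 10 mV/deg with 900 mV at 90 deg (unambiguous over 0-180 deg).
MAG_SLOPE_V_PER_DB = 0.030
PHS_SLOPE_V_PER_DEG = 0.010
CENTER_V = 0.900

def gain_db(v_mag):
    """Amplitude ratio of the two inputs, in dB, from the VMAG pin voltage."""
    return (v_mag - CENTER_V) / MAG_SLOPE_V_PER_DB

def phase_deg(v_phs):
    """Phase difference (0-180 deg, sign-ambiguous) from the VPHS pin voltage."""
    return 90.0 + (CENTER_V - v_phs) / PHS_SLOPE_V_PER_DEG
```

With these conventions, 900 mV on both pins corresponds to equal amplitudes (0 dB) in quadrature (90 degrees).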

Keywords: EM induction sensing, detector, plastic mines, remote sensing

Procedia PDF Downloads 127
65 Experimental Investigation of Nucleate Pool Boiling Heat Transfer Characteristics on Copper Surface with Laser-Textured Stepped Microstructures

Authors: Luvindran Sugumaran, Mohd Nashrul Mohd Zubir, Kazi Md Salim Newaz, Tuan Zaharinie Tuan Zahari, Suazlan Mt Aznam, Aiman Mohd Halil

Abstract:

Due to the rapid advancement of integrated circuits and the trend toward miniaturizing electronic devices, the heat produced by electronic devices increasingly exceeds the limits of conventional heat dissipation. The two-phase cooling technique based on phase-change pool boiling heat transfer has received much attention because of its potential to fully utilize the latent heat of the fluid and produce a highly effective heat dissipation capacity while keeping the equipment's operating temperature within an acceptable range. Numerous strategies are available for modifying heating surfaces, but finding the best, simplest, and most dependable one remains a challenge. Lately, surface texturing via laser ablation has been used in a variety of investigations, demonstrating significant potential for enhancing pool boiling heat transfer performance. In this research, the nucleate pool boiling heat transfer performance of laser-textured copper surfaces with different patterns was investigated, with a bare copper surface serving as the reference. It was observed that the heat transfer coefficient increased with the surface area ratio and with the peak-to-valley height ratio of the microstructure. The laser-machined grain structure produced extra nucleation sites, which ultimately improved pool boiling performance. Owing to the increase in nucleation site density and surface area, enhanced nucleate boiling served as the primary heat transfer mechanism. The pool boiling performance of the laser-textured copper surfaces was superior to that of the bare copper surface in all respects.
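The quantity being compared, the nucleate boiling heat transfer coefficient, is the measured heat flux divided by the wall superheat. The sketch below uses invented numbers purely to illustrate the calculation: a surface that carries the same flux at lower superheat has, by definition, a higher coefficient.

```python
# h = q'' / (T_wall - T_sat): heat transfer coefficient from measured
# heat flux [W/m^2] and wall superheat [K], at saturation temperature T_sat.
def htc(q_flux_w_m2, t_wall_c, t_sat_c=100.0):
    superheat = t_wall_c - t_sat_c
    if superheat <= 0:
        raise ValueError("wall must be superheated for nucleate boiling")
    return q_flux_w_m2 / superheat

# Illustrative (not measured) values: same flux, textured surface runs cooler
h_bare = htc(500e3, 115.0)      # 500 kW/m^2 at 15 K superheat
h_textured = htc(500e3, 110.0)  # same flux at 10 K superheat
```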

Keywords: heat transfer coefficient, laser texturing, micro structured surface, pool boiling

Procedia PDF Downloads 67
64 Microfluidic Chambers with Fluid Walls for Cell Biology

Authors: Cristian Soitu, Alexander Feuerborn, Cyril Deroy, Alfonso Castrejon-Pita, Peter R. Cook, Edmond J. Walsh

Abstract:

Microfluidics now stands as an academically mature technology after a quarter-century of research has delivered a vast array of proofs of concept for many biological workflows. However, translation to industry remains poor, with only a handful of notable exceptions (e.g., digital PCR and DNA sequencing), mainly because of biocompatibility issues, the limited range of readouts supported, or the complex operation required. This technology exploits the domination of interfacial forces over gravitational ones at the microscale, replacing solid walls with fluid ones as building blocks for cell micro-environments. By employing only materials used by biologists for decades, the system is shown to be biocompatible and easy to manufacture and operate. The method consists of displacing a continuous fluid layer into a pattern of isolated chambers overlaid with an immiscible liquid to prevent evaporation. The resulting fluid arrangements can be arrays of micro-chambers with rectangular footprints, which use the maximum surface area available, or structures with irregular patterns. Pliant, self-healing fluid walls confine volumes as small as 1 nl. Such fluidic structures can be reconfigured during assays, giving the platform an unprecedented level of flexibility. Common workflows in cell biology are demonstrated, e.g., cell growth and retrieval, cloning, cryopreservation, fixation and immunolabeling, CRISPR-Cas9 gene editing, and proof-of-concept drug tests. This fluid-shaping technology is shown to have potential for high-throughput cell- and organism-based assays. The ability to make and reconfigure microfluidic circuits on demand on standard Petri dishes should find many applications in biology and yield more relevant phenotypic and genotypic responses compared to standard microfluidic assays.

Keywords: fluid walls, micro-chambers, reconfigurable, freestyle

Procedia PDF Downloads 168
63 Comparative Study on Efficacy and Clinical Outcomes in Minimally Invasive Surgery Transforaminal Interbody Fusion vs Minimally Invasive Surgery Lateral Interbody Fusion

Authors: Sundaresan Soundararajan, George Ezekiel Silvananthan, Chor Ngee Tan

Abstract:

Introduction: Transforaminal Interbody Fusion (TLIF) has been adopted for many decades; however, eXtreme Lateral Interbody Fusion (XLIF), still in relative infancy, has grown to be accepted as a new Minimally Invasive Surgery (MIS) option. There is a paucity of reports directly comparing lateral approach surgery to other MIS options such as TLIF in the treatment of lumbar degenerative disc disease. Aims/Objectives: The objective of this study was to compare the efficacy and clinical outcomes of Minimally Invasive Transforaminal Interbody Fusion (TLIF) and Minimally Invasive Lateral Interbody Fusion (XLIF) in the treatment of patients with degenerative disc disease of the lumbar spine. Methods: A single-center, retrospective cohort study involving a total of 38 patients undergoing surgical intervention between 2010 and 2013 for degenerative disc disease of the lumbar spine at the single L4/L5 level. 18 patients were treated with MIS TLIF, and 20 patients were treated with XLIF. Results: The XLIF group showed a shorter duration of surgery than the TLIF group (176 mins vs. 208.3 mins, P = 0.03). Length of hospital stay was also significantly shorter in the XLIF group (5.9 days vs. 9 days, P = 0.03). Intraoperative blood loss favoured XLIF, as 85% of patients had blood loss of less than 100 cc compared to 58% in the TLIF group (P = 0.03). Radiologically, disc height improved significantly more post-operatively in the XLIF group than in the TLIF group (0.56 mm vs. 0.39 mm, P = 0.01). Foraminal height increment was also higher in the XLIF group (0.58 mm vs. 0.45 mm, P = 0.06). Clinically, back pain and leg pain improved in 85% of patients in the XLIF group and 78% in the TLIF group. Postoperative hip flexion weakness was more common in the XLIF group (40%) than in the TLIF group (0%); however, this weakness resolved within 6 months post-operatively. There was one case each of dural tear and surgical site infection in the TLIF group and none in the XLIF group.
Visual Analog Scale (VAS) scores 6 months post-operatively showed comparable reductions in both groups. The TLIF group showed Oswestry Disability Index (ODI) improvement in 67% of patients, while the XLIF group showed improvement in 70%. Conclusions: Lateral approach surgery shows clinical outcomes in the resolution of back pain and radiculopathy comparable to conventional MIS techniques such as TLIF. With a significantly shorter duration of surgery, minimal blood loss and a shorter hospital stay, XLIF seems a reasonable MIS option compared to other MIS techniques for treating degenerative lumbar disc disease.

Keywords: extreme lateral interbody fusion, lateral approach, minimally invasive, XLIF

Procedia PDF Downloads 187
62 From Design, Experience and Play Framework to Common Design Thinking Tools: Using Serious Modern Board Games

Authors: Micael Sousa

Abstract:

Board games (BGs) are thriving as new designs emerge from the hobby community and reach greater audiences all around the world. Although digital games gather most of the attention in game studies and serious games research, the post-digital movement helps to explain why, in a world dominated by digital technologies, analog experiences remain unique and irreplaceable to users, allowing innovation in new hybrid environments. The new BG designs are part of these post-digital and hybrid movements because they result from the use of powerful digital tools that enable production and knowledge sharing about BGs and their unique face-to-face social experiences. These new BGs, defined as modern by many authors, provide innovative designs and unique game mechanics that are not yet fully explored by the main serious games (SG) approaches. Even the most established SG frameworks, which treat SGs as fun games implemented to achieve predefined goals, need further development, especially when considering modern BGs. Despite many anecdotal perceptions, researchers are only now starting to rediscover BGs and demonstrate their potential. They are proving that BGs are easy to adapt and to grasp by non-expert players in experimental approaches, with the possibility of easily adapting to players' profiles and serious objectives even during gameplay. Although there are many design thinking (DT) models and practices, their relations with SG frameworks are also underdeveloped, mostly because this is a new research field lacking theoretical development and systematization of experimental practices. Using BGs as case studies promises to help develop these frameworks.
Departing from the Design, Experience, and Play (DPE) framework and considering the Common Design Thinking Tools (CDST), this paper proposes a new experimental framework for the adaptation and development of modern BG design for DT: the Design, Experience, and Play for Think (DPET) experimental framework. This is done through the systematization of the DPE and CDST approaches applied in two case studies, where two different sequences of adapted BGs were employed to establish a collaborative DT process. These two sessions took place with different participants and in different contexts, using different sequences of games for the same DT approach. The first session took place at the Faculty of Economics of the University of Coimbra, in a training session on serious games for project development. The second session took place at Casa do Impacto during The Great Village Design Jam light. Both sessions had the same duration and were designed to progressively achieve DT goals, using BGs as SGs in a collaborative process. The results from the sessions show that a sequence of BGs, when properly adapted to the DPET framework, can generate a viable and innovative process of collaborative DT that is productive, fun, and engaging. The proposed DPET framework intends to help establish how new SG solutions could be defined for new goals through flexible DT. Applications in other areas of research and development can also benefit from these findings.

Keywords: board games, design thinking, methodology, serious games

Procedia PDF Downloads 91
61 Behavioral and EEG Reactions in Native Turkic-Speaking Inhabitants of Siberia and Siberian Russians during Recognition of Syntactic Errors in Sentences in Native and Foreign Languages

Authors: Tatiana N. Astakhova, Alexander E. Saprygin, Tatyana A. Golovko, Alexander N. Savostyanov, Mikhail S. Vlasov, Natalia V. Borisova, Alexandera G. Karpova, Urana N. Kavai-ool, Elena D. Mokur-ool, Nikolay A. Kolchanov, Lubomir I. Aftanas

Abstract:

The aim of the study is to compare the behavioral and EEG reactions of Turkic-speaking inhabitants of Siberia (Tuvinians and Yakuts) and Russians during the recognition of syntax errors in native and foreign languages. 63 healthy aboriginals of the Tyva Republic, 29 inhabitants of the Sakha (Yakutia) Republic, and 55 Russians from Novosibirsk participated in the study. All participants completed a linguistic task in which they had to find a syntax error in written sentences. Russian participants completed the task in Russian and in English. Tuvinian and Yakut participants completed the task in Russian, English, and Tuvinian or Yakut, respectively. EEGs were recorded during the tasks. For Russian participants, EEGs were recorded using 128 channels; the electrodes were placed according to the extended International 10-10 system, and the signals were amplified using Neuroscan (USA) amplifiers. For Tuvinians and Yakuts, EEGs were recorded using 64 channels and Brain Products (Germany) amplifiers. In all groups, 0.3-100 Hz analog filtering and a 1000 Hz sampling rate were used. Response speed and the accuracy of error recognition were used as parameters of behavioral reactions. The event-related potential (ERP) responses P300 and P600 were used as indicators of brain activity. The accuracy and response speed of Russians were higher for Russian than for English. The P300 amplitudes in Russians were higher for English; the P600 amplitudes in the left temporal cortex were higher for the Russian language. Both Tuvinians and Yakuts showed no difference in accuracy between Russian and their respective national languages (Tuvinian and Yakut); however, response speed was faster for tasks in Russian than in their national language. Tuvinians and Yakuts showed poor accuracy in English, but response speed was higher for English than for Russian and the national languages.
In Tuvinians, there were no differences in the P300 and P600 amplitudes or in cortical topology between Russian and Tuvinian, but there was a difference for English. In Yakuts, the P300 and P600 amplitudes and ERP topology for Russian were the same as those of Russians for Russian. In Yakuts, brain reactions during Yakut and English comprehension did not differ and reflected foreign-language comprehension, while Russian comprehension reflected native-language comprehension. We found that Tuvinians recognized both Russian and Tuvinian as native languages and English as a foreign language, whereas Yakuts recognized both English and Yakut as foreign languages and only Russian as a native language. According to the questionnaire, both Tuvinians and Yakuts use the national language as a spoken language but do not use it for writing. This may be why Yakuts perceive written Yakut as a foreign language while perceiving written Russian as native.
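The P300/P600 indicators come from the standard ERP procedure of averaging stimulus-locked epochs, so that background EEG averages toward zero while the time-locked response survives. The sketch below demonstrates this on synthetic data with an idealized P600-like deflection; the template shape, noise level, and epoch count are assumptions, not the study's data.

```python
import math
import random

random.seed(0)
FS = 1000     # sampling rate [Hz], as in the study; 1 kHz makes sample index == ms
EPOCH = 800   # samples per epoch (800 ms post-stimulus)

def p600_template(n):
    """Idealized positive deflection peaking near 600 ms (an assumption)."""
    return [5.0 * math.exp(-((i - 600) ** 2) / (2 * 80 ** 2)) for i in range(n)]

def average_epochs(epochs):
    """ERP = sample-wise mean across stimulus-locked epochs: zero-mean
    background EEG cancels while the time-locked component remains."""
    n = len(epochs)
    return [sum(ep[i] for ep in epochs) / n for i in range(EPOCH)]

template = p600_template(EPOCH)
# 200 noisy single trials of the same time-locked response
epochs = [[t + random.gauss(0, 10) for t in template] for _ in range(200)]
erp = average_epochs(epochs)
peak_ms = max(range(EPOCH), key=lambda i: erp[i])   # latency of the largest peak
```

Averaging 200 trials shrinks the noise by a factor of about 14, so the peak latency recovered from the average lands near the 600 ms deflection buried in the single trials.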

Keywords: EEG, language comprehension, native and foreign languages, Siberian inhabitants

Procedia PDF Downloads 514
60 Analysis of Standard Tramway Surge Protection Methods Based on Real Cases

Authors: Alain Rousseau, Alfred Aragones, Gilles Rougier

Abstract:

The study is based on lightning and surge standards, mainly the EN 62305 series for facility protection, the EN 61643 series for low-voltage surge protective devices, the high-voltage surge arrester standard EN 60099-4, and the traction arrester standards EN 50526-1 and EN 50526-2, dealing respectively with D.C. surge arresters and voltage limiting devices for railway fixed installations. The most severe stress for tramway installations is caused by direct lightning on the catenary line. In such a case, the surge current propagates towards the various poles and sparks over the insulators, leading to a lower stress. If the impact point is near enough, a significant surge current will flow towards the traction surge arrester installed on the catenary at the location where the substation is connected. Another surge arrester can be installed at the entrance of the substation, or even inside the rectifier, to avoid insulation damage. In addition, surge arresters can be installed between + and - to avoid damaging sensitive circuits. Based on disturbances encountered in a substation following a lightning event, the engineering department of RATP decided to investigate the cause of the damage and, more generally, to question the efficiency of the various possible protection means. Based on the example of a recent tramway line, the paper presents the results of a lightning study based on direct lightning strikes. As a matter of fact, induced surges on the catenary are much more frequent but much less damaging. First, a lightning risk assessment is performed for the substations, taking into account direct and induced lightning both on the substation and on its connected lines, such as the catenary. Then the efficiency of the various surge arresters is discussed based on field experience and calculations. The efficiency of the earthing system used at the bottom of the poles is also addressed, based on high-frequency earthing measurements.
In conclusion, the paper makes recommendations for enhancing the efficiency of the existing protection means.
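The first step of such a risk assessment, estimating the expected number of direct strikes, follows the collection-area method of EN 62305-2. The sketch below implements that formula for an isolated rectangular structure; the dimensions and ground flash density are illustrative, not values from the study.

```python
import math

def collection_area(length, width, height):
    """Equivalent collection area A_D [m^2] of an isolated rectangular
    structure per EN 62305-2: the footprint enlarged by 3x the height
    on all sides (straight strips plus quarter-circle corners)."""
    return (length * width
            + 6 * height * (length + width)
            + 9 * math.pi * height ** 2)

def dangerous_events_per_year(ng_per_km2, length, width, height, cd=1.0):
    """N_D = N_g * A_D * C_D * 1e-6, with N_g in flashes/km^2/year and
    C_D the location factor (1.0 for an isolated structure)."""
    return ng_per_km2 * collection_area(length, width, height) * cd * 1e-6

# Illustrative small substation: 20 m x 10 m x 5 m, Ng = 2 flashes/km^2/yr
nd = dangerous_events_per_year(2.0, 20.0, 10.0, 5.0)
```

N_D is then combined with damage probabilities and loss factors to obtain the risk figure compared against the tolerable level.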

Keywords: surge arrester, traction, lightning, risk, surge protective device

Procedia PDF Downloads 234
59 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”

Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen

Abstract:

Exoplanet atmospheric parameter retrieval is a complex, computationally intensive inverse modeling problem in which an exoplanet's atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data, and runtimes are directly proportional to the number of parameters under consideration. These power and runtime requirements are difficult to accommodate in space missions, where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods therefore compromises either model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on the speed and accuracy of previous models. We demonstrate the efficacy of artificial intelligence in quickly and reliably predicting atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low-power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real- or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. With the edge computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.
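The architecture at the heart of such a model can be sketched as a conditional GAN: the generator proposes atmospheric parameters conditioned on an observed spectrum, and the discriminator judges (spectrum, parameters) pairs. The untrained, deliberately tiny numpy model below only illustrates this data flow; all dimensions and weights are invented and it bears no relation to ARcGAN's actual convolutional layers.

```python
import numpy as np

rng = np.random.default_rng(0)
SPEC_DIM, PARAM_DIM, NOISE_DIM, HID = 32, 4, 8, 16

# One hidden layer each; weights are random (architecture sketch only)
Wg1 = rng.normal(0, 0.1, (SPEC_DIM + NOISE_DIM, HID))
Wg2 = rng.normal(0, 0.1, (HID, PARAM_DIM))
Wd1 = rng.normal(0, 0.1, (SPEC_DIM + PARAM_DIM, HID))
Wd2 = rng.normal(0, 0.1, (HID, 1))

def generator(spectrum, z):
    """G(z | spectrum): proposes atmospheric parameters for a spectrum."""
    h = np.tanh(np.concatenate([spectrum, z]) @ Wg1)
    return h @ Wg2

def discriminator(spectrum, params):
    """D(params | spectrum): probability the pair is real rather than generated."""
    h = np.tanh(np.concatenate([spectrum, params]) @ Wd1)
    logit = (h @ Wd2)[0]
    return 1.0 / (1.0 + np.exp(-logit))   # sigmoid

spectrum = rng.normal(size=SPEC_DIM)               # stand-in observed spectrum
fake_params = generator(spectrum, rng.normal(size=NOISE_DIM))
score = discriminator(spectrum, fake_params)
```

Training alternates discriminator updates on real and generated pairs with generator updates that push the score toward 1; at inference, sampling many z values for one spectrum yields a distribution over retrieved parameters.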

Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval

Procedia PDF Downloads 149
58 Negotiating Strangeness: Narratives of Forced Return Migration and the Construction of Identities

Authors: Cheryl-Ann Sarita Boodram

Abstract:

Historically, the movement of people has been the subject of socio-political and economic regulatory policies which combine to regulate human mobility and establish geopolitical and spatial identities and borderlands. As migratory practices evolved, so too has the problematization associated with movement, migration and citizenship. The emerging trends have led to the active development of immigration technology governing human mobility and the naming of migratory practices. One such named phenomenon is 'deportation', or the forced removal of individuals from their adopted country. Deportation has gained much attention within the human mobility landscape in the past twenty years following the September 2001 terrorist attack on the World Trade Centre in New York. In a reactionary move, several metropolitan countries, including Canada and the United Kingdom, enacted or reviewed immigration laws which further enabled the removal of foreign-born criminals to the lands of their birth in the global south. Existing studies fall short of understanding the multiple textures of forced return migration experiences and the social injustices resulting from deportation displacement. This study brings together indigenous research methodologies, through the use of participatory action research and social work with returned migrants in Trinidad and Tobago, to uncover the experiences of displacement of deported nationals. The study explores the experience of negotiating life as a 'stranger' and how return has influenced the construction of identities of returned nationals. Findings from this study reveal that deportation has led to inequalities and facilitated 'othering' of this group within their own country of birth. The study further highlights that deportation leads to circuits of dispossession and perpetuates inequalities.
This study provides original insights into the way returned migrants negotiate, map and embody ‘strangeness’ and manage their return to a soil they consider unfamiliar and alien.

Keywords: stranger, alien geographies, displacement, deportation, negotiating strangeness, identity, otherness, alien landscapes

Procedia PDF Downloads 478
57 Thickness-Tunable Optical, Magnetic, and Dielectric Response of Lithium Ferrite Thin Film Synthesized by Pulsed Laser Deposition

Authors: Prajna Paramita Mohapatra, Pamu Dobbidi

Abstract:

Lithium ferrite (LiFe5O8) has potential applications as a component of microwave magnetic devices such as circulators and monolithic integrated circuits. For efficient device applications, spinel ferrites in the form of thin films are highly desirable, and it is necessary to improve their magnetic and dielectric behavior by optimizing the processing parameters during deposition. Lithium ferrite thin films were deposited on Pt/Si substrates using the pulsed laser deposition (PLD) technique. As film thickness is the easiest parameter with which to tailor strain, thin films of different thicknesses (160 nm, 200 nm, 240 nm) were deposited at an oxygen partial pressure of 0.001 mbar. The formation of a single phase with spinel structure (space group P4132) is confirmed by the XRD pattern and Rietveld analysis. The optical bandgap decreases with increasing thickness. FESEM confirmed the formation of uniform grains with well-separated grain boundaries. Further, the film growth and roughness were analyzed by AFM; the root-mean-square (RMS) surface roughness decreased from 13.52 nm for the 160 nm film to 9.34 nm for the 240 nm film. The room-temperature magnetization was measured with a maximum field of 10 kOe. The saturation magnetization increases monotonically with thickness, and the magnetic resonance linewidth lies in the range of 450 – 780 Oe. The dielectric response was measured in the frequency range of 10⁴ – 10⁶ Hz and in the temperature range of 303 – 473 K. With increasing frequency, the dielectric constant and the loss tangent of all the samples decrease continuously, which is typical of a conventional dielectric material. The real part of the dielectric constant and the dielectric loss increase with thickness. The contributions of grains and grain boundaries are also analyzed by employing an equivalent circuit model.
The highest dielectric constant is obtained for the film with a thickness of 240 nm at 10⁴ Hz. The obtained results demonstrate that the desired response for microwave magnetic devices can be obtained by tailoring the film thickness.
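As an illustration of the equivalent-circuit analysis mentioned above, grain and grain-boundary contributions are commonly modeled as two parallel RC elements connected in series (the brick-layer picture). The following sketch computes the complex impedance of such a circuit over the 10⁴ – 10⁶ Hz window used in the abstract; the component values are hypothetical and are not taken from the paper.

```python
import numpy as np

def parallel_rc_impedance(R, C, omega):
    """Complex impedance of a resistor and capacitor in parallel: Z = R / (1 + j*omega*R*C)."""
    return R / (1 + 1j * omega * R * C)

# Hypothetical grain / grain-boundary parameters (illustrative only)
R_g, C_g = 1e3, 1e-11     # grain: ohms, farads (relaxes above the window)
R_gb, C_gb = 1e5, 1e-10   # grain boundary: ohms, farads (relaxes inside the window)

freqs = np.logspace(4, 6, 5)   # 10^4 - 10^6 Hz, as in the abstract
omega = 2 * np.pi * freqs

# Total impedance: grain element in series with grain-boundary element
Z = parallel_rc_impedance(R_g, C_g, omega) + parallel_rc_impedance(R_gb, C_gb, omega)

for f, z in zip(freqs, Z):
    print(f"{f:9.0f} Hz  |Z| = {abs(z):10.1f} ohm  phase = {np.degrees(np.angle(z)):6.1f} deg")
```

Fitting such a model to the measured spectra is what allows the grain and grain-boundary resistances and capacitances to be separated, since each element dominates in a different frequency regime.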

Keywords: PLD, optical response, thin films, magnetic response, dielectric response

Procedia PDF Downloads 77
56 A Next Generation Multi-Scale Modeling Theatre for in silico Oncology

Authors: Safee Chaudhary, Mahnoor Naseer Gondal, Hira Anees Awan, Abdul Rehman, Ammar Arif, Risham Hussain, Huma Khawar, Zainab Arshad, Muhammad Faizyab Ali Chaudhary, Waleed Ahmed, Muhammad Umer Sultan, Bibi Amina, Salaar Khan, Muhammad Moaz Ahmad, Osama Shiraz Shah, Hadia Hameed, Muhammad Farooq Ahmad Butt, Muhammad Ahmad, Sameer Ahmed, Fayyaz Ahmed, Omer Ishaq, Waqar Nabi, Wim Vanderbauwhede, Bilal Wajid, Huma Shehwana, Muhammad Tariq, Amir Faisal

Abstract:

Cancer is a manifestation of multifactorial deregulations in biomolecular pathways. These deregulations arise from the complex multiscale interplay between cellular and extracellular factors. Such multifactorial aberrations at the gene, protein, and extracellular scales need to be investigated systematically towards decoding the underlying mechanisms and orchestrating therapeutic interventions for patient treatment. In this work, we propose ‘TISON’, a next-generation web-based multiscale modeling platform for clinical systems oncology. TISON’s unique modeling abstraction allows a seamless coupling of information from biomolecular networks, cell decision circuits, extracellular environments, and tissue geometries. The platform can undertake multiscale sensitivity analysis towards in silico biomarker identification and drug evaluation on cellular phenotypes in user-defined tissue geometries. Furthermore, the integration of cancer expression databases such as The Cancer Genome Atlas (TCGA) and the Human Protein Atlas (HPA) facilitates the development of personalized therapeutics. TISON is the next evolution of multiscale cancer modeling and simulation platforms and provides a ‘zero-code’ model development, simulation, and analysis environment for application in clinical settings.

Keywords: systems oncology, cancer systems biology, cancer therapeutics, personalized therapeutics, cancer modelling

Procedia PDF Downloads 186
55 Meditation and Insight Interpretation Using Quantum Circle Based-on Experiment and Quantum Relativity Formalism

Authors: Somnath Bhattachryya, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin

Abstract:

In this study of meditation and insight, an experiment with an electronic circuit designed to manipulate the meditator's mental circles, called the chakras, so that they all have the same size is proposed. The circuit is a 4-port device called an add-drop multiplexer, which models the meditation structure known as the four foundations of mindfulness. An AC power signal is used as the input in place of the meditation time function, and the various behaviors are studied by successively re-filtering the signal, in the manner of the eightfold noble path. The procedure starts by inputting a signal at a frequency that causes the wave velocity on the perimeter of the circuit to bring particles to the speed of light in vacuum. As the velocity (frequency) changes, the signal alternates between electromagnetic waves and matter waves until it reaches the relativistic limit. The electromagnetic waves are transformed into photons with wave-particle properties that overcome the limit of the speed of light. The matter wave, in contrast, travels to the other side and cannot pass the relativistic limit; it is called a shadow signal (echo), which can gain power from increasing speed but cannot achieve speed faster than light, i.e., insight. In the experiment, only the side where the velocity is positive, i.e., where the speed exceeds that of light at the corresponding frequency, indicates insight. The echo side can be reproduced by applying the input signal to the other port of the circuit, giving the same result but without insight or speed beyond light. The system is also used to study the stretching and contraction of time and wormholes, which can be applied to teleportation, Bose-Einstein condensates, teleprinting, and the quantum telephone.
Teleportation can occur throughout the system with the wave-particle and the echo: when the speed of the particle exceeds the stretching or contraction of time, the particle submerges into the wormhole and, once the destination and time are determined, travels through it. Within a wormhole, time can be set in the future or the past. The experimental results obtained using a microstrip circuit were found to accord with the principle of quantum relativity, which can be further developed, for both instruments and meditation practitioners, toward quantum technology.

Keywords: quantum meditation, insight picture, quantum circuit, absolute time, teleportation

Procedia PDF Downloads 38
54 Identification and Quantification of Lisinopril from Pure, Formulated and Urine Samples by Micellar Thin Layer Chromatography

Authors: Sudhanshu Sharma

Abstract:

Lisinopril, N²-[(S)-1-carboxy-3-phenylpropyl]-L-lysyl-L-proline dihydrate, is a lysine analog of enalaprilat, the active metabolite of enalapril. It is a long-acting, non-sulfhydryl angiotensin-converting enzyme (ACE) inhibitor used for the treatment of hypertension and congestive heart failure at a daily dosage of 10-80 mg. The pharmacological activity of lisinopril has been demonstrated in various experimental and clinical studies. Owing to its importance and widespread use, efforts have been made towards the development of simple and reliable analytical methods. According to our literature survey, lisinopril in pharmaceutical formulations has been determined by various analytical methodologies such as polarography, potentiometry, and spectrophotometry, but most of these methods are not well suited to the identification of lisinopril in clinical samples because of interference from the amino acids and amino-group-containing metabolites present in biological samples. This report is an attempt to develop a simple and reliable method for on-plate identification and quantification of lisinopril in pharmaceutical formulations as well as in human urine samples, using silica gel H layers developed with a new mobile phase comprising micellar solutions of N-cetyl-N,N,N-trimethylammonium bromide (CTAB). Micellar solutions have found numerous practical applications in many areas of separation science. Micellar liquid chromatography (MLC) has gained immense popularity and wide applicability owing to its operational simplicity, cost-effectiveness, relative non-toxicity, low aggressiveness, and enhanced separation efficiency. The incorporation of aqueous micellar solutions as mobile phases was pioneered by Armstrong and Terrill, who accentuated the importance of TLC where simultaneous separation of ionic or non-ionic species in a variety of matrices is required.
A peculiarity of micellar mobile phases (MMPs) is that they have no macroscopic analogues; as a result, separations that are difficult with aqueous-organic mobile phases can typically be achieved easily with MMPs. MMPs have previously been employed successfully in critical TLC separations of aromatic hydrocarbons, nucleotides, vitamins K1 and K5, o-, m-, and p-aminophenol, amino acids, and penicillins. The analysis of human urine for the identification of selected drugs and their metabolites has emerged as an important investigative tool in forensic drug analysis. Among all available chromatographic methods, only thin layer chromatography (TLC) enables a simple, fast, and effective separation of the complex mixtures present in various biological samples, and it is recommended as an approved test for forensic drug analysis by federal law. TLC has proved its applicability in the successful separation of bioactive amines, carbohydrates, enzymes, porphyrins and their precursors, alkaloids, and drugs from urine samples.
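In a TLC assay of this kind, each resolved spot is characterized by its retardation factor (R_f, the ratio of the analyte's migration distance to the solvent front's), and unknown amounts are read off a calibration line of spot intensity versus applied amount. A minimal sketch of both steps, with hypothetical plate measurements and calibration values (none taken from the study):

```python
import numpy as np

def retardation_factor(analyte_cm, front_cm):
    """R_f = analyte migration distance / solvent front distance."""
    if front_cm <= 0:
        raise ValueError("solvent front distance must be positive")
    return analyte_cm / front_cm

# Hypothetical densitometric calibration: spot intensity (a.u.) vs amount (ug per spot)
amounts = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
intensities = np.array([110.0, 215.0, 330.0, 428.0, 545.0])

# Least-squares line: intensity = slope * amount + intercept
slope, intercept = np.polyfit(amounts, intensities, 1)

def quantify(intensity):
    """Back-calculate the amount on the plate from a measured spot intensity."""
    return (intensity - intercept) / slope

print(f"R_f = {retardation_factor(3.2, 8.0):.2f}")
print(f"unknown spot (270 a.u.) is about {quantify(270.0):.1f} ug")
```

The same R_f measured for pure, formulated, and urine samples is what supports identification, while the calibration line supports quantification.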

Keywords: lisinopril, surfactant, chromatography, micellar solutions

Procedia PDF Downloads 341
53 The Quantum Theory of Music and Languages

Authors: Mballa Abanda Serge, Henda Gnakate Biba, Romaric Guemno Kuate, Akono Rufine Nicole, Petfiang Sidonie, Bella Sidonie

Abstract:

The main hypotheses proposed around the definitions of the syllable and of music, and of the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and therein lies the interest of the work: the debate raises questions that are at the heart of theories of language. It is an inventive, original, and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological, and linguistic conceptualization of languages, giving rise to a dialogue between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, machine translation, and artificial intelligence. When this theory is applied to the text of a folk song in a tone language, one reconstructs not only the exact melody, rhythm, and harmonies of that song, as if they were known in advance, but also the exact speech of the language. The author believes that the issue of the disappearance and preservation of tonal languages has been structurally resolved, as has one of the greatest cultural equations related to the composition and creation of tonal, polytonal, and random music.
As experimental confirmation of the theory, the author designed a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, the author uses music reading and writing software to collect data extracted from his mother tongue, already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is performed from writing to writing, from writing to speech, and from writing to music. Mode of operation: the user types a structured song text (chorus-verse) on the computer and requests from the machine a melody in blues, jazz, world music, variety, etc. The software runs, offering a choice of harmonies, after which the user selects a melody.

Keywords: music, entanglement, language, science

Procedia PDF Downloads 54