Search results for: offline signature verification
239 Controlled Doping of Graphene Monolayer
Authors: Vedanki Khandenwal, Pawan Srivastava, Kartick Tarafder, Subhasis Ghosh
Abstract:
We present the experimental realization of controlled doping of graphene monolayers through charge transfer, by trapping selected organic molecules between the graphene layer and the underlying substrate. Charge transfer between graphene and the trapped molecules leads to controlled n-type or p-type doping in monolayer graphene (MLG), depending on whether the trapped molecule acts as an electron donor or an electron acceptor. Doping controllability has been validated by shifts in the corresponding Raman peak positions and in the Dirac point. In the transfer characteristics of field-effect transistors, a significant shift of the Dirac point towards the positive or negative gate-voltage region provides the signature of p-type or n-type doping of graphene, respectively, as a result of charge transfer between graphene and the trapped organic molecules. To facilitate this charge-transfer interaction, it is crucial for the trapped molecules to be situated in close proximity to the graphene surface, as demonstrated by Raman and infrared spectroscopy. However, the mechanism responsible for this charge-transfer interaction has remained unclear at the microscopic level. It is generally accepted that the dipole moment of adsorbed molecules plays a crucial role in determining the charge-transfer interaction between molecules and graphene. However, our findings clearly illustrate that the doping effect depends primarily on the reactivity of the constituent atoms of the adsorbed molecules rather than just their dipole moment. This has been illustrated by trapping various molecules at the graphene−substrate interface. Dopant molecules such as acetone (containing highly reactive oxygen atoms) promote adsorption across the entire graphene surface. In contrast, molecules with less reactive atoms, such as acetonitrile, tend to adsorb at the edges due to the presence of reactive dangling bonds.
In the case of low-dipole-moment molecules like toluene, there is no substantial adsorption anywhere on the graphene surface. The observation of (i) the emergence of the Raman D peak exclusively at the edges for trapped molecules without reactive atoms, and throughout the entire basal plane for those with reactive atoms, and (ii) variations in the density of molecules (with and without reactive atoms) attached to graphene with their respective dipole moments provides compelling evidence for our claim. Additionally, these observations were supported by first-principles density functional calculations.
Keywords: graphene, doping, charge transfer, liquid phase exfoliation
Procedia PDF Downloads 65
238 Artificial Intelligence-Aided Extended Kalman Filter for Magnetometer-Based Orbit Determination
Authors: Gilberto Goracci, Fabio Curti
Abstract:
This work presents a robust, light, and inexpensive algorithm to perform autonomous orbit determination using onboard magnetometer data in real time. Magnetometers are low-cost and reliable sensors typically available on a spacecraft for attitude determination purposes, thus representing an interesting choice for real-time orbit determination without the need to add sensors to the spacecraft itself. Magnetic field measurements can be exploited by Extended/Unscented Kalman Filters (EKF/UKF) for orbit determination to make up for GPS outages, yielding errors of a few kilometers in position and tens of meters per second in velocity. While this level of accuracy shows that Kalman filtering represents a solid baseline for autonomous orbit determination, it is not enough to provide a reliable state estimate in the absence of GPS signals. This work combines the solidity and reliability of the EKF with the versatility of a Recurrent Neural Network (RNN) architecture to further increase the precision of the state estimate. Deep learning models can, in fact, grasp nonlinear relations between the inputs, in this case the magnetometer data and the EKF state estimates, and the targets, namely the true position and velocity of the spacecraft. The model has been pre-trained on Sun-synchronous orbits (SSO) up to 2126 kilometers of altitude with different initial conditions and levels of noise to cover a wide range of possible real-case scenarios. The orbits have been propagated considering J2-level dynamics, and the geomagnetic field has been modeled using the International Geomagnetic Reference Field (IGRF) coefficients up to the 13th order. The training of the module can be completed offline using the expected orbit of the spacecraft, heavily reducing the onboard computational burden.
Once the spacecraft is launched, the model can use the GPS signal, if available, to fine-tune the parameters on the actual orbit onboard in real time, and work autonomously during GPS outages. In this way, the provided module shows versatility, as it can be applied to any mission operating in SSO, while at the same time the training is completed, and eventually fine-tuned, on the specific orbit, increasing performance and reliability. The results of this study show an increase of one order of magnitude in the precision of the state estimate with respect to the use of the EKF alone. Tests on simulated and real data will be shown.
Keywords: artificial intelligence, extended Kalman filter, orbit determination, magnetic field
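The filter-plus-learned-correction idea in this abstract can be caricatured in a few lines. In the sketch below, a linear Kalman filter tracks a 1-D constant-velocity state from noisy measurements carrying an unmodeled systematic bias, and a least-squares regression stands in for the RNN by learning to correct the filter's residual error. The dynamics, noise levels, and the regression stand-in are illustrative assumptions, not the authors' J2/IGRF orbit setup.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
H = np.array([[1.0, 0.0]])              # only position is measured
Q = 1e-4 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance
bias = 2.0                              # systematic sensor bias the KF does not model

# Simulate the true trajectory and biased, noisy measurements
n = 200
x_true = np.zeros((n, 2)); x_true[0] = [0.0, 1.0]
for k in range(1, n):
    x_true[k] = F @ x_true[k - 1]
z = x_true[:, 0] + bias + rng.normal(0.0, 0.5, n)

# Standard Kalman filter predict/update loop
x = np.array([0.0, 1.0]); P = np.eye(2)
est = np.zeros(n)
for k in range(n):
    x = F @ x; P = F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x = x + K @ (np.array([z[k]]) - H @ x)         # update
    P = (np.eye(2) - K @ H) @ P
    est[k] = x[0]

# "Learned" correction: regress the filter's position error on its own estimate
# (a linear stand-in for the RNN correction described in the abstract)
A = np.c_[est, np.ones(n)]
coef, *_ = np.linalg.lstsq(A, x_true[:, 0] - est, rcond=None)
corrected = est + A @ coef

rmse_kf = float(np.sqrt(np.mean((est - x_true[:, 0]) ** 2)))
rmse_corr = float(np.sqrt(np.mean((corrected - x_true[:, 0]) ** 2)))
print(rmse_kf, rmse_corr)
```

Because the correction is fit by least squares on the filter's residuals, it soaks up the systematic bias the filter cannot represent, which is the essence of the hybrid scheme.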
Procedia PDF Downloads 105
237 Modal Composition and Tectonic Provenance of the Sandstones of Ecca Group, Karoo Supergroup in the Eastern Cape Province, South Africa
Authors: Christopher Baiyegunhi, Kuiwu Liu, Oswald Gwavava
Abstract:
The petrography of the sandstones of the Ecca Group, Karoo Supergroup in the Eastern Cape Province of South Africa, has been investigated with respect to composition, provenance, and the influence of weathering conditions. Petrographic studies based on quantitative analysis of the detrital minerals revealed that the sandstones are composed mostly of quartz, feldspar, and lithic fragments of metamorphic and sedimentary rocks. The sandstones have an average framework composition of 24.3% quartz, 19.3% feldspar, and 26.1% rock fragments, and 81.33% of the quartz grains are monocrystalline. These sandstones are generally very fine to fine grained, moderately to well sorted, and subangular to subrounded in shape. In addition, they are compositionally immature and can be classified as feldspathic wacke and lithic wacke. The absence of major petrographically distinctive compositional variations in the sandstones perhaps indicates the homogeneity of their source. It is therefore inferred that the transportation distance from the source area was quite short and that the main mechanism of transportation was river systems draining into the basin. The QFL ternary diagrams revealed dissected-arc and transitional-arc provenance, pointing to an active margin and uplifted basement while preserving the signature of a recycled provenance. This is an indication that the sandstones were derived from a magmatic-arc provenance. Since the magmatic provenance includes transitional and dissected arcs, it also shows that the source area of the Ecca sediments contained secondary sedimentary and metasedimentary rocks from a marginal belt that developed as a result of rifting. The weathering diagrams and semi-quantitative weathering indices indicate that the Ecca sandstones are mostly from a plutonic source area, with climatic conditions ranging from arid to humid. The compositional immaturity of the sandstones is suggested to be due to weathering or recycling and to low relief or short transport from the source area.
The detrital modal compositions of these sandstones are related to back-arc, island-arc, and continental-margin arc settings. The origin and deposition of the Ecca sandstones are attributed to low to moderate weathering, recycling of pre-existing rocks, and the erosion and transportation of debris from the orogeny of the Cape Fold Belt.
Keywords: petrography, tectonic setting, provenance, Ecca Group, Karoo Basin
Procedia PDF Downloads 434
236 Manufacturing of Race Car Case Study AGH Racing
Authors: Hanna Faron, Wojciech Marcinkowski, Daniel Prusak
Abstract:
The aim of this article is to present the activity of the AGH Racing scientific circle within the international Formula Student project, which gives young engineers from all around the world the opportunity to validate their talent and knowledge under real-world conditions, under the pressure of time and demanding design requirements. Every year, the team begins the process of building a race car with the formation of human resources. In the public sector, to which public universities belong, scientific circles are structures uniting students with common interests and a shared level of determination. Because the project simulates market conditions, members have a chance to verify previously acquired knowledge in practice. The high level of innovation and competitiveness among the teams participating in Formula Student requires an intelligent and highly dynamic organizational system. This entails the separation of duties, the setting of priorities, and the selection of optimal solutions, which is often a compromise between the available technology and a limited budget. Proper selection of adequate guidelines in the design phase allows an efficient transition to the implementation stage, i.e., the process-oriented implementation of the project. Four dynamic and three static competitions are the main verification and evaluation of the year-round work and effort put into building the race car. The feedback acquired during the races is a very important part of monitoring the effectiveness of the AGH Racing scientific circle, as well as the main criterion in determining long-term goals and all necessary improvements in the team.
Keywords: SAE, formula student, race car, public sector, automotive industry
Procedia PDF Downloads 348
235 The Use of WhatsApp Platform in Spreading Fake News among Mass Communication Students of Abdu Gusau Polytechnic, Talata Mafara
Authors: Aliyu Damri
Abstract:
In every educational institution, students of mass communication receive training to report events and issues accurately and objectively in accordance with official controls. However, the complex nature of today's society has made possible a WhatsApp platform that revolutionizes the means of sharing information, ideas, and experiences. This paper examined how students in the Department of Mass Communication, Abdu Gusau Polytechnic, Talata Mafara, used the WhatsApp platform in spreading fake news. It used in-depth interview techniques and focus group discussions with students, as well as published materials, to gather related and relevant data. The paper also followed the established procedure for analyzing long-interview content: observation of useful utterances, development of expanded observations, examination of the interconnections among observed comments, collective scrutiny of observations for patterns and themes, and review and analysis of the themes across all interviews for the development of a thesis. The results indicated that the inadequacy or absence of official controls guiding online information sharing, inaccuracies and poor source verification, the lack of gatekeeping procedures to ensure ethical and legal compliance, the drawing of users into the production process, indiscriminate sharing of information, the prevalence of misinformation, disinformation, and rumor, and the problem of conversation strongly encouraged the emergence of fake news. Surprisingly, the idea of information as a commodity has grown, and transparency of sources has emerged as a new ethic.
Keywords: disinformation, fake news, group, mass communication, misinformation, WhatsApp
Procedia PDF Downloads 143
234 Holographic Visualisation of 3D Point Clouds in Real-time Measurements: A Proof of Concept Study
Authors: Henrique Fernandes, Sofia Catalucci, Richard Leach, Kapil Sugand
Abstract:
Background: Holograms are 3D images formed by the interference of light beams from a laser or other coherent light source. Pepper's ghost, a forerunner of the hologram, was conceptualised in the 19th century. Combining holographic visualisation with metrology measurement techniques, by displaying measurements taken in real time in holographic form, can assist in research and education. New structural designs such as the Plexiglass Stand and the Hologram Box can optimise the holographic experience. Method: The equipment used included: (i) Zeiss's ATOS Core 300 optical coordinate measuring instrument, which scanned real-world objects; (ii) CloudCompare, open-source software used for point cloud processing; and (iii) the Hologram Box, designed and manufactured during this research to provide the blackout environment needed to display 3D point clouds from real-time measurements in holographic format, and to give the holograms a degree of portability. The equipment was tailored to realise the goal of displaying measurements with an innovative technique and to improve on conventional methods. Three test scans were completed before the holographic conversion. Results: The outcome was a precise recreation of the original object in holographic form, presented as dense point clouds with surface-density features shown in a colour map. Conclusion: This work establishes a way to visualise data in a point cloud system. To our knowledge, this has not been attempted before. This achievement provides an advancement in holographic visualisation. The Hologram Box could be used as a feedback tool for measurement quality control and verification in future smart factories.
Keywords: holography, 3D scans, hologram box, metrology, point cloud
Procedia PDF Downloads 89
233 Evolution of Web Development Progress in Modern Information Technology
Authors: Abdul Basit Kiani
Abstract:
Web development, the art of creating and maintaining websites, has witnessed remarkable advancements. The aim is to provide an overview of some of the cutting-edge developments in the field. Firstly, the rise of responsive web design has revolutionized user experiences across devices. With the increasing prevalence of smartphones and tablets, web developers have adapted to ensure seamless browsing experiences, regardless of screen size. This progress has greatly enhanced accessibility and usability, catering to the diverse needs of users worldwide. Additionally, the evolution of web frameworks and libraries has significantly streamlined the development process. Tools such as React, Angular, and Vue.js have empowered developers to build dynamic and interactive web applications with ease. These frameworks not only enhance efficiency but also bolster scalability, allowing for the creation of complex and feature-rich web solutions. Furthermore, the emergence of progressive web applications (PWAs) has bridged the gap between native mobile apps and web development. PWAs leverage modern web technologies to deliver app-like experiences, including offline functionality, push notifications, and seamless installation. This innovation has transformed the way users interact with websites, blurring the boundaries between traditional web and mobile applications. Moreover, the integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. 
With the increasing incidence of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as the HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. In sum, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.
Keywords: progressive web applications (PWAs), web security, machine learning (ML), web frameworks, responsive web design
Procedia PDF Downloads 54
232 The Instrumentalization of Digital Media in the Context of Sexualized Violence
Authors: Katharina Kargel, Frederic Vobbe
Abstract:
Sexual online grooming is generally defined as digital interaction for the purpose of the sexual exploitation of children or minors, i.e., as a process of preparing and framing sexual child abuse. Due to its conceptual history, sexual online grooming is often associated with perpetrators who are previously unknown to those affected. While the strategies of perpetrators and the perceptions of those affected are increasingly being investigated, the instrumentalisation of digital media has received little research attention so far. The present paper therefore aims to address this research gap by examining the ways in which perpetrators instrumentalise digital media. Our analyses draw on 46 case documentations and 18 interviews with those affected. The cases and the partly narrative interviews were collected by ten cooperating specialist centers working on sexualized violence in childhood and youth. For this purpose, we designed a documentation grid allowing for detailed case reconstruction, i.e., including information on the violence, digital media use, and those affected. By using Reflexive Grounded Theory, our analyses emphasize a) the subjective benchmarks of professional practitioners as well as those affected and b) the interpretative implications resulting from our researchers' subjective and emotional interaction with the data material. It should first be noted that sexualized online grooming can result in both online and offline sexualized violence, as well as hybrid forms. Furthermore, the perpetrators either come from the immediate social environment of those affected or are unknown to them. With regard to the instrumentalisation of digital media, the perpetrator-victim relationship plays a more important role than the space (online vs. offline) in which the primary violence is committed.
Perpetrators unknown to those affected instrumentalise digital media primarily to establish a sexualized system of norms, which is usually embedded in a supposed love relationship. In some cases, after an initial exchange of sexualized images or video recordings, a latent play on the position of power takes place. In the course of the grooming process, perpetrators from the immediate social environment increasingly instrumentalise digital media to establish an explicit relationship of power and dependence, which is directly determined by coercion, threats and blackmail. The knowledge of possible vulnerabilities is strategically used in the course of maintaining contact. The above explanations lead to the conclusion that the motive for the crime plays an essential role in the question of the instrumentalisation of digital media. It is therefore not surprising that it is mostly the near-field perpetrators without commercial motives who initiate a spiral of violence and stress by digitally distributing sexualized (violent) images and video recordings within the reference system of those affected.
Keywords: sexualized violence, children and youth, grooming, offender strategies, digital media
Procedia PDF Downloads 186
231 Nondecoupling Signatures of Supersymmetry and an Lμ-Lτ Gauge Boson at Belle-II
Authors: Heerak Banerjee, Sourov Roy
Abstract:
Supersymmetry, one of the most celebrated fields of study for explaining experimental observations where the standard model (SM) falls short, is reeling from the lack of experimental vindication. At the same time, the idea of additional gauge symmetry, in particular the gauged Lμ-Lτ symmetric models, has also generated significant interest. Such models have been extensively proposed to explain the tantalizing discrepancy between the predicted and measured values of the muon anomalous magnetic moment, alongside several other issues plaguing the SM. While very little parameter space within these models remains unconstrained, this work finds that the γ + Missing Energy (ME) signal at the Belle-II detector will be a smoking gun for supersymmetry (SUSY) in the presence of a gauged U(1)Lμ-Lτ symmetry. A remarkable consequence of breaking the enhanced symmetry appearing in the limit of degenerate (s)leptons is the nondecoupling of the radiative contribution of heavy charged sleptons to the γ-Z΄ kinetic mixing. The signal process, e⁺e⁻ → γZ΄ → γ + ME, is an outcome of this ubiquitous feature. Taking into account the severe constraints on gauged Lμ-Lτ models from several low-energy observables, it is shown that any significant excess in all but the highest photon energy bin would be an undeniable signature of such heavy scalar fields in SUSY coupling to the additional gauge boson Z΄. The number of signal events depends crucially on the logarithm of the ratio of stau to smuon mass in the presence of SUSY. In addition, the number is inversely proportional to the e⁺e⁻ collision energy, making a low-energy, high-luminosity collider like Belle-II an ideal testing ground for this channel. This process can probe large swathes of the hitherto free slepton mass ratio vs. additional gauge coupling (gₓ) parameter space. More importantly, it can explore the narrow slice of Z΄ mass (MZ΄) vs. gₓ parameter space still allowed in gauged U(1)Lμ-Lτ models for superheavy sparticles.
The spectacular finding that the signal significance is independent of individual slepton masses is an exciting prospect indeed. Further, the prospect that signatures of even superheavy SUSY particles that may have escaped detection at the LHC may show up at the Belle-II detector is an invigorating revelation.
Keywords: additional gauge symmetry, electron-positron collider, kinetic mixing, nondecoupling radiative effect, supersymmetry
Procedia PDF Downloads 128
230 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa
Authors: Samy A. Khalil, U. Ali Rahoma
Abstract:
For measurements of solar radiation, satellite data have been routinely utilized to estimate solar energy. However, the temporal coverage of satellite data has some limits. A reanalysis, also known as a 'retrospective analysis' of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observation data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the reanalysis dataset (ERA-5) for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa across the ten-year period from 2011 to 2020. To investigate seasonal changes in dataset performance, a seasonal statistical analysis was conducted, which showed a considerable variation in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset: monthly mean values show better agreement, but data accuracy is degraded. Solar resource assessment and power estimation using the ERA-5 solar radiation data are discussed. In the present research, the average values of the mean bias error (MBE), root mean square error (RMSE), and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, and the correlation coefficient (R²) varies from 0.93 to 0.99. This research's objective is to provide a reliable representation of the world's solar radiation to aid in the use of solar energy in all sectors.
Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa
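The validation statistics quoted above (MBE, RMSE, MAE, R²) have standard definitions and can be computed directly. The sketch below uses made-up toy numbers, not the paper's data, purely to show the formulas.

```python
import numpy as np

def validation_stats(obs, est):
    """MBE, RMSE, MAE, and R² between observed and estimated series."""
    obs, est = np.asarray(obs, float), np.asarray(est, float)
    err = est - obs
    mbe = err.mean()                        # mean bias error
    rmse = np.sqrt((err ** 2).mean())       # root mean square error
    mae = np.abs(err).mean()                # mean absolute error
    ss_res = (err ** 2).sum()
    ss_tot = ((obs - obs.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot              # coefficient of determination
    return mbe, rmse, mae, r2

# Toy example: daily GSR observations vs. reanalysis estimates (illustrative values)
obs = [3.1, 4.2, 5.0, 5.8, 6.1, 5.5]
est = [3.0, 4.4, 5.1, 5.6, 6.3, 5.4]
mbe, rmse, mae, r2 = validation_stats(obs, est)
print(f"MBE={mbe:.4f}  RMSE={rmse:.4f}  MAE={mae:.4f}  R2={r2:.4f}")
```

A positive MBE indicates the reanalysis overestimates the observations on average; RMSE penalizes large deviations more heavily than MAE.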
Procedia PDF Downloads 101
229 Creativity and Expressive Interpretation of Musical Drama in Children with Special Needs (Down Syndrome) in Special Schools Yayasan Pendidikan Anak Cacat, Medan, North Sumatera
Authors: Junita Batubara
Abstract:
Children with special needs, especially those with disabilities in mental, physical, or social/emotional interactions, are marginalized. Many people still view them as troublesome, inconvenient, having learning difficulties, unproductive, and burdensome to society. This study investigates how musical drama can develop the ability to control the coordination of mental functions; how musical drama can assist children in working together; how musical drama can help maintain a child's emotional and physical health; and how musical drama can improve children's creativity. The objectives of the research are: to know whether musical drama can control the coordination of children's mental functions; whether it can improve children's communication and expressive abilities; whether it can help children work with the people around them; whether it can develop children's emotional and physical health; and whether it can improve children's creativity. The study employed a qualitative research approach. Data were collected by listening and in-depth observation through public hearings with key informants, who were teachers, principals, parents, and children. The data obtained from each public hearing were then processed through data reduction, data display, and conclusion drawing/verification. The resulting model was then implemented in a musical performance, whose benefits are that musical drama can improve language skills; develop memory and the storage of information; develop communication skills and self-expression; help children work together; support emotional and physical health; and enhance creativity.
Keywords: children Down syndrome, music, drama script, performance
Procedia PDF Downloads 244
228 Real-Time Monitoring of Drinking Water Quality Using Advanced Devices
Authors: Amani Abdallah, Isam Shahrour
Abstract:
The quality of drinking water is a major public health concern. Control of this quality is generally performed in the laboratory, which takes a long time. This type of control is not suited to accidental pollution from sudden events, which can have serious consequences for population health. It is therefore of major interest to develop real-time innovative solutions for the detection of accidental contamination in drinking water systems. This paper presents research conducted within the SunRise Demonstrator for 'Smart and Sustainable Cities', with a particular focus on the supervision of water quality. This work aims at (i) implementing a smart water system in a large water network (the campus of the University Lille 1) including innovative equipment for real-time detection of abnormal events, such as those related to the contamination of drinking water, and (ii) developing a numerical model of contamination diffusion in the water distribution system. The first step included verification of the water quality sensors and their effectiveness on a network prototype 50 m in length. This part included evaluating the efficiency of these sensors in detecting both bacterial and chemical contamination events in drinking water distribution systems. An on-line optical sensor integrated with a laboratory-scale distribution system (LDS) was shown to respond rapidly to changes in refractive index induced by injected loads of chemical (cadmium, mercury) and biological contamination (Escherichia coli). All injected substances were detected by the sensor; the magnitude of the response depends on the type of contaminant introduced and is proportional to the injected substance's concentration.
Keywords: distribution system, drinking water, refraction index, sensor, real-time
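A minimal sketch of the kind of real-time detection logic such a sensor network might use: flag an event when a refractive-index reading departs from a rolling baseline by more than a set number of standard deviations. The window size, threshold, and synthetic readings below are illustrative assumptions, not the SunRise system's actual algorithm.

```python
from collections import deque

def make_detector(window=20, n_sigma=4.0):
    """Return a function that flags readings deviating from a rolling baseline."""
    history = deque(maxlen=window)
    def detect(reading):
        if len(history) < window:           # still building the baseline
            history.append(reading)
            return False
        mean = sum(history) / len(history)
        var = sum((x - mean) ** 2 for x in history) / len(history)
        std = max(var ** 0.5, 1e-9)         # avoid zero-division on flat baselines
        alarm = abs(reading - mean) > n_sigma * std
        if not alarm:                       # only clean readings update the baseline
            history.append(reading)
        return alarm
    return detect

# Simulated refractive-index stream: stable baseline, then a contaminant injection
detect = make_detector()
baseline = [1.3330 + 1e-5 * (i % 3 - 1) for i in range(40)]   # tiny jitter
alarms = [detect(r) for r in baseline]
event = detect(1.3345)                      # step change from an injected contaminant
print(any(alarms), event)
```

Keeping alarmed readings out of the baseline prevents a contamination event from gradually being absorbed into the "normal" statistics.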
Procedia PDF Downloads 357
227 Advancements in Autonomous Drones for Enhanced Healthcare Logistics
Authors: Bhaargav Gupta P., Vignesh N., Nithish Kumar R., Rahul J., Nivetha Ruvah D.
Abstract:
Delivering essential medical supplies to rural and underserved areas is challenging due to infrastructure limitations and logistical barriers, often resulting in inefficiencies and delays. Traditional delivery methods are hindered by poor road networks, long distances, and difficult terrains, compromising timely access to vital resources, especially in emergencies. This paper introduces an autonomous drone system engineered to optimize last-mile delivery. By utilizing advanced navigation and object-detection algorithms, such as region-based convolutional neural networks (R-CNN), our drones efficiently avoid obstacles, identify safe landing zones, and adapt dynamically to varying environments. Equipped with high-precision GPS and autonomous capabilities, the drones effectively navigate complex, remote areas with minimal dependence on established infrastructure. The system includes a dedicated mobile application for secure order placement and real-time tracking, and a secure payload box with OTP verification ensures tamper-resistant delivery to authorized recipients. This project demonstrates the potential of automated drone technology in healthcare logistics, offering a scalable and eco-friendly approach to enhance accessibility and service delivery in underserved regions. By addressing logistical gaps through advanced automation, this system represents a significant advancement toward sustainable, accessible healthcare in remote areas.
Keywords: region-based convolutional neural network, one time password, global positioning system, autonomous drones, healthcare logistics
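The abstract does not specify which OTP scheme the payload box uses; a common choice when the box must verify codes offline is counter-based HOTP (RFC 4226), sketched below. The shared secret and six-digit code length are illustrative assumptions, not details from the paper.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 of a counter, dynamically truncated to N digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                           # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative flow: the server issues a counter-based code to the recipient's app,
# and the payload box, sharing the secret and counter, recomputes it for verification.
secret = b"12345678901234567890"                      # RFC 4226 test secret
issued = hotp(secret, 0)
print(issued)                                         # → 755224 (RFC 4226 test vector)
assert hotp(secret, 0) == issued                      # box recomputes and compares
```

Because both sides derive the code from a shared secret and counter, the box can verify a delivery without any network connection at the drop point.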
Procedia PDF Downloads 16
226 Nonlinear Aerodynamic Parameter Estimation of a Supersonic Air to Air Missile by Using Artificial Neural Networks
Authors: Tugba Bayoglu
Abstract:
Aerodynamic parameter estimation is crucial in the missile design phase, since an accurate, high-fidelity aerodynamic model is required for designing a high-performance and robust control system, developing high-fidelity flight simulations, and verifying computational and wind tunnel test results. However, in the literature there are few missile aerodynamic parameter identification studies, for three main reasons: (1) most air-to-air missiles cannot fly at constant speed, (2) missile flight test numbers and flight durations are much smaller than those of fixed-wing aircraft, and (3) the variation of missile aerodynamic parameters with Mach number is higher than that of fixed-wing aircraft. In addition to these challenges, identification of aerodynamic parameters at high wind angles by classical estimation techniques brings another difficulty to the estimation process. The reason is that most estimation techniques require employing polynomials or splines to model the behavior of the aerodynamics. However, for missiles with a large variation of aerodynamic parameters with respect to flight variables, the order of the proposed model increases, which brings computational burden and complexity. Therefore, this study aims to solve the nonlinear aerodynamic parameter identification problem for a supersonic air-to-air missile by using Artificial Neural Networks. The proposed method will be tested using simulated data generated with a six-degree-of-freedom missile model involving a nonlinear aerodynamic database. The data will be corrupted by adding noise to the measurement model. Then, using the flight variables and measurements, the parameters will be estimated. Finally, the prediction accuracy will be investigated.
Keywords: air to air missile, artificial neural networks, open loop simulation, parameter identification
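The estimation setup the abstract describes (simulate flight data, corrupt it with noise, fit a network) can be caricatured in a few lines. Below, a one-hidden-layer network in plain NumPy (not the authors' architecture) learns a nonlinear pitching-moment-style coefficient from noisy simulated samples; the functional form, noise level, and network size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated "truth": a nonlinear aerodynamic coefficient vs. angle of attack (rad)
alpha = rng.uniform(-0.5, 0.5, (256, 1))
cm_true = -2.0 * alpha + 8.0 * alpha ** 3              # invented nonlinear model
cm_meas = cm_true + rng.normal(0.0, 0.02, alpha.shape) # additive measurement noise

# One-hidden-layer tanh network trained by full-batch gradient descent on MSE
W1 = rng.normal(0.0, 1.0, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (16, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(alpha)
mse_init = float(np.mean((pred0 - cm_meas) ** 2))

for _ in range(2000):
    h, pred = forward(alpha)
    g = 2.0 * (pred - cm_meas) / len(alpha)            # dL/dpred for MSE loss
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)                       # backprop through tanh
    gW1 = alpha.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(alpha)
mse_final = float(np.mean((pred - cm_meas) ** 2))
print(mse_init, mse_final)
```

The point of using a network rather than a global polynomial, as the abstract argues, is that the fit does not require choosing a model order in advance.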
Procedia PDF Downloads 281225 Trinary Affinity—Mathematic Verification and Application (1): Construction of Formulas for the Composite and Prime Numbers
Authors: Liang Ming Zhong, Yu Zhong, Wen Zhong, Fei Fei Yin
Abstract:
Trinary affinity is a description of existence: every object exists as it is known and spoken of, in a system of 2 differences (denoted dif₁, dif₂) and 1 similarity (Sim), equivalently expressed as dif₁ / Sim / dif₂ and kn / 0 / tkn (kn = the known, tkn = the 'to be known', 0 = the zero point of knowing). They are mathematically verified and illustrated in this paper by arranging all integers in 3 columns, where each number exists as a difference in relation to another number as another difference, with the 2 difs arbitrated by a third number as the Sim, resulting in a trinary affinity or trinity of 3 numbers, of which one is the known, another the 'to be known', and the third the zero (0) from which both the kn and tkn are measured and specified. Consequently, any number is horizontally specified as 3n, '3n - 1', or '3n + 1', and vertically as 'Cn + c', so that any number occurs at the intersection of its X and Y axes and is represented by its X and Y coordinates, as any point on Earth's surface is by its latitude and longitude. Technically, i) primes are viewed and treated as progenitors, and composites as descending from them, forming families of composites, each capable of being measured and specified from its own zero, called in this paper the realistic zero (denoted 0r, as contrasted to the mathematic zero, 0m), which corresponds to the constant c and whose nature separates the composite and prime numbers; and ii) any number is considered as having a magnitude as well as a position, so that a number is verified as a prime first by referring to its descriptive formula and then by making sure that no composite number can possibly occur at its position, by dividing it by factors provided by the composite number formulas. The paper consists of 3 parts: 1) a brief explanation of the trinary affinity of things, 2) the 8 formulas that represent ALL the primes, and 3) families of composite numbers, each represented by a formula.
A composite number family is described as 3n + f₁‧f₂. Since there are infinitely many composite number families, to verify the primality of a large probable prime we have to divide it by several or many an f₁ drawn from a range of composite number formulas, a procedure that is as laborious as it is the surest way to verify a great number's primality. (So, it is possible to substitute planned division for trial division.) Keywords: trinary affinity, difference, similarity, realistic zero
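The column arithmetic above can be sketched in a few lines (an illustrative reading, not the paper's 8 formulas): every integer falls in column 3n, 3n + 1, or 3n - 1, and a primality check need only try candidate factors from a planned schedule rather than every integer; the standard 6k ± 1 factor schedule is used below as a stand-in for the paper's composite-family formulas.

```python
# Every integer sits in one of three "columns": 3n, 3n - 1, or 3n + 1.
def column(n):
    r = n % 3
    return {0: "3n", 1: "3n+1", 2: "3n-1"}[r]

def is_prime(n):
    """Primality by planned division: only factors of the form 6k +/- 1."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0 or n % 3 == 0:       # column 3n (and even numbers) excluded
        return False
    f = 5                               # candidate factors: 5, 7, 11, 13, ...
    while f * f <= n:
        if n % f == 0 or n % (f + 2) == 0:
            return False
        f += 6
    return True

print(column(7), column(11), column(12))            # 3n+1 3n-1 3n
primes = [p for p in range(2, 40) if is_prime(p)]
print(primes)
```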
Procedia PDF Downloads 212224 Improving Electrical Safety through Enhanced Work Permits
Authors: Nuwan Karunarathna, Hemali Seneviratne
Abstract:
Distribution Utilities inherently present electrical hazards for their workers, in addition to the general public, especially due to bare overhead lines spreading over a large geographical area. Therefore, certain procedures such as de-energization, verification of de-energization, isolation, lock-out tag-out, and earthing are carried out to ensure safe working conditions when conducting maintenance work on de-energized overhead lines. However, measures must be taken to coordinate the above procedures and to ensure their successful and accurate execution. Issuing 'Work Permits' is such a measure, used by the Distribution Utility considered in this paper. Unfortunately, the Work Permit method adopted by the Distribution Utility concerned here has not succeeded in creating the expected safe working conditions, as evidenced by four (4) fatalities of workers due to electrocution in the Distribution Utility from 2016 to 2018. Therefore, this paper attempts to identify deficiencies in the Work Permit method and related contributing factors through careful analysis of the four (4) fatalities and workplace practices, in order to rectify the shortcomings and prevent future incidents. The analysis shows that the present level of coordination between the 'Authorized Person' who issues the work permit and the 'Competent Person' who performs the actual work is grossly inadequate to achieve the intended safe working conditions. The paper identifies the need for active participation of a 'Control Person' who oversees the whole operation from a bird's eye perspective and recommends further measures, derived through the analysis of the fatalities, to address the identified lapses in the current work permit system. Keywords: authorized person, competent person, control person, de-energization, distribution utility, isolation, lock-out tag-out, overhead lines, work permit
Procedia PDF Downloads 133223 Space Weather and Earthquakes: A Case Study of Solar Flare X9.3 Class on September 6, 2017
Authors: Viktor Novikov, Yuri Ruzhin
Abstract:
The studies completed to date on the relation between the Earth's seismicity and solar processes provide fuzzy and contradictory results. To test the idea that solar flares can trigger earthquakes, we analyzed a powerful surge of solar flare activity early in September 2017, during the approach to the minimum of the 24th solar cycle, which was accompanied by significant disturbances of space weather. On September 6, 2017, a group of sunspots AR2673 generated a large solar flare of X9.3 class, the strongest flare of the past twelve years. Its explosion produced a coronal mass ejection partially directed towards the Earth. We carried out a statistical analysis of the USGS and EMSC earthquake catalogs to determine the effect of solar flares on global seismic activity. New evidence of earthquake triggering due to the Sun-Earth interaction has been demonstrated by a simple comparison of the behavior of the Earth's seismicity before and after the strong solar flare. The global number of earthquakes with magnitude of 2.5 to 5.5 within 11 days after the solar flare increased by 30 to 100%. The possibility of electric/electromagnetic triggering of earthquakes due to space weather disturbances is supported by results of field and laboratory studies, in which earthquakes (both natural and laboratory) were initiated by injection of electrical current into the Earth's crust. For the specific case of artificial electric earthquake triggering, the current density at the depth of earthquake sources is comparable with estimates of the density of telluric currents induced by variations of space weather conditions due to solar flares. Acknowledgment: The work was supported by RFBR grant No. 18-05-00255. Keywords: solar flare, earthquake activity, earthquake triggering, solar-terrestrial relations
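The before/after comparison the authors describe reduces to counting catalog events in two 11-day windows around the flare; a toy sketch with an invented event list (not the USGS/EMSC data):

```python
from datetime import datetime, timedelta

# Reference time of the X9.3 flare and the 11-day comparison window.
flare = datetime(2017, 9, 6, 12, 0)
window = timedelta(days=11)

def count_in(events, start, end):
    """Count catalog events with start <= t < end."""
    return sum(start <= t < end for t in events)

# Illustrative synthetic catalog (magnitude 2.5-5.5 assumed pre-filtered),
# deliberately denser after the flare.
events = [flare - timedelta(hours=h) for h in range(1, 200, 4)] \
       + [flare + timedelta(hours=h) for h in range(1, 260, 3)]

before = count_in(events, flare - window, flare)
after = count_in(events, flare, flare + window)
change_pct = 100.0 * (after - before) / before
print(before, after, f"{change_pct:.1f}%")           # 50 87 74.0%
```

With a real catalog, the same two counts (and their ratio) give the 30 to 100% figure quoted in the abstract.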
Procedia PDF Downloads 144222 Ethicality of Algorithmic Pricing and Consumers’ Resistance
Authors: Zainab Atia, Hongwei He, Panagiotis Sarantopoulos
Abstract:
Over the past few years, firms have witnessed a massive increase in sophisticated algorithmic deployment, which has become quite pervasive in today's modern society. With the wide availability of data for retailers, the ability to track consumers using algorithmic pricing has become an integral option on online platforms. As more companies transform their businesses and rely more on massive technological advancement, algorithmic pricing systems have attracted attention and seen wide adoption, with many accompanying benefits and challenges. With the overall aim of increasing organizational profits, algorithmic pricing is becoming a sound option, enabling suppliers to cut costs, allowing better services, improving efficiency and product availability, and enhancing overall consumer experiences. The adoption of algorithms in retail has been pioneered and widely studied in literature across varied fields, including marketing, computer science, engineering, economics, and public policy. What is more pressing today, however, is a comprehensive understanding of this technology and its associated ethical influence on consumers' perceptions and behaviours. Indeed, due to algorithmic ethical concerns, consumers are in some instances reluctant to share their personal data with retailers, which reduces retention and leads to negative consumer outcomes. This, in turn, raises the question of whether firms can still win consumer acceptance of such technologies while minimizing the ethical transgressions that accompany their deployment. Building on the modest body of recent research in marketing and consumer behaviour, the current study advances the literature on algorithmic pricing, pricing ethics, consumers' perceptions, and price fairness.
With its empirical focus, this paper aims to contribute to the literature by applying the distinction between the two common types of algorithmic pricing, dynamic and personalized, while measuring their relative effect on consumers' behavioural outcomes. From a managerial perspective, this research offers significant implications for providing a better human-machine interactive environment (whether online or offline) to improve both businesses' overall performance and consumers' wellbeing. By allowing more transparent pricing systems, businesses can harness well-grounded ethical strategies, which fosters consumers' loyalty and extends their post-purchase behaviour. Thus, by striking the right balance in pricing, whether dynamic or personalized (or both), managers can approach consumers more ethically while treating their expectations and responses as critical considerations. Keywords: algorithmic pricing, dynamic pricing, personalized pricing, price ethicality
Procedia PDF Downloads 92221 Analysis of Surface Hardness, Surface Roughness and near Surface Microstructure of AISI 4140 Steel Worked with Turn-Assisted Deep Cold Rolling Process
Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma, K. Jagannath, Achutha Kini U.
Abstract:
In the present study, response surface methodology has been used to optimize the turn-assisted deep cold rolling process of AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and a central composite design. In developing the predictive model, deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and the ball diameter are the significant factors for surface hardness, while ball diameter and number of tool passes are significant for surface roughness. The predicted surface hardness and surface roughness values and the subsequent verification experiments under the optimal operating conditions confirmed the validity of the predicted model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings is 0.16% for surface hardness and 1.58% for surface roughness. Using the optimal processing parameters, the hardness is improved from 225 to 306 HV, an increase in the near-surface hardness of about 36%, and the surface roughness is improved from 4.84 µm to 0.252 µm, a decrease of about 95%. The depth of compression is found to be more than 300 µm from the microstructure analysis, in correlation with the results obtained from the microhardness measurements. A Taylor Hobson Talysurf tester, a micro Vickers hardness tester, optical microscopy, and an X-ray diffractometer are used to characterize the modified surface layer. Keywords: hardness, response surface methodology, microstructure, central composite design, deep cold rolling, surface roughness
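The second-order regression model behind response surface methodology with a central composite design can be sketched as follows (the design points, coefficient values, and noise level are illustrative, not the study's data):

```python
import numpy as np

# Second-order response-surface model in two coded factors
# (x1 = rolling force, x2 = ball diameter, both coded to [-1, 1]):
#   y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
rng = np.random.default_rng(1)

# Face-centred central composite design: factorial, axial, and centre points.
pts = [(-1, -1), (1, -1), (-1, 1), (1, 1),
       (-1, 0), (1, 0), (0, -1), (0, 1),
       (0, 0), (0, 0), (0, 0)]
X1 = np.array([p[0] for p in pts], float)
X2 = np.array([p[1] for p in pts], float)

# Illustrative "true" hardness surface (HV) used to simulate measurements.
true = np.array([280.0, 18.0, 9.0, -4.0, -6.0, -3.0])
D = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1**2, X2**2])
y = D @ true + rng.normal(0, 0.3, len(pts))          # measured HV with noise

# Least-squares fit recovers the model coefficients from the design runs.
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
print(np.round(beta, 1))
```

In the study, the fitted surface would then be maximized (hardness) or minimized (roughness) over the coded factor ranges to obtain the optimal parameter settings.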
Procedia PDF Downloads 423220 Simo-syl: A Computer-Based Tool to Identify Language Fragilities in Italian Pre-Schoolers
Authors: Marinella Majorano, Rachele Ferrari, Tamara Bastianello
Abstract:
Recent technological advances allow innovative, multimedia screen-based assessment tools to be applied to test children's language and early literacy skills, monitor their growth over the preschool years, and test their readiness for primary school. A computer-based assessment tool offers several advantages over paper-based tools. Firstly, computer-based tools that use games, videos, and audio may be more motivating and engaging for children, especially those with language difficulties. Secondly, computer-based assessments are generally less time-consuming than traditional paper-based assessments: this makes them less demanding for children and provides clinicians and researchers, but also teachers, with the opportunity to test children multiple times over the same school year and thus monitor their language growth more systematically. Finally, while paper-based tools require offline coding, computer-based tools sometimes yield automatically calculated scores, producing less subjective evaluations of the assessed skills and providing immediate feedback. Nonetheless, using computer-based assessment tools to test meta-phonological and language skills in children is not yet common practice in Italy. The present contribution aims to estimate the internal consistency of a computer-based assessment (i.e., the Simo-syl assessment). Sixty-three Italian pre-schoolers aged between 4;10 and 5;9 years were tested at the beginning of the last year of preschool through paper-based standardised tools on their lexical (Peabody Picture Vocabulary Test), morpho-syntactical (Grammar Repetition Test for Children), meta-phonological (Meta-Phonological skills Evaluation test), and phono-articulatory skills (non-word repetition). The same children were tested through the Simo-syl assessment on their phonological and meta-phonological skills (e.g., recognising syllables and vowels and reading syllables and words).
The internal consistency of the computer-based tool was acceptable (Cronbach's alpha = .799). Children's scores obtained in the paper-based assessment and scores obtained in each task of the computer-based assessment were correlated. Significant and positive correlations emerged between all the tasks of the computer-based assessment and the scores obtained in the CMF (r = .287 - .311, p < .05) and in the correct sentences in the RCGB (r = .360 - .481, p < .01); the non-word repetition standardised test correlates significantly with the reading tasks only (r = .329 - .350, p < .05). Further tasks should be included in the current version of Simo-syl for a comprehensive and multi-dimensional approach to assessing children. However, such a tool represents a good opportunity for teachers to identify language-related problems early, even in the school environment. Keywords: assessment, computer-based, early identification, language-related skills
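The internal-consistency statistic reported above, Cronbach's alpha, has a simple closed form; a sketch with invented item scores (not the study's data):

```python
import numpy as np

# Cronbach's alpha for k items scored across n children:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
def cronbach_alpha(scores):
    """scores: n_children x k_items array of item scores."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Illustrative item scores for 6 children on 4 tasks.
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [1, 2, 1, 1],
                   [3, 3, 4, 3],
                   [5, 4, 5, 5]], float)
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Values around .70 or above, such as the .799 reported for Simo-syl, are conventionally taken as acceptable internal consistency.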
Procedia PDF Downloads 185219 Advanced Biosensor Characterization of Phage-Mediated Lysis in Real-Time and under Native Conditions
Authors: Radka Obořilová, Hana Šimečková, Matěj Pastucha, Jan Přibyl, Petr Skládal, Ivana Mašlaňová, Zdeněk Farka
Abstract:
Due to the spread of antimicrobial resistance, alternative approaches to combat superinfections are being sought, both in the field of lysing agents and in methods for studying bacterial lysis. A suitable alternative to antibiotics is phage therapy and enzybiotics, for which it is also necessary to study the mechanism of action. Biosensor-based techniques allow rapid detection of pathogens in real time, verification of sensitivity to commonly used antimicrobial agents, and selection of suitable lysis agents. The detection of lysis takes place on the surface of the biosensor with immobilized bacteria, which has the potential to be used to study biofilms. An example of such a biosensor is surface plasmon resonance (SPR), which records the kinetics of bacterial lysis based on a change in the resonance angle. The bacteria are immobilized on the surface of the SPR chip, and the action of the phage is monitored as mass loss after the typical lytic-cycle delay. Atomic force microscopy (AFM) is a technique for imaging samples on the surface. In contrast to electron microscopy, it has the advantage of real-time imaging under the native conditions of the nutrient medium. In our case, Staphylococcus aureus was lysed using the enzyme lysostaphin and phage P68 from the family Podoviridae at 37 °C. In addition to visualization, AFM was used to study changes in mechanical properties during lysis, which resulted in a reduction of Young's modulus (E) after disruption of the bacterial wall. Changes in E reflect the stiffness of the bacterium. These advanced methods provide deeper insight into bacterial lysis and can help in the fight against bacterial diseases. Keywords: biosensors, atomic force microscopy, surface plasmon resonance, bacterial lysis, Staphylococcus aureus, phage P68
Procedia PDF Downloads 134218 Sprinting Beyond Sexism and Gender Stereotypes: Indian Women Fans' Experiences in the Sports Fandom
Authors: Siddhi Deshpande, Jo Jo Chacko Eapen
Abstract:
Although almost half of India's female population engages in watching sports, their experiences in the sports fandom are concealed by 'traditional masculinity,' leading to potential exclusion and harassment. To explore these experiences in depth, this qualitative study aims to understand what coping strategies Indian women fans employ to sustain their team identification. Employing criterion sampling, participants were screened using the Sports Spectators Identification Scale (SSIS) to assess team identification and a Brief Sexism Questionnaire to confirm participants' experience of sexism, in line with the purpose of the study. The participants were Indian women who had been following a sport for more than eight years, were fluent in English, and were not sports professionals. Ten highly identified fans with gendered experiences were recruited for one-on-one semi-structured, in-depth interviews. The data was analyzed using Interpretative Phenomenological Analysis (IPA) to understand the lived experiences of women fans experiencing sexism and gender stereotypes, revealing superordinate themes of (1) Ontogenesis and Emotional Investment; (2) Gendered Expectations and Sexism; (3) Coping Strategies and Resilience; (4) Identity, Femininity, Empowerment; and (5) Advocacy for Equality and Inclusivity. The findings reflect that Indian women fans experience social exclusion, harassment, sexualization, and commodification in both online and offline fandoms, where they are disproportionately targeted with threats, misogynistic comments, and attraction-based assumptions questioning their 'authenticity' as fans because of their gender. Women fans alternate between proactive strategies of assertiveness, humor, and knowledge demonstration and defensive strategies of selective engagement, self-regulatory censorship, and desensitization to deal with sexism.
In this interplay, the integration of women's 'fan identity' with their self-concept showcases how being a sports fan adds meaning to their lives despite the constant scrutiny in a male-dominated space, reflecting that femininity and sports can coexist. As a result, they find refuge in female fan communities with similar experiences of the fandom and advocate for an equal and inclusive environment where sports stand above gender, and not the other way around. A key practical implication of this research is enabling sports organizations to develop inclusive fan engagement policies that actively encourage female fan participation. This includes sensitizing stadium staff and security personnel, promoting gender-neutral language, and, most importantly, establishing safety protocols to protect female fans from adverse experiences in the fandom. Keywords: coping strategies, female sports fans, femininity, gendered experiences, team identification
Procedia PDF Downloads 60217 Promoting 21st Century Skills through Telecollaborative Learning
Authors: Saliha Ozcan
Abstract:
Technology has become an integral part of our lives, aiding individuals in accessing higher-order competencies such as global awareness, creativity, collaborative problem solving, and self-directed learning. Students need to acquire these competencies, often referred to as 21st century skills, in order to adapt to a fast-changing world. Today, an ever-increasing number of schools are exploring how engagement through telecollaboration can support language learning and promote 21st century skill development in classrooms. However, little is known about how telecollaboration may influence the way students acquire 21st century skills. In this paper, we aim to shed light on the potential implications of telecollaborative practices for the acquisition of 21st century skills. In our context, telecollaboration, which might be carried out in a variety of settings either synchronously or asynchronously, is considered the process of communicating and working together with other people or groups from different locations through online digital tools or offline activities to co-produce a desired work output. The study presented here will describe and analyse the implementation of a telecollaborative project between two high school classes, one in Spain and the other in Sweden. The students in these classes were asked to carry out joint activities, including creating an online platform, aimed at raising awareness of the situation of the Syrian refugees. We conducted a qualitative study in order to explore how language, culture, communication, and technology merge into the co-construction of knowledge, as well as how they support the attainment of the 21st century skills needed for network-mediated communication. To this end, we collected a significant amount of audio-visual data, including video recordings of classroom interaction and external Skype meetings.
By analysing this data, we verify whether the initial pedagogical design and intended objectives of the telecollaborative project coincide with what emerges from the actual implementation of the tasks. Our findings indicate that, as well as planned activities, unplanned classroom interactions may lead to the acquisition of certain 21st century skills, such as collaborative problem solving and self-directed learning. This work is part of a wider project (KONECT, EDU2013-43932-P; Spanish Ministry of Economy and Finance), which aims to explore innovative, cross-competency based teaching that can address the current gaps between today's educational practices and the needs of informed citizens in tomorrow's interconnected, globalised world. Keywords: 21st century skills, telecollaboration, language learning, network-mediated communication
Procedia PDF Downloads 125216 Foreign Language Faculty Mentorship in Vietnam: An Interpretive Qualitative Study
Authors: Hung Tran
Abstract:
This interpretive qualitative study employed three theoretical lenses: Bronfenbrenner's (1979) Ecological System of Human Development, Vygotsky's (1978) Sociocultural Theory of Development, and Knowles's (1970) Adult Learning Theory as the theoretical framework, in connection with the constructivist research paradigm, to investigate positive and negative aspects of the extant English as a Foreign Language (EFL) faculty mentoring programs at four higher education institutions (HEIs) in the Mekong River Delta (MRD) of Vietnam. Four apprentice faculty members (mentees), four experienced faculty members (mentors), and two associate deans (administrators) from these HEIs participated in two tape-recorded individual interviews in the Vietnamese language. Twenty interviews were transcribed verbatim and translated into English with verification. The initial analysis of the data reveals that the mentoring program, which is mandated by Vietnam's Ministry of Education and Training, has been implemented differently at these HEIs due to a lack of officially documented mentoring guidance. Other general themes emerging from the data include essentials of the mentoring program, approaches to the mentoring practice, the mentee-mentor relationship, and lifelong learning beyond the mentoring program. Practically, this study offers stakeholders in the mentoring cycle a description of the benefits and best practices of tertiary EFL mentorship and a suggested mentoring program that is metaphorically depicted as 'a lifebuoy' for its current and potential administrators and mentors to help their mentees survive the first years of teaching. Theoretically, this study contributes to the world's growing knowledge of post-secondary mentorship by enriching the modest literature on Asian tertiary EFL mentorship. Keywords: faculty mentorship, mentees, mentors, administrator, the MRD, Vietnam
Procedia PDF Downloads 125215 Establishment of a Test Bed for Integrated Map of Underground Space and Verification of GPR Exploration Equipment
Authors: Jisong Ryu, Woosik Lee, Yonggu Jang
Abstract:
The paper discusses the process of establishing a reliable test bed for verifying the usability of Ground Penetrating Radar (GPR) exploration equipment based on an integrated underground spatial map in Korea. The aim of this study is to construct a test bed consisting of metal and non-metal pipelines to verify the performance of GPR equipment and improve the accuracy of the underground spatial integrated map. The study involved the design and construction of a test bed for metal and non-metal pipe detection tests. The test bed was built at the SOC Demonstration Research Center (Yeoncheon) of the Korea Institute of Civil Engineering and Building Technology, burying metal and non-metal pipelines up to a depth of 5 m. The test bed was designed for both vehicle-mounted and cart-mounted GPR equipment. The study collected data through the construction of the test bed and the metal and non-metal pipe detection tests, and analyzed the reliability of the GPR detection results by comparing them with basic drawings such as the underground space integrated map. The study contributes to the improvement of GPR equipment performance evaluation and the accuracy of the underground spatial integrated map, which is essential for urban planning and construction. It addressed the question of how to verify the usability of GPR exploration equipment based on an integrated underground spatial map and improve its performance. The study found that the test bed is reliable for verifying the performance of GPR exploration equipment and accurately detecting metal and non-metal pipelines using an integrated underground spatial map. The study concludes that establishing a test bed for verifying the usability of GPR exploration equipment based on an integrated underground spatial map is essential.
The proposed Korean-style test bed can be used for the evaluation of GPR equipment performance and can support the construction of a national non-metal pipeline exploration equipment performance evaluation center in Korea. Keywords: Korea-style GPR testbed, GPR, metal pipe detecting, non-metal pipe detecting
Procedia PDF Downloads 102214 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models
Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin
Abstract:
Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants, e.g., software components, web services, and online resources, and that involve collaboration among a diverse number of participant services from different providers. The complexity of coordinating service interactions reflects how important techniques and approaches are for designing and coordinating the interaction between participant services, so as to ensure the overall goal of a collaboration between participant services is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreography approach, focusing on a declarative approach and advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be especially useful in coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model using the Alloy Analyzer for verification.
The transformation of SBVR into Alloy allows the corresponding coordination of service interactions (service choreography) to be generated automatically, producing an immediate instance of execution that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography. Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR
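As a toy flavour of what a declarative deontic specification checks (purely illustrative, not the SBVR or Alloy syntax used in the paper), obligation and prohibition rules can be evaluated against candidate interaction traces:

```python
# Deontic rules over a service-interaction trace. An obligation requires an
# interaction to occur (here: at some point after its trigger); a prohibition
# forbids it outright. Interaction names below are hypothetical examples.
def obliged_after(trigger, required):
    def check(trace):
        if trigger in trace:
            return required in trace[trace.index(trigger):]
        return True                      # rule vacuously holds without trigger
    return check

def prohibited(interaction):
    return lambda trace: interaction not in trace

rules = [
    obliged_after("placeOrder", "confirmOrder"),   # obligation
    prohibited("shipBeforePayment"),               # prohibition
]

def conforms(trace):
    """A choreography instance is valid iff every deontic rule holds."""
    return all(rule(trace) for rule in rules)

print(conforms(["placeOrder", "pay", "confirmOrder", "ship"]))   # True
print(conforms(["placeOrder", "pay", "ship"]))                   # False
```

A model finder such as the Alloy Analyzer does the converse as well: instead of checking a given trace, it searches for an instance that satisfies all constraints, which is what yields the "immediate instance of execution" mentioned above.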
Procedia PDF Downloads 156213 FE Modelling of Structural Effects of Alkali-Silica Reaction in Reinforced Concrete Beams
Authors: Mehdi Habibagahi, Shami Nejadi, Ata Aminfar
Abstract:
A significant degradation factor that impacts the durability of concrete structures is the alkali-silica reaction (ASR). Engineers are frequently charged with the challenge of conducting a thorough safety assessment of concrete structures that have been impacted by ASR. The alkali-silica reaction has a major influence on the structural capacities of structures. In most cases, the reduction in compressive strength, tensile strength, and modulus of elasticity is expressed as a function of free expansion and crack widths. Predicting the effect of ASR on flexural strength is also relevant. In this paper, a nonlinear three-dimensional (3D) finite-element model was proposed to describe the flexural strength degradation induced by ASR. Initial strains, initial stresses, initial cracks, and deterioration of material characteristics were all considered as ASR factors in this model. The effects of ASR on structural performance were evaluated by focusing on initial flexural stiffness, the force-deformation curve, and load-carrying capacity. Degradation of concrete mechanical properties was correlated with ASR growth using material test data obtained at Tech Lab, UTS, and implemented into the FEM for various expansions. The finite element study provided a better understanding of the ASR-affected RC beam's failure mechanism and capacity reduction as a function of ASR expansion. Furthermore, the decrease of the residual mechanical properties due to ASR is reviewed and used as input data for the FEM model. Finally, analysis techniques and a comparison of the analysis and the experimental results are discussed. Verification is also provided through analyses of reinforced concrete beams with behavior governed by either flexural or shear mechanisms. Keywords: alkali-silica reaction, analysis, assessment, finite element, nonlinear analysis, reinforced concrete
Procedia PDF Downloads 159212 Development of an EEG-Based Real-Time Emotion Recognition System on Edge AI
Authors: James Rigor Camacho, Wansu Lim
Abstract:
Over the last few years, the development of new wearable and processing technologies has accelerated in order to harness physiological data such as electroencephalograms (EEGs) for EEG-based applications. EEG has been demonstrated to be a source of emotion recognition signals with the highest classification accuracy among physiological signals. However, when emotion recognition systems are used for real-time classification, the training unit is frequently left to run offline or in the cloud rather than working locally on the edge. That strategy has hampered research, and the full potential of using an edge AI device has yet to be realized. Edge AI devices are high-performance embedded computers that can process complex algorithms. They can collect, process, and store data on their own, and can run complicated algorithms such as localization, detection, and recognition in real-time applications, making them powerful embedded devices. The NVIDIA Jetson series, specifically the Jetson Nano device, was used in the implementation. The cEEGrid, which is integrated with the open-source brain-computer interface platform (OpenBCI), is used to collect EEG signals. An EEG-based real-time emotion recognition system on Edge AI is proposed in this paper. To perform graphical spectrogram categorization of EEG signals and to predict emotional states based on input data properties, machine learning-based classifiers were used; the EEG signals were analyzed with the K-Nearest Neighbor (KNN) technique, a supervised learning algorithm, until the emotional state was identified. In EEG signal processing, each EEG signal received in real time is translated from the time domain to the frequency domain using the Fast Fourier Transform (FFT), which exposes the frequency bands in each signal. To appropriately capture the variance of each EEG frequency band, the power density, standard deviation, and mean are calculated and employed.
The next stage is to use the selected features to predict emotion in EEG data with the K-Nearest Neighbors (KNN) technique. Arousal and valence datasets are used to train the parameters defined by the KNN technique. Because classification and recognition of specific classes, as well as emotion prediction, are conducted both online and locally on the edge, the KNN technique increased the performance of the emotion recognition system on the NVIDIA Jetson Nano. Finally, this implementation aims to bridge the research gap on cost-effective and efficient real-time emotion recognition using a resource-constrained hardware device such as the NVIDIA Jetson Nano. Deployed at the edge, EEG-based emotion recognition can be employed in applications that rapidly expand its use in research and industry.
Keywords: edge AI device, EEG, emotion recognition system, supervised learning algorithm, sensors
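A minimal KNN classifier of the kind the abstract describes can be written with NumPy alone. The two-dimensional feature layout, the labels, and k = 3 are assumptions for illustration; the paper's actual arousal/valence feature vectors are higher-dimensional:

```python
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training samples (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)   # distance to each sample
    nearest = np.argsort(dists)[:k]               # indices of k closest
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority label

# Toy training set: [band_power, band_std] per sample
train_X = np.array([[0.9, 0.2], [0.8, 0.3], [0.1, 0.9], [0.2, 0.8]])
train_y = np.array(["high_arousal", "high_arousal",
                    "low_arousal", "low_arousal"])
```

With this toy data, `knn_predict(train_X, train_y, np.array([0.85, 0.25]))` returns `"high_arousal"`, since all of the query's nearest neighbors carry that label.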
Procedia PDF Downloads 107
211 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to 1) proactively address vulnerabilities and bugs: formal methods and abstract interpretation techniques identify potential weaknesses early in the development process, reducing reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability: combining static analysis by abstract interpretation, with full context sensitivity and hardware memory awareness, allows a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 and ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
Procedia PDF Downloads 222
210 The Influence of Market Attractiveness and Core Competence on Value Creation Strategy and Competitive Advantage and Its Implication on Business Performance
Authors: Firsan Nova
Abstract:
The average Indonesian watches 5.5 hours of TV a day. With a population of 242 million people and a free-to-air (FTA) TV penetration rate of 56%, that equates to 745 million hours of television watched each day. With such potential, it is no wonder that many companies are now attempting to get into the pay TV market. Research firm Media Partner Asia has forecast in its study that the number of Indonesian pay-television subscribers will climb from 2.4 million in 2012 to 8.7 million by 2020, with penetration scaling up from 7 percent to 21 percent. Key drivers of market growth, the study says, include macro trends built around higher disposable income and a rising middle class, with leading players continuing to invest significantly in sales, distribution, and content. New entrants, in the meantime, will boost overall prospects. This study aims to examine and analyze the effect of market attractiveness and core competence on value creation and competitive advantage, and their impact on business performance in the pay TV industry in Indonesia. The study uses a strategic management science approach with a census method, in which all members of the population serve as the sample. A verification method is used to examine the relationships between variables. The unit of analysis in this research is all Indonesian pay TV business units, totaling 19 business units. The unit of observation is the director and managers of each business unit. Hypothesis testing is performed using Partial Least Squares (PLS). The conclusion of the study shows that market attractiveness affects business performance through value creation and competitive advantage. Appropriate value creation comes from the company's ability to optimize its core competence and exploit market attractiveness. Value creation affects competitive advantage. 
Competitive advantage is determined by the company's ability to create value for customers, and competitive advantage in turn has an impact on business performance.
Keywords: market attractiveness, core competence, value creation, competitive advantage, business performance
Procedia PDF Downloads 349