Search results for: image encryption algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4603

2653 Automatic Approach for Estimating the Protection Elements of Electric Power Plants

Authors: Mahmoud Mohammad Salem Al-Suod, Ushkarenko O. Alexander, Dorogan I. Olga

Abstract:

New algorithms using microprocessor systems have been proposed for protecting the diesel-generator unit in autonomous power systems. The software structure is designed to enhance the control automata of the system, in which every protection module of the diesel-generator encapsulates a finite state machine.

Keywords: diesel-generator unit, protection, state diagram, control system, algorithm, software components

Procedia PDF Downloads 412
2652 Vehicle Speed Estimation Using Image Processing

Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha

Abstract:

In India, the smart city concept is growing day by day, and better traffic management and monitoring systems are an important requirement for smart city development. Road accidents are increasing as more vehicles take to the road, with reckless driving responsible for a large share of them, so an efficient traffic management system is required for all kinds of roads to control traffic speed. The speed limit varies from road to road. Radar systems have been used previously, but their high cost and limited precision have kept them from becoming favored in traffic management. Traffic management faces different types of problems every day, and how to solve them has become a research topic. This paper proposes a computer vision and machine learning-based automated system for multiple vehicle detection, tracking, and speed estimation using image processing. Detecting vehicles and estimating their speed from real-time video is a challenging task, and the objective of this paper is to do so as accurately as possible. A real-time video is first captured, frames are extracted from the video, vehicles are detected in those frames, the detected vehicles are tracked, and finally the speed of the moving vehicles is estimated. The goal of this method is to develop a cost-friendly system that can detect multiple types of vehicles at the same time.
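The speed-estimation step described above (track a vehicle's centroid across frames, then convert pixel displacement to speed) can be sketched as follows; the calibration factor and frame rate are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of the final speed-estimation step: once a tracker
# (e.g. a centroid tracker) yields a vehicle's centroid in consecutive
# frames, speed follows from pixel displacement, a meters-per-pixel
# calibration factor, and the video frame rate.
import math

def estimate_speed_kmh(p1, p2, meters_per_pixel, fps):
    """Speed from two (x, y) centroids observed in consecutive frames."""
    dist_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])  # pixel displacement
    meters_per_frame = dist_px * meters_per_pixel       # real-world distance
    return meters_per_frame * fps * 3.6                 # m/s -> km/h

# A vehicle moving 10 px/frame at 30 fps with a 0.05 m/px calibration:
speed = estimate_speed_kmh((100, 200), (106, 208), 0.05, 30)  # 54.0 km/h
```

In a full pipeline the calibration factor would come from known road geometry (e.g. lane width in pixels), and the displacement would be averaged over several frames to smooth detection jitter.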

Keywords: OpenCV, Haar Cascade classifier, DLIB, YOLOV3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision

Procedia PDF Downloads 75
2651 The Image Redefinition of Urban Destinations: The Case of Madrid and Barcelona

Authors: Montserrat Crespi Vallbona, Marta Domínguez Pérez

Abstract:

Globalization impacts cities, and especially their centers, above all those spaces that are most visible and coveted. The changes involve processes such as touristification, gentrification, and studentification, in addition to shop trendiness. The city becomes a good of exchange rather than a communal good for its inhabitants, and consequently its value is monetized. Several tendencies are therefore analyzed: first, the presence of tourists, the increase in home rentals, and the explosion of tourism-related businesses; second, the return of the middle classes, or gentries, to the center in a changed socio-spatial model that highlights the centers for their culture and opportunities as well as for the value of public space and centrality; third, the interest of students (national and international) in being part of these city centers, as dynamic groups and emerging classes with higher purchasing power and better cultural capital than in the past; and finally, the conversion of old stores into modern ones, where the vintage trend and the renewal of antiquity are the essence. All these transforming processes impact European cities and redefine their image, reinforcing the impression and brand of the urban center as an attractive space for investment even where that brand is more image than substance. These four tendencies have spread in parallel, impacting the centers and transforming them, displacing former residents of these spaces while revitalizing a center that is financed and commercialized in parallel. The cases of Madrid and Barcelona, the spaces where these tendencies are most evident in Spain, serve to illustrate these processes and represent their spearhead. Useful recommendations are presented for urban planners seeking to reconcile communal and commercialized spaces.

Keywords: gentrification, shop trendiness, studentification, touristification

Procedia PDF Downloads 166
2650 Planning a Haemodialysis Process by Minimum Time Control of Hybrid Systems with Sliding Motion

Authors: Radoslaw Pytlak, Damian Suski

Abstract:

The aim of the paper is to provide a computational tool for planning a haemodialysis process. It is shown that optimization methods can be used to obtain the most effective treatment, focused on removing both urea and phosphorus during the process. To achieve that, a four-compartment model of phosphorus kinetics is applied. This kinetics model takes into account a rebound phenomenon that can occur during haemodialysis, resulting in a hybrid model of the process. Furthermore, the vector fields associated with the model equations are such that using the most intuitive objective functions in the planning problem is very likely to lead to solutions that include sliding motions. Therefore, building computational tools for solving the problem of planning a haemodialysis process has required constructing numerical algorithms for solving optimal control problems with hybrid systems. The paper concentrates on minimum time control of hybrid systems, since this control objective is the most suitable for the haemodialysis process considered here. The presented approach to optimal control problems with hybrid systems differs from others in several respects. First, it is assumed that a hybrid system can exhibit sliding modes. Second, the system's motion on the switching surface is described by index-2 differential-algebraic equations, which guarantees accurate tracking of the sliding-motion surface. Third, the gradients of the problem's functionals are evaluated with the help of adjoint equations; the adjoint equations presented in the paper take sliding motion into account and exhibit jump conditions at transition times. Optimality conditions, in the form of a weak maximum principle for optimal control problems with hybrid systems exhibiting sliding modes and with piecewise constant controls, are stated. The presented sensitivity analysis can be used to construct globally convergent algorithms for solving the considered problems.
The paper presents numerical results of solving the haemodialysis planning problem.
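In generic terms (a textbook sketch, not the authors' exact formulation), a minimum time problem for a hybrid system that can slide on a switching surface g(x) = 0 takes the following shape:

```latex
% Minimum time control of a hybrid system (generic sketch):
\min_{u(\cdot)} \; t_f \quad \text{s.t.} \quad
  \dot{x}(t) = f_i\bigl(x(t),u(t)\bigr), \qquad x(0) = x_0, \quad x(t_f) \in X_f,
% with mode i active while the state stays on one side of the switching
% surface g(x) = 0; during sliding, the motion obeys the index-2 DAE
% (Filippov convex combination of the neighbouring vector fields):
  \dot{x}(t) = \bigl(1-\alpha(t)\bigr)\, f_1\bigl(x(t),u(t)\bigr)
             + \alpha(t)\, f_2\bigl(x(t),u(t)\bigr), \qquad
  0 = g\bigl(x(t)\bigr), \quad \alpha(t) \in [0,1].
```

The algebraic constraint 0 = g(x) is what makes the sliding phase an index-2 DAE and what allows accurate tracking of the switching surface, as the abstract notes.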

Keywords: haemodialysis planning process, hybrid systems, optimal control, sliding motion

Procedia PDF Downloads 191
2649 Animated Poetry-Film: Poetry in Action

Authors: Linette van der Merwe

Abstract:

It is known that visual artists, performing artists, and literary artists have inspired each other since time immemorial. The enduring, symbiotic relationship between the various art genres is evident where words, colours, lines, and sounds act as metaphors, a physical separation of the transcendental reality of art. Simonides of Keos (c. 556-468 BC) confirmed this, stating that a poem is a talking picture or, in a more modern expression, that a picture is worth a thousand words. It can be seen as an ancient relationship, originating from the epigram (tombstone or artefact inscriptions), the carmen figuratum (figure poem), and the ekphrasis (a description, in the form of a poem, of a work of art). Artists including Michelangelo, Leonardo da Vinci, and Goethe wrote poems and songs, and Goya, Degas, and Picasso are famous both for their works of art and for trying their hands at poetry. Afrikaans writers whose fine art is often published together with their writing, as in the case of Andries Bezuidenhout, Breyten Breytenbach, Sheila Cussons, Hennie Meyer, Carina Stander, and Johan van Wyk, among others, are not a strange phenomenon either. Recasting one art form in another is a form of translation, transposition, contemplation, and discovery of artistic impressions, showing parallel interpretations rather than physical comparison. It is especially about the harmony that exists between the different art genres: a poem that describes a painting, or a visual text that portrays a poem, becomes a translation, interpretation, and rediscovery of the verbal text, moving from the word text to the image text. Poetry-film, as a form of such translation of the word text into an image text, can be considered a hybrid, transdisciplinary art form that connects poetry and film. Poetry-film is regarded as an intertwined entity of word, sound, and visual image. It is an attempt to transpose and transform a poem into a new artwork that makes the poem more accessible to people who are not necessarily open to the written word, and it will, in effect, attract a larger audience to a genre that usually has a limited market. Poetry-film is considered a creative expression of an inverted ekphrastic inspiration: a visual description, interpretation, and expression of a poem. Research also emphasises that animated poetry-film is not yet widely recognised as a genre in its own right and is thus severely under-theorised. This paper focuses on Afrikaans animated poetry-films as a multimodal transposition of a poem text to an animated poetry-film, with specific reference to the animated poetry-films in Filmverse I (2014) and Filmverse II (2016).

Keywords: poetry film, animated poetry film, poetic metaphor, conceptual metaphor, monomodal metaphor, multimodal metaphor, semiotic metaphor, multimodality, metaphor analysis, target domain, source domain

Procedia PDF Downloads 58
2648 On Lie-Central Derivations and Almost Inner Lie-Derivations of Leibniz Algebras

Authors: Natalia Pacheco Rego

Abstract:

The Liezation functor is a map from the category of Leibniz algebras to the category of Lie algebras, which assigns to a Leibniz algebra the Lie algebra given by its quotient by the ideal spanned by the square elements of the Leibniz algebra. This functor is left adjoint to the inclusion functor that regards a Lie algebra as a Leibniz algebra. This environment fits the framework of central extensions and commutators in semi-abelian categories with respect to a Birkhoff subcategory, where classical, or absolute, notions are those relative to the abelianization functor. Classical properties of Leibniz algebras (properties relative to the abelianization functor) have been adapted to the relative setting (with respect to the Liezation functor); in general, absolute properties have corresponding relative ones, but not all absolute properties immediately hold in the relative case, so new requirements are needed. Following this line of research, an analysis of central derivations of Leibniz algebras relative to the Liezation functor, called Lie-central derivations, was conducted, and a characterization of Lie-stem Leibniz algebras by their Lie-central derivations was obtained. In this paper, we present an overview of these results and analyze some new properties concerning Lie-central derivations and almost inner Lie-derivations. Namely, a Leibniz algebra is a vector space equipped with a bilinear bracket operation satisfying the Leibniz identity. We define the Lie-bracket by [x, y]_Lie = [x, y] + [y, x], for all x, y. The Lie-center of a Leibniz algebra is the two-sided ideal of elements that annihilate all elements of the Leibniz algebra under the Lie-bracket. A Lie-derivation is a linear map which acts as a derivation with respect to the Lie-bracket. Usual derivations are obviously Lie-derivations, but the converse is not true in general. A Lie-derivation is called a Lie-central derivation if its image is contained in the Lie-center. A Lie-derivation is called an almost inner Lie-derivation if the image of an element x is contained in the Lie-commutator of x and the Leibniz algebra. The main results presented here concern the conditions under which Lie-central derivations and almost inner Lie-derivations coincide.
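For reference, the definitions quoted above can be written out as follows (left Leibniz convention; the choice of convention is our assumption, not stated in the abstract):

```latex
% Leibniz identity (left convention):
[x,[y,z]] = [[x,y],z] + [y,[x,z]]
% Lie-bracket induced on a Leibniz algebra q:
[x,y]_{\mathrm{Lie}} = [x,y] + [y,x]
% Lie-center:
Z_{\mathrm{Lie}}(\mathfrak{q}) = \{\, z \in \mathfrak{q} \mid
  [x,z]_{\mathrm{Lie}} = 0 \ \text{for all}\ x \in \mathfrak{q} \,\}
% A Lie-derivation d, and the Lie-central condition on its image:
d\bigl([x,y]_{\mathrm{Lie}}\bigr) = [d(x),y]_{\mathrm{Lie}} + [x,d(y)]_{\mathrm{Lie}},
\qquad d(\mathfrak{q}) \subseteq Z_{\mathrm{Lie}}(\mathfrak{q}).
```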

Keywords: almost inner Lie-derivation, Lie-center, Lie-central derivation, Lie-derivation

Procedia PDF Downloads 133
2647 Ant System with Acoustic Communication

Authors: Saad Bougrine, Salma Ouchraa, Belaid Ahiod, Abdelhakim Ameur El Imrani

Abstract:

Ant colony optimization (ACO) is an algorithmic framework inspired by the foraging behaviour of ant colonies. ACO algorithms use chemical communication, represented by pheromone trails, to build good solutions. However, real ants involve several communication channels to interact. This paper therefore introduces acoustic communication between ants while they are foraging. This process allows a fine, local exploration of the search space and enables the best-found solution to be improved.
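A minimal sketch of the pheromone machinery referred to above (the chemical channel); the acoustic channel the authors propose would act as an additional, local information exchange and is not modeled here. Parameter values are illustrative.

```python
# Standard ACO building blocks for a TSP-like problem: probabilistic edge
# selection biased by pheromone and distance, then evaporation + deposit.
import random

def choose_next(city, unvisited, tau, dist, alpha=1.0, beta=2.0):
    """Pick the next city with probability ~ pheromone^alpha * (1/d)^beta."""
    weights = [(tau[(city, j)] ** alpha) * ((1.0 / dist[(city, j)]) ** beta)
               for j in unvisited]
    return random.choices(unvisited, weights=weights, k=1)[0]

def evaporate_and_deposit(tau, tours, rho=0.5, q=1.0):
    """Standard update: evaporate all trails, then deposit q/L on tour edges."""
    for edge in tau:
        tau[edge] *= (1.0 - rho)
    for tour, length in tours:
        for a, b in zip(tour, tour[1:] + tour[:1]):  # close the cycle
            tau[(a, b)] += q / length
            tau[(b, a)] += q / length
```

Short, frequently reinforced edges accumulate pheromone and attract subsequent ants; this is the positive-feedback loop the acoustic channel would complement.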

Keywords: acoustic communication, ant colony optimization, local search, traveling salesman problem

Procedia PDF Downloads 581
2646 Optimal Concentration of Fluorescent Nanodiamonds in Aqueous Media for Bioimaging and Thermometry Applications

Authors: Francisco Pedroza-Montero, Jesús Naín Pedroza-Montero, Diego Soto-Puebla, Osiris Alvarez-Bajo, Beatriz Castaneda, Sofía Navarro-Espinoza, Martín Pedroza-Montero

Abstract:

Nanodiamonds have been widely studied for their physical properties, including chemical inertness, biocompatibility, optical transparency from the ultraviolet to the infrared region, high thermal conductivity, and mechanical strength. In this work, we systematically studied how the fluorescence spectrum of nanodiamonds quenches as a function of concentration in aqueous solutions ranging from 0.1 to 10 mg/mL. Our results demonstrate non-linear fluorescence quenching with increasing concentration for both NV zero-phonon lines; the 5 mg/mL concentration shows the maximum fluorescence emission. This behaviour is explained theoretically as an electronic recombination process that modulates the intensity of the NV centres. Finally, to gain more insight, the FRET methodology is used to determine the fluorescence efficiency in terms of the fluorophores' separation distance; the concentration level is simulated such that a small distance between nanodiamonds corresponds to a highly concentrated system, whereas a large distance corresponds to a low-concentration one. Although the 5 mg/mL concentration shows the maximum intensity, our main interest is the 0.5 mg/mL concentration, for which our studies demonstrate optimal human cell viability (99%). This concentration has the feature of being as biocompatible as water, giving the possibility of internalizing nanodiamonds in cells without harming the living medium. Thus, not only can we track nanodiamonds on the surface of or inside the cell with excellent precision owing to their fluorescence intensity, but we can also perform thermometry tests, transforming a fluorescence contrast image into a temperature contrast image.
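The distance-for-concentration picture sketched above rests on the standard Förster (FRET) efficiency formula; the Förster radius used below is a placeholder value, not one reported in the work.

```python
# Förster resonance energy transfer efficiency as a function of the
# donor-acceptor separation r and the Förster radius r0:
#   E = 1 / (1 + (r / r0)^6)
# Small r (a "high concentration" in the abstract's analogy) gives strong
# transfer and hence stronger quenching; large r gives weak transfer.
def fret_efficiency(r, r0):
    """Transfer efficiency for separation r and Förster radius r0."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# At r = r0 the efficiency is exactly 50% by construction:
e_half = fret_efficiency(5.0, 5.0)  # 0.5
```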

Keywords: nanodiamonds, fluorescence spectroscopy, concentration, bioimaging, thermometry

Procedia PDF Downloads 401
2645 The Three-dimensional Response of Mussel Plaque Anchoring to Wet Substrates under Directional Tensions

Authors: Yingwei Hou, Tao Liu, Yong Pang

Abstract:

The paper explores the three-dimensional deformation of mussel plaques anchored to wet polydimethylsiloxane (PDMS) substrates under tension applied at different angles. Mussel plaques, natural adhesive structures, have attracted significant attention for their remarkable adhesion properties, and understanding their behaviour under mechanical stress, particularly in a three-dimensional context, holds immense relevance for biomimetic material design and bio-inspired adhesive development. This study employed a novel approach to investigate the 3D deformation of PDMS substrates anchored by mussel plaques subjected to controlled tension. Utilizing our customized stereo digital image correlation technique and mechanical analyses, we found that the distributions of displacement and resultant force on the substrate became concentrated under the plaque. Adhesion and suction mechanisms were analyzed for the mussel plaque-substrate system under tension until detachment. The experimental findings were compared with a model developed using finite element analysis, and the results provide new insights into the mussels' attachment mechanism. This research not only contributes to the fundamental understanding of biological adhesion but also holds promising implications for the design of innovative adhesive materials, with applications in fields such as medical adhesives, underwater technologies, and industrial bonding. The comprehensive exploration of mussel plaque behaviour in three dimensions is important for advances in biomimicry and materials science, fostering the development of adhesives that emulate nature's efficiency.

Keywords: adhesion mechanism, mytilus edulis, mussel plaque, stereo digital image correlation

Procedia PDF Downloads 50
2644 Analysis of Feminist Translation in Subtitling from Arabic into English: A Case Study

Authors: Ghada Ahmed

Abstract:

Feminist translation is one of the strategies adopted in translation studies when gendered content is rendered from one language to another, and the strategy has been examined in previous studies of written texts. This research, however, addresses the practice of feminist translation in audiovisual texts, which involve the screen, dialogue, image, and other visual aspects. The objectives of this study are to examine feminist translation and its adaptation in subtitling from Arabic into English. It addresses the connections between gender and translation as one domain, and feminist translation practices, with particular consideration of feminist translation strategies in English subtitles. It examines the visibility of the translator throughout the process, assuming that feminist translation is a product directed by the translator's feminist position, culture, and ideology as a means of helping unshadow women. It also discusses how subtitling constraints impact feminist translation and how the image, which has narrative value, can be integrated into the content of the English subtitles. The reasons for conducting this research are to study language sexism in English and to look into Arabic-into-English gendered content, taking into consideration the Arabic cultural concepts that may lose their connotations when translated into English. This research also analyses the image in an audiovisual text and its contribution to the written dialogue in subtitling. Thus, this research attempts to answer the following questions: To what extent is there a form of affinity between gendered content and translation? Is feminist translation an act of merely working on a feminist text, or of feminising the language of any text by incorporating the translator's ideology? How can feminist translation practices be applied to an audiovisual text? How likely is it that feminist translation can be adapted by looking into visual components as well as subtitling constraints? Moreover, the research draws on the fields of gender and translation, feminist translation, language sexism, and media studies, and addresses the gap in the literature related to feminist translation practice in visual texts. The film "Speed Sisters" was chosen as the case study, and its English subtitles are analysed. The film is a documentary produced in 2015 and directed by Amber Fares. It is about five Palestinian women who try to break stereotypes about women and who have taken their passion for car racing forward to become the first all-women car-racing team in the Middle East. It tackles the issue of gender in both content and language, and this is reflected in the translation. As the research topic spans several semiotic channels, the theoretical approach varies and combines translation studies, audiovisual translation, gender studies, and media studies, each of which contributes to understanding a specific field of the research; the results are eventually integrated to achieve the intended objectives in a way that demonstrates the rendering of gendered content from one language into another in one of the audiovisual translation modes.

Keywords: audiovisual translation, feminist translation, films gendered content, subtitling conventions and constraints

Procedia PDF Downloads 292
2643 Advanced Stability Criterion for Time-Delayed Systems of Neutral Type and Its Application

Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon

Abstract:

This paper investigates the stability problem for linear systems of neutral type with time-varying delay. By constructing various Lyapunov-Krasovskii functionals and utilizing some mathematical techniques, sufficient stability conditions for the systems are established in terms of linear matrix inequalities (LMIs), which can be easily solved by various effective optimization algorithms. Finally, some illustrative examples are given to show the effectiveness of the proposed criterion.
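For reference, a linear neutral-type system with time-varying delays is typically written in the following generic form (the paper's exact model may include additional terms):

```latex
% Neutral-type system: the delayed state derivative appears on the left.
\dot{x}(t) - C\,\dot{x}\bigl(t - \tau(t)\bigr)
  = A\,x(t) + A_1\,x\bigl(t - h(t)\bigr),
\qquad x(s) = \phi(s), \quad s \in [-r, 0],
% where h(t) and tau(t) are the (bounded) time-varying delays and
% r bounds both of them.
```

The Lyapunov-Krasovskii approach then seeks matrices making a functional V(x_t) positive definite with negative-definite derivative along trajectories, which reduces to the LMI feasibility problem mentioned in the abstract.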

Keywords: neutral systems, time-delay, stability, Lyapunov method, LMI

Procedia PDF Downloads 343
2642 Study on the Thermal Mixing of Steam and Coolant in the Hybrid Safety Injection Tank

Authors: Sung Uk Ryu, Byoung Gook Jeon, Sung-Jae Yi, Dong-Jin Euh

Abstract:

In passive safety injection systems of nuclear power plants, such as the Core Makeup Tank (CMT) and the Hybrid Safety Injection Tank, various thermal-hydraulic phenomena occur, including the direct contact condensation of steam and the thermal stratification of the coolant. These phenomena are closely related to the performance of the system: depending on the condensation rate of the steam injected into the tank, the coolant injection and pressure-equalizing timings of the tank are determined. The steam injected into the tank from the upper nozzle penetrates the coolant and induces direct contact condensation. In the present study, the direct contact condensation of steam and the thermal mixing between the steam and the coolant were examined using the Particle Image Velocimetry (PIV) technique. In particular, by varying the size of the nozzle from which the steam is injected, the influence of the steam injection velocity on thermal mixing with the coolant and on condensation was examined, along with the influence of condensation on the pressure variation inside the tank. Even though the amounts of steam injected were the same for the three nozzle sizes, it was found that the rate of pressure rise becomes lower as the steam injection area decreases. Also, as the steam injection area increases, the zone within which the coolant temperature changes becomes thinner, and thereby the amount of steam condensed by direct contact condensation also decreases. The results of the present study can be utilized for the detailed design of a passive safety injection system, as well as for modeling the direct contact condensation triggered by the penetration of the steam jet into the coolant.

Keywords: passive safety injection systems, steam penetration, direct contact condensation, particle image velocimetry

Procedia PDF Downloads 389
2641 A Generalized Weighted Loss for Support Vector Classification and Multilayer Perceptron

Authors: Filippo Portera

Abstract:

Standard algorithms usually employ a loss in which each error is the mere absolute difference between the true value and the prediction, in the case of a regression task. Here, we present several error-weighting schemes that generalize this consolidated routine. We study both a binary classification model based on Support Vector Classification and a regression network based on a Multilayer Perceptron. Results show that the error is never worse than with the standard procedure and is often better.
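A minimal sketch of such an error-weighting scheme for the regression case; the specific weighting functions studied in the paper are not reproduced here, only the general shape of a weighted loss that reduces to the standard one for uniform weights.

```python
# Weighted mean squared error: each sample's error contributes according
# to a per-sample weight. With weights=None this recovers the standard
# (unweighted) mean squared error.
def weighted_squared_loss(y_true, y_pred, weights=None):
    """Weighted MSE over paired sequences of targets and predictions."""
    if weights is None:
        weights = [1.0] * len(y_true)
    num = sum(w * (t - p) ** 2 for w, t, p in zip(weights, y_true, y_pred))
    return num / sum(weights)

# Uniform weights give the ordinary MSE; zeroing a sample removes it:
standard = weighted_squared_loss([1.0, 2.0], [0.0, 0.0])            # 2.5
weighted = weighted_squared_loss([1.0, 2.0], [0.0, 0.0], [1.0, 0.0])  # 1.0
```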

Keywords: loss, binary-classification, MLP, weights, regression

Procedia PDF Downloads 87
2640 Early Prediction of Diseases in a Cow for Cattle Industry

Authors: Ghufran Ahmed, Muhammad Osama Siddiqui, Shahbaz Siddiqui, Rauf Ahmad Shams Malick, Faisal Khan, Mubashir Khan

Abstract:

In this paper, a machine learning-based approach for the early prediction of diseases in cows is proposed, in which different machine learning algorithms are applied to extract useful patterns from the available dataset. Technology has changed today's world in every aspect of life, and advanced technologies have likewise been developed in livestock and dairy farming to monitor dairy cows in various respects. Dairy cattle monitoring is crucial, as it plays a significant role in milk production around the globe. Moreover, it has become necessary for farmers to adopt the latest early-prediction technologies as food demand increases with population growth, which highlights the importance of state-of-the-art technologies in analyzing dairy cows' activities. It is not easy to track the activities of a large number of cows on a farm, so the system makes this convenient for farmers by providing all the solutions under one roof. The cattle industry's productivity is boosted because any disease on a cattle farm is diagnosed early, on the basis of the machine learning output, and hence treated early. The learning models, already trained, interpret the data collected in a centralized system. In essence, different algorithms are run on the collected dataset to analyze milk quality and to track cows' health, location, and safety. The learning algorithms draw patterns from the data, which makes it easier for farmers to study any animal's behavioral changes. With machine learning algorithms and the Internet of Things, accurate tracking of animals is possible as the rate of error is minimized, and as a result, milk productivity is increased. IoT with ML capability has given a new phase to the cattle farming industry by increasing yield in a cost-effective and time-saving manner.

Keywords: IoT, machine learning, health care, dairy cows

Procedia PDF Downloads 63
2639 Improvement of Microscopic Detection of Acid-Fast Bacilli for Tuberculosis by Artificial Intelligence-Assisted Microscopic Platform and Medical Image Recognition System

Authors: Hsiao-Chuan Huang, King-Lung Kuo, Mei-Hsin Lo, Hsiao-Yun Chou, Yusen Lin

Abstract:

The most robust and economical method for the laboratory diagnosis of TB is to identify acid-fast bacilli (AFB) under acid-fast staining, despite its disadvantages of low sensitivity and labor intensiveness. Though digital pathology has become popular in medicine, an automated microscopic system for microbiology is still not available. A new AI-assisted automated microscopic system, consisting of a microscopic scanner and a recognition program powered by big data and deep learning, may significantly increase the sensitivity of TB smear microscopy. The objective was therefore to evaluate such an automated system for the identification of AFB. A total of 5,930 smears were enrolled in this study. An intelligent microscope system (TB-Scan, Wellgen Medical, Taiwan) was used for microscopic image scanning and AFB detection. 272 AFB smears were used for transfer learning to increase the accuracy, and referee medical technicians served as the gold standard in cases of result discrepancy. For a total of 1,726 AFB smears, the automated system's accuracy, sensitivity, and specificity were 95.6% (1,650/1,726), 87.7% (57/65), and 95.9% (1,593/1,661), respectively. Compared to culture, the sensitivity of human technicians was only 33.8% (38/142), whereas the automated system achieved 74.6% (106/142), which is significantly higher; this is the first such automated microscope system for TB smear testing in a controlled trial. This automated system could achieve higher TB smear sensitivity and laboratory efficiency and may complement molecular methods (e.g., GeneXpert) to reduce the total cost of TB control. Furthermore, the system is capable of remote access over the internet and can be deployed in areas with limited medical resources.
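The reported figures follow from the standard confusion-matrix definitions; the counts below are back-calculated from the fractions given in the abstract (65 positive and 1,661 negative smears among the 1,726 evaluated).

```python
# Sensitivity, specificity and accuracy from confusion-matrix counts:
#   sensitivity = TP / (TP + FN), specificity = TN / (TN + FP),
#   accuracy    = (TP + TN) / total.
def metrics(tp, fn, tn, fp):
    """Return (sensitivity, specificity, accuracy) for the given counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# 57/65 true positives and 1,593/1,661 true negatives, as in the abstract:
sens, spec, acc = metrics(tp=57, fn=8, tn=1593, fp=68)
# sens ~ 0.877 (87.7%), spec ~ 0.959 (95.9%), acc ~ 0.956 (95.6%)
```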

Keywords: TB smears, automated microscope, artificial intelligence, medical imaging

Procedia PDF Downloads 219
2638 Personalization of Context Information Retrieval Model via User Search Behaviours for Ranking Document Relevance

Authors: Kehinde Agbele, Longe Olumide, Daniel Ekong, Dele Seluwa, Akintoye Onamade

Abstract:

One major problem of most existing information retrieval systems (IRS) is that they provide uniform access and retrieval results to individual users, based solely on the query terms the user issues to the system. When using an IRS, users often present search queries made of ad-hoc keywords, and it is then up to the IRS to obtain a precise representation of the user's information need and of the context of that information. Meanwhile, the volume and range of Internet documents are growing exponentially, which makes it difficult for a user to obtain information that precisely matches the user's interest. Diverse combination techniques are used to achieve this goal. The difficulty is due, firstly, to the fact that users often do not present queries to an IRS that optimally represent the information they want, and secondly, to the measure of a document's relevance being highly subjective across diverse users. In this paper, we address the problem by investigating the optimization of an IRS for individual information needs in order of relevance, through the development of algorithms that optimize the ranking of documents retrieved from the IRS. The problem is addressed with a two-fold approach in order to retrieve domain-specific documents. Firstly, the context of the information need is modeled: the context of a query determines the relevance of retrieved information using personalization and context-awareness, so executing the same query in diverse contexts often leads to diverse result rankings based on user preferences. Secondly, the relevant context aspects are incorporated in a way that supports the knowledge domain representing the user's interests. The use of evolutionary algorithms is incorporated to improve the effectiveness of the IRS. A context-based information retrieval system that learns individual needs from user-provided relevance feedback is developed, and its retrieval effectiveness is evaluated using precision and recall metrics. The results demonstrate how attributes of user interaction behavior can be used to improve IR effectiveness.
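The evaluation metrics named above are the standard set-based precision and recall; a minimal sketch of their computation for a single query (the evaluation protocol of the paper itself is not reproduced):

```python
# Precision = |retrieved AND relevant| / |retrieved|
# Recall    = |retrieved AND relevant| / |relevant|
def precision_recall(retrieved, relevant):
    """Set-based precision and recall for one query."""
    hits = len(set(retrieved) & set(relevant))
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Four documents retrieved, three relevant overall, two of them found:
p, r = precision_recall([1, 2, 3, 4], {2, 4, 5})  # p = 0.5, r = 2/3
```

Relevance feedback, as used in the paper's system, updates the query or ranking model so that subsequent retrievals raise these two numbers for the individual user.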

Keywords: context, document relevance, information retrieval, personalization, user search behaviors

Procedia PDF Downloads 456
2637 Brain-Computer Interfaces That Use Electroencephalography

Authors: Arda Ozkurt, Ozlem Bozkurt

Abstract:

Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since its invention by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measurement methods. Although it has a low spatial resolution (it can only detect when a group of neurons fires at the same time), it is non-invasive and therefore easy to use without posing any risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded and then used to accomplish the intended task. EEG recordings capture, among other signals, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. Because the method is non-invasive, however, several noise sources may affect the reliability of the EEG signals. For instance, noise from the EEG equipment and leads, as well as signals originating from the subject (such as heart activity or muscle movements), affect the signals detected by the electrodes. New techniques have been developed to differentiate these artifacts from the intended signals. Furthermore, an EEG device alone is not enough to analyze brain data for BCI applications. Because the EEG signal is highly complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful, useful information that neuroscientists can use to design BCI devices.
Even though invasive BCIs are needed for neurological applications requiring highly precise data, non-invasive BCIs such as EEG are used in many cases to assist disabled people or simply to ease everyday life by helping with basic tasks. For example, EEG can detect an impending seizure in epilepsy patients, and a BCI device can then help prevent it. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed.
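As a purely illustrative sketch of the signal analysis the abstract alludes to (not the authors' method), a common first step in EEG-based BCIs is extracting band-power features; the example below estimates alpha-band power from a synthetic signal with a plain FFT periodogram. The sampling rate, band edges, and signal are assumptions invented for the demonstration.

```python
import numpy as np

def bandpower(signal, fs, band):
    """Average power of `signal` within `band` (Hz), estimated from
    a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[mask], freqs[mask])

# Synthetic 10 Hz "alpha" oscillation sampled at 250 Hz with noise
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))

alpha = bandpower(eeg, fs, (8, 13))
beta = bandpower(eeg, fs, (13, 30))
print(alpha > beta)  # the alpha band dominates this synthetic signal
```

In a real BCI, vectors of such band powers (per channel and band) would feed the machine learning classifier mentioned in the abstract.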

Keywords: BCI, EEG, non-invasive, spatial resolution

Procedia PDF Downloads 69
2636 Enhanced Acquisition Time of a Quantum Holography Scheme within a Nonlinear Interferometer

Authors: Sergio Tovar-Pérez, Sebastian Töpfer, Markus Gräfe

Abstract:

This work proposes a technique that decreases the detection acquisition time of quantum holography schemes to one-third, opening the possibility of imaging moving objects. Since its invention, quantum holography with undetected photons has gained interest in the scientific community, mainly due to its ability to tailor the detected wavelengths according to the needs of the implementation. While this wavelength flexibility grants the scheme a wide range of possible applications, an important matter remained to be addressed. Since the scheme uses digital phase-shifting techniques to retrieve the object information from the interference pattern, it is necessary to acquire a set of at least four images of the interference pattern with well-defined phase steps to recover the full object information. Hence, the imaging method requires longer acquisition times to produce well-resolved images, and as a consequence, the measurement of moving objects remains out of the scheme's reach. This work presents the use and implementation of a spatial light modulator together with a digital holographic technique called quasi-parallel phase shifting. This technique uses the spatial light modulator to build a structured phase image consisting of a chessboard pattern that contains the different phase steps for digitally calculating the object information. By reducing the number of needed frames, the acquisition time decreases by a significant factor. This technique opens the door to implementing the scheme for moving objects; in particular, the application of the scheme to imaging live specimens comes one step closer.
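To illustrate the idea behind quasi-parallel phase shifting described above, the sketch below tiles the four canonical phase steps (0, π/2, π, 3π/2) into repeating 2×2 macro-pixels such as a spatial light modulator could display. The pattern size and step values are illustrative assumptions, not the authors' exact parameters.

```python
import numpy as np

def quasi_parallel_phase_mask(rows, cols):
    """Tile the four phase steps 0, pi/2, pi, 3*pi/2 into repeating
    2x2 macro-pixels, so a single camera frame samples all steps."""
    macro = np.array([[0.0, np.pi / 2],
                      [np.pi, 3 * np.pi / 2]])
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(macro, reps)[:rows, :cols]

mask = quasi_parallel_phase_mask(4, 6)
# Each 2x2 cell holds one copy of every step needed for the standard
# four-step formula phi = arctan2(I(3pi/2) - I(pi/2), I(0) - I(pi)).
print(mask[:2, :2])
```

In a real implementation, the object phase at each macro-pixel would then be recovered from the four neighbouring intensity samples, trading spatial resolution for acquisition speed.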

Keywords: quasi-parallel phase shifting, quantum imaging, quantum holography, quantum metrology

Procedia PDF Downloads 106
2635 AI for Efficient Geothermal Exploration and Utilization

Authors: Velimir "monty" Vesselinov, Trais Kliplhuis, Hope Jasperson

Abstract:

Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, and geology. Machine learning algorithms can identify subtle patterns and relationships within these data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a science-informed machine learning (SIML) technology has been developed. SIML methods differ from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity, etc.). Traditional ML relies on deep and wide neural networks (NNs) built from simple algebraic mappings to represent complex processes. In contrast, SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have a physical meaning and satisfy physical laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. Our software prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphical form.

Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal

Procedia PDF Downloads 43
2634 Leveraging Automated and Connected Vehicles with Deep Learning for Smart Transportation Network Optimization

Authors: Taha Benarbia

Abstract:

The advent of automated and connected vehicles has revolutionized the transportation industry, presenting new opportunities for enhancing the efficiency, safety, and sustainability of our transportation networks. This paper explores the integration of automated and connected vehicles into a smart transportation framework, leveraging the power of deep learning techniques to optimize overall network performance. The first aspect addressed is the deployment of automated vehicles (AVs) within the transportation system. AVs offer numerous advantages, such as reduced congestion, improved fuel efficiency, and increased safety through advanced sensing and decision-making capabilities. The paper delves into the technical aspects of AVs, including their perception, planning, and control systems, highlighting the role of deep learning algorithms in enabling intelligent and reliable AV operations. Furthermore, the paper investigates the potential of connected vehicles (CVs) in creating a seamless communication network between vehicles, infrastructure, and traffic management systems. By harnessing real-time data exchange, CVs enable proactive traffic management, adaptive signal control, and effective route planning. Deep learning techniques play a pivotal role in extracting meaningful insights from the vast amount of data generated by CVs, empowering transportation authorities to make informed decisions for optimizing network performance. The integration of deep learning with automated and connected vehicles paves the way for advanced transportation network optimization: deep learning algorithms can analyze complex transportation data, including traffic patterns, demand forecasting, and dynamic congestion scenarios, to optimize routing, reduce travel times, and enhance overall system efficiency.
The paper presents case studies and simulations demonstrating the effectiveness of deep learning-based approaches in achieving significant improvements in network performance metrics.

Keywords: automated vehicles, connected vehicles, deep learning, smart transportation network

Procedia PDF Downloads 71
2633 Effect of Al on Glancing Angle Deposition Synthesized In₂O₃ Nanocolumn for Photodetector Application

Authors: Chitralekha Ngangbam, Aniruddha Mondal, Naorem Khelchand Singh

Abstract:

An aluminium (Al) doped In₂O₃ (indium oxide) nanocolumn array was synthesized by the glancing angle deposition (GLAD) technique on a Si (n-type) substrate for photodetector application. The sample was characterized by scanning electron microscopy (SEM). The average diameter of the nanocolumns, calculated from the top-view SEM image, was found to be ∼80 nm. The length of the nanocolumns (~500 nm) was calculated from the cross-sectional SEM image, which shows that the nanocolumns are perpendicular to the substrate. EDX analysis confirmed the presence of Al (aluminium), In (indium), and O (oxygen) in the samples. The XRD patterns show the characteristic (222) and (622) reflections of the Al-doped In₂O₃ nanocolumns. Three peaks were observed in the PL analysis of the Al-doped In₂O₃ nanocolumns, at 365 nm, 415 nm, and 435 nm. The PL emission peak at 365 nm can be attributed to the near-band-gap transition of In₂O₃, whereas the peaks at 415 nm and 435 nm can be attributed to trap-state emissions due to oxygen vacancies and oxygen-indium vacancy centres in the Al-doped In₂O₃ nanocolumns. The current-voltage (I-V) characteristics of the Al-doped In₂O₃ nanocolumn detector were measured through the Au Schottky contact. The devices were then examined under halogen light (20 W) illumination for photocurrent measurement. The Al-doped In₂O₃ nanocolumn optical detector showed high conductivity and a low turn-on voltage of 0.69 V under white-light illumination. A maximum photoresponsivity of 82 A/W at 380 nm was observed for the device. The device shows a high internal gain of ~267 in the UV region (380 nm) and ∼127 in the visible region (760 nm). The rise and fall times at 650 nm are 0.15 s and 0.16 s, respectively, which makes the device suitable for fast-response detection.

Keywords: glancing angle deposition, nanocolumn, semiconductor, photodetector, indium oxide

Procedia PDF Downloads 175
2632 Methodology and Credibility of Unmanned Aerial Vehicle-Based Cadastral Mapping

Authors: Ajibola Isola, Shattri Mansor, Ojogbane Sani, Olugbemi Tope

Abstract:

The cadastral map is the rationale behind city management, planning, and development. For years, cadastral maps have been produced by ground and photogrammetry platforms. Recent evolution in photogrammetry and remote sensing sensors has spurred the use of unmanned aerial vehicle (UAV) systems for cadastral mapping. Despite the time savings and multi-dimensional cost-effectiveness of the UAV platform, issues related to cadastral map accuracy hinder the wide applicability of UAV cadastral mapping. This study aims to present an approach for generating UAV cadastral maps and assessing their credibility. Different sets of red, green, and blue (RGB) photos were obtained from a Tarot 680 hexacopter UAV flown over the Universiti Putra Malaysia campus sports complex at altitudes of 70 m, 100 m, and 250 m. Before flying the UAV, twenty-eight ground control points were evenly established in the study area with a real-time kinematic differential global positioning system. The second phase of the study utilizes an image-matching algorithm for photo alignment, wherein camera calibration parameters and ten of the established ground control points were used to estimate the inner, relative, and absolute orientations of the photos. The resulting orthoimages were exported to ArcGIS software for digitization. Visual, tabular, and graphical assessments of the resulting cadastral maps showed different levels of accuracy. The results present a step-by-step approach for generating UAV cadastral mapping and show that the cadastral map acquired at 70 m altitude produced the best results.
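The tabular accuracy assessment mentioned above is typically based on checkpoint residuals. As a hedged sketch (the coordinates below are invented, not the study's data), the horizontal root-mean-square error between orthoimage-derived and GNSS-surveyed checkpoints can be computed as:

```python
import numpy as np

def horizontal_rmse(measured, reference):
    """Root-mean-square error between map-derived and surveyed
    checkpoint coordinates, in the same units (e.g. metres)."""
    d = np.asarray(measured, float) - np.asarray(reference, float)
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))

# Hypothetical checkpoints: (E, N) from the orthoimage vs RTK-GNSS
ortho = [(10.02, 20.01), (30.00, 39.97), (50.03, 60.02)]
gnss = [(10.00, 20.00), (30.00, 40.00), (50.00, 60.00)]
print(round(horizontal_rmse(ortho, gnss), 3))  # -> 0.03
```

Comparing this figure across the three flying altitudes is one simple way to quantify the accuracy differences the study reports.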

Keywords: aerial mapping, orthomosaic, cadastral map, flying altitude, image processing

Procedia PDF Downloads 76
2631 The Security Trade-Offs in Resource Constrained Nodes for IoT Application

Authors: Sultan Alharby, Nick Harris, Alex Weddell, Jeff Reeve

Abstract:

The concept of the Internet of Things (IoT) has received much attention over the last five years. It is predicted that the IoT will influence every aspect of our lifestyles in the near future. Wireless sensor networks (WSNs) are one of the key enablers of the operation of IoTs, allowing data to be collected from the surrounding environment. However, due to limited resources, the nature of their deployment, and unattended operation, WSNs are vulnerable to various types of attack. Security is paramount for reliable and safe communication between IoT embedded devices, but it comes at a cost to resources. Nodes are usually equipped with small batteries, which makes energy conservation crucial to IoT devices. Nevertheless, the security cost in terms of energy consumption has not been studied sufficiently. Previous research has used the security specification of IEEE 802.15.4 for IoT applications, but the energy cost of each security level and the impact on quality of service (QoS) parameters remain unknown. This research focuses on the cost of security at the IoT media access control (MAC) layer. It begins by studying the energy consumption of the IEEE 802.15.4 security levels, followed by an evaluation of the impact of security on data latency and throughput; it then presents the impact of transmission power on security overhead and finally shows the effects of security on memory footprint. The results show that security overhead in terms of energy consumption with a payload of 24 bytes ranges from 31.5% at the minimum security level to 60.4% at the top security level of the 802.15.4 specification, relative to non-secure packets. They also show that security cost has less impact at longer packet lengths and more at smaller packet sizes. In addition, the results show a significant impact on data latency and throughput: overall, maximum-length authentication decreases throughput by almost 53%, and encryption and authentication together by almost 62%.

Keywords: energy consumption, IEEE 802.15.4, IoT security, security cost evaluation

Procedia PDF Downloads 164
2630 Maximum Power Point Tracking Using FLC Tuned with GA

Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli

Abstract:

The pursuit of maximum power point tracking (MPPT) has led to the development of many kinds of controllers, one of which is the fuzzy logic controller, which has proven its worth. To further tune this controller, this paper discusses and analyzes the use of genetic algorithms to tune the fuzzy logic controller. It provides an introduction to both systems and tests their compatibility and performance.
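The abstract does not detail the tuning procedure, but the general idea can be sketched: a genetic algorithm searches the space of fuzzy-controller parameters (e.g. membership-function widths) for the setting that maximizes extracted power. The sketch below is a minimal real-coded GA applied to an invented stand-in for the MPPT objective; the population size, operators, and toy fitness are all assumptions, not the authors' configuration.

```python
import random

def genetic_optimize(fitness, n_params, bounds, pop=30, gens=60,
                     mut_rate=0.2, seed=1):
    """Minimal real-coded GA: elitism, averaging crossover, and
    Gaussian mutation. Returns the best parameter vector found."""
    rng = random.Random(seed)
    lo, hi = bounds
    population = [[rng.uniform(lo, hi) for _ in range(n_params)]
                  for _ in range(pop)]
    for _ in range(gens):
        elite = sorted(population, key=fitness, reverse=True)[: pop // 2]
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # crossover
            if rng.random() < mut_rate:                  # mutation
                i = rng.randrange(n_params)
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        population = elite + children
    return max(population, key=fitness)

# Toy stand-in for the MPPT objective: "power" peaks when the two
# membership-function widths sit at (0.6, 0.3)
def toy_power(params):
    w1, w2 = params
    return -((w1 - 0.6) ** 2 + (w2 - 0.3) ** 2)

best = genetic_optimize(toy_power, n_params=2, bounds=(0.0, 1.0))
print([round(p, 2) for p in best])  # settles near [0.6, 0.3]
```

In the paper's setting, the fitness function would instead run the fuzzy controller on a photovoltaic model and return the harvested power.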

Keywords: fuzzy logic controller, fuzzy logic, genetic algorithm, maximum power point, maximum power point tracking

Procedia PDF Downloads 369
2629 Adjusting Mind and Heart to Ovarian Cancer: Correlational Study on Italian Women

Authors: Chiara Cosentino, Carlo Pruneti, Carla Merisio, Domenico Sgromo

Abstract:

Introduction: As an approach, psychoneuroimmunology has clearly shown how psychological features can influence health through specific physiological pathways linked to the stress reaction. This can also be true in cancer, which in its latest conceptualization is seen as a chronic disease. However, it is still not clear how psychological features combine with a specific physiological path toward better adjustment to cancer. The aim of this study is to identify how, in Italian survivors, perceived social support, body image, coping, and quality of life correlate with or influence heart rate variability (HRV), a physiological parameter that can mirror a condition of chronic stress or a good relaxation capability. Method: The study had an exploratory cross-sectional design. The final sample comprised 38 ovarian cancer survivors aged from 29 to 80 (M = 56.08; SD = 12.76) following a program for ovarian cancer at the Oncological Clinic, University Hospital of Parma, Italy. Participants were asked to fill in the Multidimensional Scale of Perceived Social Support (MSPSS), the Derriford Appearance Scale-59 (DAS-59), the Mental Adjustment to Cancer (MAC) scale, and the Quality of Life Questionnaire (EORTC). For each participant, short-term HRV (5 minutes) was recorded using emWavePro. Results: The data showed many interesting correlations among the psychological features. EORTC scores correlate significantly with DAS-59 (r = -.327, p < .05), MSPSS (r = .411, p < .05), and MAC scores, in particular with the strategy Fatalism (r = .364, p < .05). Good social support improves HRV (F(1,33) = 4.27, p < .05). Perceiving themselves as effective in their environment, preserving good role functioning (EORTC), positively affects HRV (F(1,33) = 9.810, p < .001). Women admitting concerns about body image seem prone to emotional disclosure, reducing emotional distress and improving HRV (β = .453); emotional avoidance worsens HRV (β = -.391).
Discussion and conclusion: Results showed a strong relationship between body image and quality of life. These data suggest that higher concerns about body image, in particular a negative self-concept linked to appearance, were linked to worse functioning in everyday life. The relation between a negative self-concept and reduced emotional functioning is understandable in terms of the possible distress deriving from the perception of body appearance. The relationship between high perceived social support and better functioning in everyday life was also confirmed. In this sample, fatalism was associated with better physical, role, and emotional functioning. In these women, the presence of good support may activate the physiological social engagement system, improving their HRV. Perceiving themselves as effective in their environment, preserving good role functioning, also positively affects HRV, probably through the same physiological pathway. A higher presence of appearance concerns contributes to a higher HRV: probably, women admitting more body concerns are prone to better emotional disclosure, which could reduce emotional distress, improving HRV and global health. This study provides a preliminary demonstration of an 'Integrated Model of Defense' in these cancer survivors. In this model, psychological features interact to build a better quality of life and a condition of psychological well-being that is associated with, and influences, HRV and thus the physiological condition.

Keywords: cancer survivors, heart rate variability, ovarian cancer, psychophysiological adjustment

Procedia PDF Downloads 185
2628 Characterization and Monitoring of the Yarn Faults Using Diametric Fault System

Authors: S. M. Ishtiaque, V. K. Yadav, S. D. Joshi, J. K. Chatterjee

Abstract:

The DIAMETRIC FAULTS system has been developed, which captures bi-directional images of yarn continuously and sequentially and provides a detailed classification of faults. A novel mathematical framework developed on the acquired bi-directional images forms the basis of fault classification into four broad categories, namely Thick1, Thick2, Thin, and Normal Yarn. A discretised version of the Radon transform has been used to convert the bi-directional images into one-dimensional signals. The images were divided into training and test sample sets. A Karhunen-Loève transform (KLT) basis is computed from the signals of the training-set images for each fault class, taking the six highest-energy eigenvectors. The fault class of a test image is identified by taking the Euclidean distance of its signal from its projection on the KLT basis of each sample realization and fault class in the training set. An accuracy of about 90% is achieved in detecting the correct fault class across the various techniques applied. The four broad fault classes were further subclassified into four subgroups based on user-set boundary limits for fault length and fault volume, where the fault cross-sectional area and the fault length define the total fault volume. A distinct distribution of faults is found in terms of their volume and physical dimensions, which can be used for monitoring yarn faults. The configuration-based characterization and classification show that spun yarn faults arising from mass variation exhibit distinct characteristics in terms of their contours, sizes, and shapes, apart from their frequencies of occurrence.
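The classification pipeline described above (Radon projection, per-class KLT basis, Euclidean residual) can be sketched roughly as follows; the toy images, class names, and basis size are illustrative assumptions, not the authors' data:

```python
import numpy as np

def radon_profile(image):
    """Discrete Radon projection at 0 degrees: column sums turn a
    2-D yarn image into a 1-D diameter-like signal."""
    return image.sum(axis=0).astype(float)

def klt_basis(signals, k=6):
    """Top-k energy eigenvectors (Karhunen-Loeve basis) of one class."""
    X = np.asarray(signals, float)
    X = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[:k]

def classify(signal, bases, means):
    """Pick the class whose KLT subspace leaves the smallest
    Euclidean residual after projecting the centred signal."""
    def residual(label):
        s = signal - means[label]
        return np.linalg.norm(s - bases[label].T @ (bases[label] @ s))
    return min(bases, key=residual)

# Toy data: 'normal' yarn images vs 'thick' images with a bulge
rng = np.random.default_rng(0)
def make_image(thick):
    img = 1.0 + 0.02 * rng.standard_normal((20, 40))
    if thick:
        img += 0.5 * np.exp(-((np.arange(40) - 20) ** 2) / 20)[None, :]
    return img

train = {lab: [radon_profile(make_image(lab == "thick")) for _ in range(10)]
         for lab in ("normal", "thick")}
bases = {lab: klt_basis(s) for lab, s in train.items()}
means = {lab: np.mean(s, axis=0) for lab, s in train.items()}
print(classify(radon_profile(make_image(True)), bases, means))
```

The real system would use projections at several angles and the four fault classes named in the abstract, but the distance-to-subspace decision rule is the same.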

Keywords: Euclidean distance, fault classification, KLT, Radon Transform

Procedia PDF Downloads 259
2627 Spectroscopic Study of Tb³⁺ Doped Calcium Aluminozincate Phosphor for Display and Solid-State Lighting Applications

Authors: Sumandeep Kaur, Allam Srinivasa Rao, Mula Jayasimhadri

Abstract:

In recent years, rare earth (RE) ion doped inorganic luminescent materials have attracted great attention due to their excellent physical and chemical properties. These materials offer high thermal and chemical stability and exhibit good luminescence properties due to the presence of RE ions, whose luminescence is attributed to intra-configurational f-f transitions. A series of Tb³⁺ doped calcium aluminozincate phosphors has been synthesized via the sol-gel method. Structural and morphological studies have been carried out by recording X-ray diffraction patterns and SEM images. Luminescence spectra have been recorded for a comprehensive study of the luminescence properties. The XRD profile reveals a single-phase orthorhombic crystal structure with an average crystallite size of 65 nm, as calculated using the Debye-Scherrer equation. The SEM image exhibits the completely random, irregular morphology of the micron-sized particles of the prepared samples. The optimization of luminescence has been carried out by varying the dopant Tb³⁺ concentration within the range of 0.5 to 2.0 mol%. The as-synthesized phosphors exhibit intense emission at 544 nm under 478 nm excitation. The optimal Tb³⁺ concentration has been found to be 1.0 mol% in the present host lattice. The decay curves show bi-exponential fitting for the as-synthesized phosphor. Colorimetric studies show green emission, with CIE coordinates (0.334, 0.647) lying in the green region for the optimal Tb³⁺ concentration. This report reveals the potential utility of Tb³⁺ doped calcium aluminozincate phosphors for display and solid-state lighting devices.
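The crystallite size quoted above comes from the Debye-Scherrer relation D = Kλ/(β cos θ). The sketch below evaluates it for an invented reflection; the wavelength, peak width, and angle are illustrative, not the paper's measured values.

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, theta_deg, k=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), where
    beta is the peak FWHM converted from degrees to radians."""
    beta = math.radians(fwhm_deg)
    return k * wavelength_nm / (beta * math.cos(math.radians(theta_deg)))

# Hypothetical Cu K-alpha peak: lambda = 0.15406 nm, FWHM = 0.12 deg
# at 2*theta = 30 deg (so theta = 15 deg)
print(round(scherrer_size(0.15406, 0.12, 15.0), 1))  # -> 68.5 (nm)
```

Note that the instrumental broadening must be subtracted from the measured FWHM before applying the formula in practice.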

Keywords: concentration quenching, phosphor, photoluminescence, XRD

Procedia PDF Downloads 145
2626 Computer-Aided Detection of Liver and Spleen from CT Scans using Watershed Algorithm

Authors: Belgherbi Aicha, Bessaid Abdelhafid

Abstract:

In recent years, a great deal of research work has been devoted to the development of semi-automatic and automatic techniques for the analysis of abdominal CT images. The first and fundamental step in all these studies is semi-automatic liver and spleen segmentation, which is still an open problem. In this paper, a semi-automatic liver and spleen segmentation method based on mathematical morphology and the watershed algorithm is proposed. Our algorithm proceeds in two parts. In the first, we determine the region of interest by applying morphological operations to extract the liver and spleen. The second step improves the quality of the image gradient: we propose a method that reduces the over-segmentation problem by applying spatial filters followed by morphological filters. Thereafter, we proceed to the segmentation of the liver and spleen. The aim of this work is to develop a semi-automatic segmentation method for the liver and spleen based on the watershed algorithm, to improve the accuracy and robustness of the segmentation, and to evaluate the new semi-automatic approach against manual liver segmentation. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert; the experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented liver and spleen contours and the contours traced manually by radiological experts. Liver segmentation achieved a sensitivity of 96% and a specificity of 99%; spleen segmentation achieved similarly promising results, with a sensitivity of 95% and a specificity of 99%.
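The two-stage pipeline described above (spatial smoothing, a morphological gradient, then watershed flooding) can be sketched on a synthetic image. This uses SciPy's IFT-based watershed rather than the authors' exact implementation, and the seed positions and image are invented:

```python
import numpy as np
from scipy import ndimage as ndi

def segment(image, inner_seed, outer_seed, smooth=1.5):
    """Smooth with a spatial (Gaussian) filter, build a morphological
    gradient (dilation minus erosion), then flood it with a
    marker-controlled watershed."""
    smoothed = ndi.gaussian_filter(image.astype(float), smooth)
    grad = (ndi.grey_dilation(smoothed, size=(3, 3))
            - ndi.grey_erosion(smoothed, size=(3, 3)))
    grad8 = np.uint8(255 * (grad - grad.min()) / (np.ptp(grad) + 1e-9))
    markers = np.zeros(image.shape, dtype=np.int16)
    markers[inner_seed] = 2   # seed inside the organ
    markers[outer_seed] = 1   # seed in the background
    return ndi.watershed_ift(grad8, markers)

# Synthetic "CT slice": one bright organ-like disc on a dark background
yy, xx = np.mgrid[0:64, 0:64]
img = np.where((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2, 180, 40)
labels = segment(img, inner_seed=(32, 32), outer_seed=(2, 2))
# Pixels labelled 2 cover the disc; label 1 covers the background.
```

In the paper's setting, the markers would come from the morphological region-of-interest step, and one marker would be placed per organ (liver, spleen) plus background.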

Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm

Procedia PDF Downloads 318
2625 Transforming Data Science Curriculum Through Design Thinking

Authors: Samar Swaid

Abstract:

Today, corporations are moving toward the adoption of design-thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in design thinking, IDEO (Innovation, Design, Engineering Organization), defines design thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has proven to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect for consumers. The Association for Computing Machinery task force on data science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine, who works on data preparation, algorithm selection, and model interpretation.
Thus, the data science program includes design thinking to ensure that user demands are met, to generate more usable machine learning tools, and to develop ways of framing computational thinking. Here, we describe the fundamentals of design thinking and teaching modules for data science programs.

Keywords: data science, design thinking, AI, curriculum, transformation

Procedia PDF Downloads 75
2624 Systematic and Meta-Analysis of Navigation in Oral and Maxillofacial Trauma and Impact of Machine Learning and AI in Management

Authors: Shohreh Ghasemi

Abstract:

Introduction: Managing oral and maxillofacial trauma is a multifaceted challenge, as it can have life-threatening consequences and significant functional and aesthetic impact. Navigation techniques have been introduced to improve surgical precision to meet this challenge. A machine learning algorithm has also been developed to support clinical decision-making regarding the treatment of oral and maxillofacial trauma. Given these advances, this systematic meta-analysis aims to assess the efficacy of navigational techniques in treating oral and maxillofacial trauma and to explore the impact of machine learning on their management. Methods: A detailed and comprehensive analysis of studies published between January 2010 and September 2021 was conducted through a systematic meta-analysis. This included a thorough search of the Web of Science, Embase, and PubMed databases to identify studies evaluating the efficacy of navigational techniques and the impact of machine learning in managing oral and maxillofacial trauma. Studies that did not meet the established entry criteria were excluded. In addition, the overall quality of the included studies was evaluated using the Cochrane risk-of-bias tool and the Newcastle-Ottawa scale. Results: A total of 12 studies, including 869 patients with oral and maxillofacial trauma, met the inclusion criteria. The analysis revealed that navigation techniques effectively improve surgical accuracy and minimize the risk of complications. Additionally, machine learning algorithms have proven effective in predicting treatment outcomes and identifying patients at high risk for complications. Conclusion: The introduction of navigational technology has great potential to improve surgical precision in oral and maxillofacial trauma treatment. Furthermore, developing machine learning algorithms offers opportunities to improve clinical decision-making and patient outcomes.
Still, further studies are necessary to corroborate these results and establish the optimal use of these technologies in managing oral and maxillofacial trauma.

Keywords: trauma, machine learning, navigation, maxillofacial, management

Procedia PDF Downloads 55