Search results for: quantum convolutional neural networks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4219

1129 Energy Consumption Modeling for Strawberry Greenhouse Crop by Adaptive Nero Fuzzy Inference System Technique: A Case Study in Iran

Authors: Azar Khodabakhshi, Elham Bolandnazar

Abstract:

Agriculture, as the most important food-producing sector, is not only an energy consumer but also an energy supplier. Energy use is considered a helpful parameter for analyzing and evaluating agricultural sustainability. In this study, the pattern of energy consumption of strawberry greenhouses in Jiroft, Kerman province, Iran, was surveyed. The total input energy required for strawberry production was calculated as 113314.71 MJ/ha. Electricity, contributing 38.34% of the total energy, was the largest energy consumer in strawberry production. In this study, neuro-fuzzy networks were used to model strawberry yield. Results showed that the best model for predicting strawberry yield had a correlation coefficient, root mean square error (RMSE), and mean absolute percentage error (MAPE) of 0.9849, 0.0154 kg/ha, and 0.11%, respectively. Given these results, it can be said that the neuro-fuzzy method predicts and models strawberry crop yield well.
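The two error metrics reported above can be reproduced with a few lines of NumPy; the yield values below are hypothetical stand-ins, since the study's actual ANFIS predictions are not given in the abstract:

```python
import numpy as np

# Hypothetical observed vs. predicted strawberry yields (kg/ha) -- placeholders,
# not the study's data, used only to show how the error metrics are computed.
y_true = np.array([30.0, 32.5, 28.0, 35.0])
y_pred = np.array([30.2, 32.1, 28.4, 34.6])

rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))           # root mean square error
mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100  # mean absolute percentage error (%)
```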

Keywords: crop yield, energy, neuro-fuzzy method, strawberry

Procedia PDF Downloads 368
1128 Design and Development of a Platform for Analyzing Spatio-Temporal Data from Wireless Sensor Networks

Authors: Walid Fantazi

Abstract:

The development of sensor technology (such as microelectromechanical systems (MEMS), wireless communications, embedded systems, distributed processing and wireless sensor applications) has contributed to a broad range of WSN applications capable of collecting large amounts of spatiotemporal data in real time. These systems require real-time processing to manage storage and to query the data as they arrive. To cover these needs, we propose in this paper a snapshot spatiotemporal data model based on object-oriented concepts. This model reduces storage requirements and data redundancy, which makes it easier to execute spatiotemporal queries and saves analysis time. Further, to ensure the robustness of the system and to eliminate congestion in main memory, we propose an in-RAM spatiotemporal indexing technique called Captree*. As a result, we offer an RIA (Rich Internet Application)-based SOA application architecture that allows remote monitoring and control.

Keywords: WSN, indexing data, SOA, RIA, geographic information system

Procedia PDF Downloads 247
1127 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis with installed global positioning system (GPS)-enabled devices, can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper explores big trajectory data to measure the travel efficiency of road networks across an entire city using the proposed statistical travel efficiency measure (STEM). Further, it identifies the causes of low travel efficiency with the proposed least-squares approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to a point B in the road network. The obtained results show that our proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

Procedia PDF Downloads 151
1126 Bitplanes Gray-Level Image Encryption Approach Using Arnold Transform

Authors: Ali Abdrhman M. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The single-step parallel contour extraction (SSPCE) method is used to create an edge map, serving as a key image, from a gray-level or binary image. An XOR operation is performed between the key image and each bit plane of the original image to change the image pixel values. The Arnold transform is then used to change the locations of the image pixels as an image-scrambling step. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image, which can then be completely reconstructed without any distortion. The analysis also shows that the algorithm offers very high security against attacks such as salt-and-pepper noise and JPEG compression, demonstrating that a gray-level image can be protected at a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
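A minimal sketch of the two building blocks named in the abstract, assuming a square N x N gray-level image: XOR-ing the image with a key image (which XORs every bit plane at once), then scrambling pixel positions with the Arnold cat map. The SSPCE edge-map step is not reproduced here; any same-sized key image stands in for it.

```python
import numpy as np

def arnold_scramble(img, iterations=1):
    """Arnold cat map on an N x N image: (x, y) -> ((x + y) mod N, (x + 2y) mod N)."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def arnold_unscramble(img, iterations=1):
    """Inverse cat map: (x, y) -> ((2x - y) mod N, (y - x) mod N)."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        restored = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                restored[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = restored
    return out

def encrypt(img, key_img, iterations=1):
    # XOR with the key byte changes all bit planes at once, then scramble positions.
    return arnold_scramble(img ^ key_img, iterations)

def decrypt(cipher, key_img, iterations=1):
    # Undo the scrambling first, then the XOR (XOR is its own inverse).
    return arnold_unscramble(cipher, iterations) ^ key_img
```

Because both steps are exactly invertible, decryption reconstructs the original image without distortion, consistent with the lossless claim in the abstract.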

Keywords: SSPCE method, image compression, salt and pepper attacks, bitplanes decomposition, Arnold transform, lossless image encryption

Procedia PDF Downloads 428
1125 Biofeedback-Driven Sound and Image Generation

Authors: Claudio Burguez, María Castelló, Mikaela Pisani, Marcos Umpiérrez

Abstract:

The BIOFEEDBACK exhibition offers a unique experience for each visitor, combining art, neuroscience, and technology in an interactive way. Using a headband that captures the bioelectric activity of the brain, visitors are able to generate sound and images in a sequence loop, making them an integral part of the artwork. Through this interactive exhibit, visitors gain a deeper appreciation of the beauty and complexity of the brain. As a special takeaway, visitors receive an NFT as a gift, allowing them to continue their engagement with the exhibition beyond the physical space. We used the EEG biofeedback technique following a closed-loop neuroscience approach, transforming EEG data captured by a Muse S headband in real time into audiovisual stimulation. Pure Data is used for sound generation and generative adversarial networks (GANs) for image generation. Thirty participants have experienced the exhibition. For some individuals, it was easier to focus than for others. Participants who said they could focus during the exhibit stated that at one point they felt they could control the sound, while the images were more abstract and they did not feel able to control them.

Keywords: art, audiovisual, biofeedback, EEG, NFT, neuroscience, technology

Procedia PDF Downloads 64
1124 Nadler's Fixed Point Theorem on Partial Metric Spaces and its Application to a Homotopy Result

Authors: Hemant Kumar Pathak

Abstract:

In 1994, Matthews (S.G. Matthews, Partial metric topology, in: Proc. 8th Summer Conference on General Topology and Applications, in: Ann. New York Acad. Sci., vol. 728, 1994, pp. 183-197) introduced the concept of a partial metric as part of the study of denotational semantics of dataflow networks. He gave a modified version of the Banach contraction principle, more suitable in this context. In fact, (complete) partial metric spaces constitute a suitable framework to model several distinguished examples of the theory of computation and also to model metric spaces via domain theory. In this paper, we introduce the concept of an almost partial Hausdorff metric. Using this concept, we prove a fixed point theorem for multi-valued mappings on a partial metric space, an analogue of the well-known Nadler’s fixed point theorem. In the sequel, we derive a homotopy result as an application of our main result.
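For readers unfamiliar with Matthews' notion, the standard axioms of a partial metric, which the abstract builds on, are, for all x, y, z in X:

```latex
% A partial metric on a set X is a map p : X \times X \to \mathbb{R}_{\ge 0} such that
\begin{align*}
&\text{(P1)}\quad x = y \iff p(x,x) = p(x,y) = p(y,y) \\
&\text{(P2)}\quad p(x,x) \le p(x,y) \\
&\text{(P3)}\quad p(x,y) = p(y,x) \\
&\text{(P4)}\quad p(x,y) \le p(x,z) + p(z,y) - p(z,z)
\end{align*}
```

Unlike an ordinary metric, the self-distance p(x,x) need not be zero; for example, p(x,y) = max(x,y) on the non-negative reals is a partial metric.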

Keywords: fixed point, partial metric space, homotopy, physical sciences

Procedia PDF Downloads 432
1123 Multi Universe Existence Based-On Quantum Relativity using DJV Circuit Experiment Interpretation

Authors: Muhammad Arif Jalil, Somchat Sonasang, Preecha Yupapin

Abstract:

This study hypothesizes that the universe sits at the center between white and black holes, which form an entangled pair. The coupling between them, in terms of spacetime, forms the universe and all things. The birth of things is based on the exchange of energy between the white and black sides. The transition from the white side to the black side is called wave-matter; it has a speed faster than light with positive gravity. The transition from the black side to the white side, with a speed faster than light and negative gravity, is called a wave-particle. Where the speed equals that of light, the particle rest mass is formed, and things can appear and take shape there; the gravity is zero because it is the center. The gravitational force belongs to the Earth itself, because it is in a position twisted towards the white hole; therefore, it is negative. The coupling of black and white holes occurs directly on both sides. Mass is formed at saturation and creates universes and other things. Therefore, there can be hundreds of thousands of universes on both sides of the black and white holes before reaching the saturation point of multi-universes. This work uses the DJV circuit that the research team built, an entangled (two-level-system) circuit that has been experimentally demonstrated; this principle therefore allows interpretation. This work explains the emergence of multiple universes and can be applied as a practical guideline for searching for universes in the future. Moreover, the results indicate that the DJV circuit can create elementary particles according to Feynman's diagram under rest-mass conditions, which will be discussed for fission and fusion applications.

Keywords: multi-universes, Feynman diagram, fission, fusion

Procedia PDF Downloads 57
1122 Entrepreneurial Venture Creation through Anchor Event Activities: Pop-Up Stores as On-Site Arenas

Authors: Birgit A. A. Solem, Kristin Bentsen

Abstract:

Scholarly attention in entrepreneurship is currently directed towards understanding entrepreneurial venture creation as a process: the journey of new economic activities from nonexistence to existence, often studied through flow or network models. To complement existing research on entrepreneurial venture creation with more interactivity-based research on organized activities, this study examines two pop-up stores as anchor events involving the on-site activities of fifteen participating entrepreneurs launching their new ventures. The pop-up stores were arranged in two middle-sized Norwegian cities and contained different brand stores that brought together actors of sub-networks and communities executing venture creation activities. The pop-up stores became on-site arenas for the entrepreneurs to create, maintain, and rejuvenate their networks, while also becoming venues for the temporal coordination of activities involving existing and potential customers in their venture creation. In this work, we apply a conceptual framework based on frequently addressed dilemmas within entrepreneurship theory (discovery/creation, causation/effectuation) to further shed light on the broad range of on-site anchor event activities and their venture creation outcomes. The dilemma-based concepts are applied as an analytic toolkit to pursue answers regarding the nature of anchor event activities typically found within entrepreneurial venture creation and how these activities affect entrepreneurial venture creation outcomes. Our study combines researcher participation with 200 hours of observation and twenty in-depth interviews. Data analysis followed established guidelines for hermeneutic analysis and was intimately intertwined with ongoing data collection. Data were coded and categorized in NVivo 12 software and iterated several times as patterns steadily developed.
Our findings suggest that the core anchor event activities typically found within entrepreneurial venture creation are: concept and product experimentation with visitors; arrangements to socialize (evening specials, auctions, and exhibitions); store-in-store concepts; arranged meeting places for peers; and close connections with the municipality and property owners. Further, this work points to four main entrepreneurial venture creation outcomes derived from these core anchor event activities: (1) venture attention, (2) venture idea-realization, (3) venture collaboration, and (4) venture extension. Our findings show that the outcomes vary depending on which anchor event activities are applied. Theoretically, this study offers two main implications. First, anchor event activities are both discovered and created, following the logic of causation, while at the same time being experimental, based on the "learning by doing" principles of effectuation during execution. Second, our research enriches prior studies of venture creation as a process. In this work, entrepreneurial venture creation activities and outcomes are understood through pop-up stores as on-site anchor event arenas, which are particularly suitable for the interactivity-based research requested by the entrepreneurship field. This study also reveals important managerial implications: entrepreneurs should allow themselves to find creative physical venture creation arenas (e.g., pop-up stores, showrooms) and to collaborate with partners when discovering and creating concepts and activities based on new ideas. In this way, they allow themselves both to plan strategically for and to experiment continually with their venture.

Keywords: anchor event, interactivity-based research, pop-up store, entrepreneurial venture creation

Procedia PDF Downloads 82
1121 Assessing Firm Readiness to Implement Cloud Computing: Toward a Comprehensive Model

Authors: Seyed Mohammadbagher Jafari, Elahe Mahdizadeh, Masomeh Ghahremani

Abstract:

Nowadays, almost all organizations depend on information systems to run their businesses. Investment in information systems and their maintenance, to keep them always in the best condition to support the firm's business, is one of the main issues for every organization. The concept of cloud computing was developed as a technical and economic model to address this issue. In cloud computing, the computing resources, including networks, applications, hardware, and services, are configured as needed and are available at the moment of request. However, migration to the cloud is not an easy task, and there are many issues that should be taken into account. This study provides a comprehensive model to assess a firm's readiness to implement cloud computing. By conducting a systematic literature review, four dimensions of readiness were extracted: technological, human, organizational, and environmental. Every dimension has various criteria, which are discussed in detail. This model provides a framework for cloud computing readiness assessment. Organizations that intend to migrate to the cloud can use this model as a tool to assess their readiness before making any decision on cloud implementation.

Keywords: cloud computing, human readiness, organizational readiness, readiness assessment model

Procedia PDF Downloads 389
1120 Songwriting in the Postdigital Age: Using TikTok and Instagram as Online Informal Learning Technologies

Authors: Matthias Haenisch, Marc Godau, Julia Barreiro, Dominik Maxelon

Abstract:

In times of ubiquitous digitalization and the increasing entanglement of humans and technologies in the musical practices of the 21st century, the question arises of how popular musicians learn in the (post)digital age. Against the backdrop of increasing interest in transferring informal learning practices into formal settings of music education, the interdisciplinary research association »MusCoDA – Musical Communities in the (Post)Digital Age« (University of Erfurt/University of Applied Sciences Clara Hoffbauer Potsdam, funded by the German Ministry of Education and Research) pursues the goal of deriving an empirical model of collective songwriting practices from the study of the informal learning of songwriters and bands, a model that can be translated into pedagogical concepts for music education in schools. Drawing on concepts from communities of musical practice and actor-network theory, learning is considered not only as social practice and as participation in online and offline communities, but also as an effect of heterogeneous networks composed of human and non-human actors. Learning is not seen as an individual, cognitive process, but as the formation and transformation of actor networks, i.e., as a practice of assembling and mediating humans and technologies. Based on video-stimulated recall interviews and videography of online and offline activities, songwriting practices are followed from the initial idea to different forms of performance and distribution. The data evaluation combines coding and mapping methods from grounded theory methodology and situational analysis. This results in network maps in which both the temporality of creative practices and the material and spatial relations of human and technological actors are reconstructed. In addition, positional analyses document the power relations between the participants that structure the learning process of the field.
In the area of online informal learning, initial key research findings reveal a transformation of the learning subject through the specific technological affordances of TikTok and Instagram and the accompanying changes in the learning practices of the corresponding online communities. Learning is explicitly shaped by the material agency of online tools and features and by the social practices entangled with these technologies. Thus, any human online community member can be invited to intervene directly in creative decisions that contribute to the further compositional and structural development of songs. At the same time, participants can give each other intimate insights into songwriting processes in progress and have the opportunity to perform together with strangers and idols. Online learning is characterized by an increase in social proximity, the distribution of creative agency, and informational exchange between participants. While it seems obvious that traditional notions not only of learning but also of the learning subject cannot be maintained, the question arises of how exactly the observed informal learning practices, and the subject that emerges from the use of social media as online learning technologies, can be transferred into contexts of formal learning.

Keywords: informal learning, postdigitality, songwriting, actor-network theory, community of musical practice, social media, TikTok, Instagram, apps

Procedia PDF Downloads 120
1119 Use of Generative Adversarial Networks (GANs) in Neuroimaging and Clinical Neuroscience Applications

Authors: Niloufar Yadgari

Abstract:

GANs are a potent class of deep learning models that have found success in various fields. They are part of the larger group of generative techniques, which aim to produce authentic data using a probabilistic model that learns distributions from actual samples. In clinical settings, GANs have demonstrated improved abilities in capturing spatially intricate, nonlinear, and possibly subtle disease impacts compared with conventional generative techniques. This review critically evaluates the current research on how GANs are being used in imaging studies of different neurological conditions such as Alzheimer's disease, brain tumors, aging of the brain, and multiple sclerosis. We offer a clear explanation of different GAN techniques for each use case in neuroimaging and delve into the key hurdles, unanswered questions, and potential advancements in utilizing GANs in this field. Our goal is to connect advanced deep learning techniques with neurology studies, showcasing how GANs can assist in clinical decision-making and enhance our comprehension of the structural and functional aspects of brain disorders.

Keywords: GAN, pathology, generative adversarial network, neuroimaging

Procedia PDF Downloads 18
1118 Typology of Fake News Dissemination Strategies in Social Networks in Social Events

Authors: Mohadese Oghbaee, Borna Firouzi

Abstract:

The emergence of the Internet, and more specifically the formation of social media, has provided the ground for new types of content dissemination. In recent years, social media users have shared information, communicated with others, and exchanged opinions on social events in this space. Much of the information published there is suspicious and produced with the intention of deceiving others. Such content is often called "fake news". Fake news, by circulating alongside correct information and misleading public opinion, can endanger the security of countries and deprive audiences of the basic right of free access to real information. Competing governments, opposition elements, profit-seeking individuals, and even competing organizations, aware of this capacity, act to distort and overturn the facts in the virtual space of target countries and communities on a large scale and to steer public opinion towards their goals. This process of extensive de-truthing of societies' information space has created a wave of harm and worry all over the world. These concerns have opened a new path of research for the timely containment and reduction of the destructive effects of fake news on public opinion. In addition, the expansion of this phenomenon can create serious problems for societies; its impact on events such as the 2016 American elections, Brexit, the 2017 French elections, and the 2019 Indian elections has caused concern and led to the adoption of approaches to deal with it. A simple look at the growth trend of research in Scopus shows a sharp increase in research with the keyword "false information", which peaked in 2020 at 524 items, whereas in 2015 only 30 scientific research publications appeared in this field.
Considering that one of the capabilities of social media is to create a context for the dissemination of news and information, both true and false, this article investigates a classification of strategies for spreading fake news in social networks during social events. To achieve this goal, the thematic analysis research method was chosen. First, an extensive library study of global sources was conducted. Then, in-depth interviews were conducted with 18 well-known specialists and experts in the field of news and media in Iran, selected by purposeful sampling. By analyzing the data using thematic analysis, strategies were obtained. The strategies identified so far (the research is in progress) include: unrealistically strengthening or weakening the speed and content of the event; stimulating psycho-media movements; targeting emotional audiences such as women, teenagers, and young people; strengthening public hatred; calling reactions to events legitimate or illegitimate; incitement to physical conflict; simplification of violent protests; and the targeted publication of images and interviews.

Keywords: fake news, social network, social events, thematic analysis

Procedia PDF Downloads 56
1117 Executive Function in Youth With ADHD and ASD: A Systematic Review and Meta-analysis

Authors: Parker Townes, Prabdeep Panesar, Chunlin Liu, Soo Youn Lee, Dan Devoe, Paul D. Arnold, Jennifer Crosbie, Russell Schachar

Abstract:

Attention-deficit hyperactivity disorder (ADHD) and autism spectrum disorder (ASD) are impairing childhood neurodevelopmental disorders with problems in executive functions. Executive functions are higher-level mental processes essential for daily functioning and goal attainment. There is genetic and neural overlap between ADHD and ASD. The aim of this meta-analysis was to evaluate if pediatric ASD and ADHD have distinct executive function profiles. This review was completed following Cochrane guidelines. Fifty-eight articles were identified through database searching, followed by a blinded screening in duplicate. A meta-analysis was performed for all task performance metrics evaluated by at least two articles. Forty-five metrics from 24 individual tasks underwent analysis. No differences were found between youth with ASD and ADHD in any domain under direct comparison. However, individuals with ASD and ADHD exhibited deficient attention, flexibility, visuospatial abilities, working memory, processing speed, and response inhibition compared to controls. No deficits in planning were noted in either disorder. Only 11 studies included a group with comorbid ASD+ADHD, making it difficult to determine whether common executive function deficits are a function of comorbidity. Further research is needed to determine if comorbidity accounts for the apparent commonality in executive function between ASD and ADHD.

Keywords: autism spectrum disorder, ADHD, neurocognition, executive function, youth

Procedia PDF Downloads 71
1116 Assessment of Smart Mechatronics Application in Agriculture

Authors: Sairoel Amertet, Girma Gebresenbet

Abstract:

Smart mechatronics systems in agriculture can be traced back to the mid-1980s, when research into automated fruit harvesting systems began in Japan, Europe, and the United States. Since then, impressive advances have been made in smart mechatronics systems. Furthermore, smart mechatronics is a promising area, and as a result, we were intrigued to learn more about it. Consequently, the purpose of this study was to examine the smart mechatronic systems that have been applied to agriculture so far, drawing inspiration from smart mechatronic systems in other sectors. To get an overview of the current state of the art and the benefits and drawbacks of smart mechatronics systems, various approaches were investigated. Moreover, smart mechatronic modules and the various networks applied in agricultural processing were examined. Finally, we explored how the data retrieved using one-way analysis of variance related to each other. The results showed strongly related keywords across different journals. With the very limited use of sophisticated mechatronics in the agricultural industry and, at the same time, low production rates, food security has fallen dramatically. Therefore, the application of smart mechatronics systems in agricultural sectors should be taken into consideration in order to overcome these issues.

Keywords: mechatronics, robotic, robotic system, automation, agriculture mechanism

Procedia PDF Downloads 71
1115 Modeling Food Popularity Dependencies Using Social Media Data

Authors: Devashish Khulbe, Manu Pathak

Abstract:

The rise in popularity of major social media platforms has enabled people to share photos and textual information about their daily lives. One popular topic about which information is shared is food. Since much of the media about food is attributed to particular locations and restaurants, information such as the spatiotemporal popularity of various cuisines can be analyzed. Tracking the popularity of food types and retail locations across space and time can also be useful for business owners and restaurant investors. In this work, we present an approach using off-the-shelf machine learning techniques to identify trends and the popularity of cuisine types in an area using geo-tagged data from social media, Google Images, and Yelp. After adjusting for time, we use kernel density estimation to find hot spots across the location and model the dependencies among cuisine popularities using Bayesian networks. We consider the Manhattan borough of New York City as the location for our analyses, but the approach can be used for any area with social media data and information about retail businesses.
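A minimal sketch of the hot-spot step, assuming geo-tagged points for one cuisine; the coordinates below are synthetic stand-ins for the social-media data described in the abstract, and a grid cell is treated as a hot spot when its estimated density falls in the top 5%:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic (lon, lat) check-ins clustered around a hypothetical Manhattan block.
points = rng.normal(loc=[-73.98, 40.75], scale=0.01, size=(200, 2)).T  # shape (2, n)

kde = gaussian_kde(points)  # 2-D kernel density estimate over the check-ins

# Evaluate the density on a 50 x 50 grid covering the area of interest.
lon, lat = np.meshgrid(np.linspace(-74.02, -73.94, 50), np.linspace(40.71, 40.79, 50))
density = kde(np.vstack([lon.ravel(), lat.ravel()])).reshape(lon.shape)

# Hot spots: grid cells in the top 5% of estimated density.
hotspots = density > np.quantile(density, 0.95)
```

The per-cuisine densities obtained this way could then serve as variables in a Bayesian network to model dependencies among cuisine popularities, as the abstract proposes.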

Keywords: web mining, geographic information systems, business popularity, spatial data analysis

Procedia PDF Downloads 110
1114 The Influence of Disturbances Generated by Arc Furnaces on the Power Quality

Authors: Z. Olczykowski

Abstract:

The paper presents the impact of electric arc furnace operation on the power grid. Arc equipment is among the largest loads supplied by the power system. Disturbances arising during the melting process in these furnaces cause abrupt changes in the reactive power of the furnaces. The currents drawn by these devices change abruptly, which in turn causes voltage fluctuations and light flicker. Quantitative evaluation of voltage fluctuations is now the basic criterion for assessing the influence of a disturbing load on the supply network. The paper presents a method for determining the range of voltage fluctuations and light flicker during the parallel operation of arc devices. Results of measurements of voltage fluctuation and light flicker indicators, recorded in the power supply networks of steelworks with different numbers of parallel arc devices, are presented. Measurements of power quality parameters were aimed at verifying the proposed method in practice. Changes in other electricity parameters were also analyzed: the content of higher harmonics, asymmetry, and voltage dips.

Keywords: power quality, arc furnaces, propagation of voltage fluctuations, disturbances

Procedia PDF Downloads 130
1113 Bitplanes Image Encryption/Decryption Using Edge Map (SSPCE Method) and Arnold Transform

Authors: Ali A. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The single-step parallel contour extraction (SSPCE) method is used to create an edge map, serving as a key image, from a gray-level or binary image. An XOR operation is performed between the key image and each bit plane of the original image to change the image pixel values. The Arnold transform is then used to change the locations of the image pixels as an image-scrambling step. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image, which can then be completely reconstructed without any distortion. The analysis also shows that the algorithm offers very high security against attacks such as salt-and-pepper noise and JPEG compression, demonstrating that a gray-level image can be protected at a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.

Keywords: SSPCE method, image compression, salt and pepper attacks, bitplanes decomposition, Arnold transform, lossless image encryption

Procedia PDF Downloads 488
1112 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study introduces a hybrid classification model based on information theory and the support vector machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai, and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection classifies daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good, and Excellent, based on the respective Air Quality Index (AQI) values. Using information theory, information gain (IG) is calculated and feature selection is performed for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is applied to the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than SVM (alone), artificial neural network (ANN), and k-nearest neighbours (KNN) models in terms of accuracy as well as complexity.
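The two-step pipeline the abstract describes (rank features by information gain, then cross-validate an SVM on the selected features) can be sketched as follows. The data here are synthetic stand-ins, not the Chinese AQI dataset, and information gain is estimated via scikit-learn's mutual information, a common proxy:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(42)
# Synthetic pollutant readings (stand-ins for e.g. PM2.5, PM10, SO2, NO2)
# and a toy two-level air-quality label driven by the first two features.
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Step 1: feature selection by information gain (mutual information estimate).
ig = mutual_info_classif(X, y, random_state=0)
selected = np.argsort(ig)[-2:]  # keep the two most informative features

# Step 2: SVM on the selected features, evaluated with 5-fold cross-validation.
scores = cross_val_score(SVC(kernel="rbf"), X[:, selected], y, cv=5)
```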

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 231
1111 The Impact of Artificial Intelligence on Pharmacy and Pharmacology

Authors: Mamdouh Milad Adly Morkos

Abstract:

Despite having the greatest rates of mortality and morbidity in the world, low- and middle-income (LMIC) nations trail high-income nations in terms of the number of clinical trials, the number of qualified researchers, and the amount of research information specific to their populations. Health inequities and the use of precision medicine may be hampered by a lack of local genomic data, clinical pharmacology and pharmacometrics competence, and training opportunities. These issues can be addressed by developing health care infrastructure, including data gathering and well-designed clinical pharmacology training in LMICs. International cooperation focused on enhancing education and infrastructure and promoting locally motivated clinical trials and research will also be advantageous. This paper outlines various instances where clinical pharmacology knowledge could be put to use, including pharmacogenomic opportunities that could lead to better clinical guideline recommendations. Examples of how clinical pharmacology training can be successfully implemented in LMICs are also provided, including clinical pharmacology and pharmacometrics training programmes in Africa and a Tanzanian researcher's personal experience while on a training sabbatical in the United States. These training initiatives will profit from advocacy for clinical pharmacologists' employment prospects and career development pathways, which are gradually becoming acknowledged and established in LMICs. Advancing training and research infrastructure to increase clinical pharmacologists' knowledge in LMICs would be extremely beneficial, because they have a significant role to play in global health.

Keywords: electromagnetic solar system, nano-material, nano pharmacology, pharmacovigilance, quantum theory, clinical simulation, education, pharmacology, simulation, virtual learning, low- and middle-income, clinical pharmacology, pharmacometrics, career development pathways

Procedia PDF Downloads 70
1110 Nanoparticles in Drug Delivery and Therapy of Alzeheimer's Disease

Authors: Nirupama Dixit, Anyaa Mittal, Neeru Sood

Abstract:

Alzheimer’s disease (AD) is a progressive form of dementia, contributing up to 70% of cases, mostly observed in the elderly but not restricted to old age. The pathophysiology of the disease is characterized by specific pathological changes in the brain. These changes (i.e. accumulation of metal ions in the brain, formation of extracellular β-amyloid (Aβ) peptide aggregates and tangles of hyperphosphorylated Tau protein inside neurons) damage the neuronal connections irreversibly. The current obstacles to improving the quality of life of Alzheimer's patients are that the diagnosis is made at a late stage of the disease and that the medications do not treat the basic causes of Alzheimer's. Targeted drug delivery through the blood brain barrier (BBB) faces several limitations via traditional approaches. To overcome these drug delivery limitations, nanoparticles provide a promising solution. This review focuses on current strategies for efficient targeted drug delivery using nanoparticles and on improving the quality of therapy provided to the patient. Nanoparticles can be used to encapsulate a drug (which is generally hydrophobic) to ensure its passage to the brain; they can be conjugated to metal ion chelators to reduce the metal load in neural tissue, thus lowering the harmful effects of oxidative damage; and they can be conjugated with drugs and monoclonal antibodies against endogenous BBB receptors. Finally, this review covers how nanoparticles can play a role in diagnosing the disease.

Keywords: Alzheimer's disease, β-amyloid plaques, blood brain barrier, metal chelators, nanoparticles

Procedia PDF Downloads 482
1109 Synthesis and Characterization of Biodegradable Elastomeric Polyester Amide for Tissue Engineering Applications

Authors: Abdulrahman T. Essa, Ahmed Aied, Omar Hamid, Felicity R. A. J. Rose, Kevin M. Shakesheff

Abstract:

Biodegradable poly(ester amide)s are promising polymers for biomedical applications such as drug delivery and tissue engineering because of their optimized chemical and physical properties. In this study, we developed a biodegradable polyester amide elastomer, poly(serinol sebacate) (PSS), composed of crosslinked networks based on serinol and sebacic acid. The synthesized polymers were characterized to evaluate their chemical structures, mechanical properties, degradation behaviors and in vitro cytocompatibility. Proton nuclear magnetic resonance and Fourier transform infrared spectroscopy analyses confirmed the structure of the polymer. PSS exhibits excellent solubility in a variety of solvents such as methanol, dimethyl sulfoxide and dimethylformamide. More importantly, the mechanical properties of PSS can be tuned by changing the curing conditions. In addition, 3T3 fibroblast cells cultured on PSS demonstrated good attachment and high viability.

Keywords: biodegradable, biomaterial, elastomer, mechanical properties, poly(serinol sebacate)

Procedia PDF Downloads 347
1108 AutoML: Comprehensive Review and Application to Engineering Datasets

Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili

Abstract:

The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
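As a hedged illustration of the hyperparameter-optimization stage that AutoML automates, the sketch below runs a random search over the regularization strength of a closed-form ridge regression on synthetic data; it is a didactic stand-in, not the workflow of any particular AutoML platform surveyed here:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    # Closed-form ridge regression: w = (X^T X + alpha I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

def cv_mse(X, y, alpha, k=5):
    # k-fold cross-validated mean squared error for one candidate alpha
    idx = np.arange(len(y))
    folds = np.array_split(idx, k)
    errors = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        w = ridge_fit(X[train], y[train], alpha)
        errors.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errors))

# Synthetic regression task standing in for an engineering dataset
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=200)

# Random search over log-uniform candidates: the simplest AutoML-style optimizer
best_alpha, best_err = None, float("inf")
for alpha in 10.0 ** rng.uniform(-4, 2, size=20):
    err = cv_mse(X, y, alpha)
    if err < best_err:
        best_alpha, best_err = alpha, err
print(f"best alpha = {best_alpha:.4g}, CV MSE = {best_err:.4f}")
```

Full AutoML systems replace this inner loop with smarter search strategies (Bayesian optimization, bandits, neural architecture search) and also automate the data-preparation and feature-engineering stages.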

Keywords: automated machine learning, uncertainty, engineering dataset, regression

Procedia PDF Downloads 54
1107 Predicting Options Prices Using Machine Learning

Authors: Krishang Surapaneni

Abstract:

The goal of this project is to determine how to predict important aspects of options, including the ask price. We want to compare different machine learning models to find the best model, and the best hyperparameters for that model, for this purpose and data set. Option pricing is a relatively new field, and it can be very complicated and intimidating, especially to inexperienced people, so we want to create a machine learning model that can predict important aspects of an option, which can aid future research. We tested multiple different models and experimented with hyperparameter tuning, trying to find some of the best parameters for a machine learning model. We tested three different models: a random forest regressor, a linear regressor, and an MLP (multi-layer perceptron) regressor. The most important feature in this experiment is the ask price; this is what we were trying to predict. In the field of stock price prediction, there is a large potential for error, so we cannot judge the models on whether they predict the pricing perfectly. Instead, we determined the accuracy of each model by finding the average percentage difference between the predicted and actual values, comparing the actual results in the testing data with the predictions made by the models. The linear regression model performed worst, with an average percentage error of 17.46%. The MLP regressor had an average percentage error of 11.45%, and the random forest regressor had an average percentage error of 7.42%.
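The accuracy measure described above — the average percentage difference between predicted and actual values — can be made concrete as follows; the prices are illustrative, not drawn from the study's data set:

```python
import numpy as np

def mean_abs_percentage_error(actual, predicted):
    # Average of |predicted - actual| / |actual|, expressed as a percentage
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(100.0 * np.mean(np.abs(predicted - actual) / np.abs(actual)))

# Hypothetical ask prices vs. model predictions (not the study's data)
actual = [10.0, 20.0, 40.0]
predicted = [11.0, 18.0, 42.0]
print(f"{mean_abs_percentage_error(actual, predicted):.2f}%")  # → 8.33%
```

Unlike mean squared error, this metric is scale-free, which matters when ask prices in the data span very different magnitudes; it does, however, break down when an actual price is zero.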

Keywords: finance, linear regression model, machine learning model, neural network, stock price

Procedia PDF Downloads 69
1106 The Influence of Beta Shape Parameters in Project Planning

Authors: Alexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou

Abstract:

Networks can be utilized to represent project planning problems, using nodes for activities and arcs to indicate the precedence relationships between them. For fixed activity durations, a simple algorithm calculates the amount of time required to complete a project, followed by the activities that comprise the critical path. The Program Evaluation and Review Technique (PERT) generalizes the above model by incorporating uncertainty, allowing activity durations to be random variables, but nevertheless produces a relatively crude solution to planning problems. In this paper, based on findings in the relevant literature, which strongly suggest that a Beta distribution can be employed to model earthmoving activities, we use Monte Carlo simulation to estimate the project completion time distribution and measure the influence of skewness, an element inherent in the activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimates.
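A minimal Monte Carlo sketch of this approach, using a hypothetical three-activity network (two parallel activities followed by a successor) and illustrative Beta shape parameters rather than any real project data:

```python
import numpy as np

rng = np.random.default_rng(42)
N_SIMS = 100_000

def beta_duration(a, b, low, high, size):
    # Beta(a, b) sample rescaled to the activity's [low, high] duration range;
    # unequal a, b gives the skewed durations typical of earthmoving activities
    return low + (high - low) * rng.beta(a, b, size)

# Hypothetical network: activities A and B run in parallel, C follows both
dur_a = beta_duration(2, 5, 4, 10, N_SIMS)   # right-skewed duration
dur_b = beta_duration(5, 2, 3, 9, N_SIMS)    # left-skewed duration
dur_c = beta_duration(2, 2, 2, 6, N_SIMS)    # symmetric duration

# Project completion time: C can only start once both A and B finish
completion = np.maximum(dur_a, dur_b) + dur_c

# Criticality index: fraction of runs in which each parallel activity
# determines the start of C, i.e. lies on the critical path
crit_a = float(np.mean(dur_a > dur_b))
print(f"mean completion time: {completion.mean():.2f}")
print(f"criticality index  A: {crit_a:.2f}, B: {1 - crit_a:.2f}")
```

Varying the shape parameters `a` and `b` while keeping the `[low, high]` range fixed isolates the effect of skewness on the completion-time distribution, which is the comparison the study performs.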

Keywords: beta distribution, PERT, Monte Carlo simulation, skewness, project completion time distribution

Procedia PDF Downloads 140
1105 [Keynote Talk]: Computer-Assisted Language Learning (CALL) for Teaching English to Speakers of Other Languages (TESOL/ESOL) as a Foreign Language (TEFL/EFL), Second Language (TESL/ESL), or Additional Language (TEAL/EAL)

Authors: Andrew Laghos

Abstract:

Computer-assisted language learning (CALL) is defined as the use of computers to help learn languages. In this study, we look at several different types of CALL tools and applications and how they can assist adult and young learners in learning the English language as a foreign, second or additional language. It is important to identify the roles of the teacher and the learners, and what the learners’ motivations are for learning the language. Audio, video, interactive multimedia games, online translation services, conferencing, chat rooms, discussion forums, social networks, social media, email communication, songs and music video clips are just some of the many ways computers are currently being used to enhance language learning. CALL may be used for classroom teaching as well as for online and mobile learning. Advantages and disadvantages of CALL are discussed, and the study ends with future predictions for CALL.

Keywords: computer-assisted language learning (CALL), teaching English as a foreign language (TEFL/EFL), adult learners, young learners

Procedia PDF Downloads 422
1104 Multilingual Females and Linguistic Change: A Quantitative and Qualitative Sociolinguistic Case Study of Minority Speaker in Southeast Asia

Authors: Stefanie Siebenhütter

Abstract:

Men and women use minority and majority languages differently and with varying confidence levels. This paper contrasts gendered differences in language use with socioeconomic status and age factors among minority language speakers in Southeast Asia. Language use and competence are conditioned by the variable of gender. Potential reasons for this variation are given by examining gendered language awareness and sociolinguistic attitudes. Moreover, it is analyzed whether women in a multilingual minority-speaker society function as 'leaders of linguistic change', as represented in Labov’s sociolinguistic model, and whether the societal role expectations in collectivistic cultures influence the model of linguistic change. The findings reveal speaking preferences and suggest predictions about prospective language use, which is a stable situation of multilingualism. The study further exhibits differences between male and female identity-forming processes and shows why females are the leaders of (socio-)linguistic change.

Keywords: gender, identity construction, multilingual minorities, linguistic change, social networks

Procedia PDF Downloads 153
1103 Composition Dependent Spectroscopic Studies of Sm3+-Doped Alkali Fluoro Tungsten Tellurite Glasses

Authors: K. Swapna, Sk. Mahamuda, Ch. Annapurna, A. Srinivasa Rao, G. Vijaya Prakash

Abstract:

Samarium-ion-doped Alkali Fluoro Tungsten Tellurite (AFTT) glasses have been prepared using the melt quenching technique and characterized through various spectroscopic techniques such as optical absorption, excitation, emission and decay spectral studies. From the measured absorption spectra of Sm3+ ions in AFTT glasses, the optical band gap and Urbach energies have been evaluated. The spectroscopic parameters such as oscillator strengths (f), Judd-Ofelt (J-O) intensity parameters (Ωλ), spontaneous emission probabilities (AR), branching ratios (βR) and radiative lifetimes (τR) of various excited levels have been determined from the absorption spectrum using J-O analysis. Strong luminescence in the reddish-orange spectral region has been observed for all the Sm3+-doped AFTT glasses. It consists of four emission transitions from the 4G5/2 metastable state to the lower-lying states 6H5/2, 6H7/2, 6H9/2 and 6H11/2 upon exciting the sample with the 478 nm line of an argon ion laser. The stimulated emission cross-sections (σe) and branching ratios (βmeas) were estimated from the emission spectra for all emission transitions. Correlating the radiative lifetime with the experimental lifetime measured from the decay curves allows us to determine the quantum efficiency of the prepared glasses. In order to characterize the colour emission of the prepared glasses under near-UV excitation, the emission intensities were analyzed using the CIE 1931 colour chromaticity diagram. The aforementioned spectral studies carried out on Sm3+-doped AFTT glasses allow us to conclude that these glasses are well suited for orange-red visible lasers.
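The quantum-efficiency step mentioned above — comparing the radiative lifetime from J-O analysis with the experimental lifetime from the decay curves — reduces to a simple ratio; the lifetimes below are illustrative placeholders, not the measured values:

```python
def quantum_efficiency(tau_measured, tau_radiative):
    # eta = tau_exp / tau_rad, expressed as a percentage;
    # both lifetimes must be in the same units (e.g. microseconds)
    return 100.0 * tau_measured / tau_radiative

# Hypothetical lifetimes (illustrative only, not from the paper)
print(quantum_efficiency(1.5, 2.0))  # → 75.0
```

The gap between the two lifetimes reflects non-radiative decay channels, so a ratio close to 100% indicates a glass host with low non-radiative losses.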

Keywords: fluoro tungsten tellurite glasses, Judd-Ofelt intensity parameters, lifetime, stimulated emission cross-section

Procedia PDF Downloads 269
1102 The Sub-Optimality of the Electricity Subsidy on Tube Wells in Balochistan (Pakistan): An Analysis Based on Socio-Cultural and Policy Distortions

Authors: Rameesha Javaid

Abstract:

Agriculture is the backbone of the economy of the province of Balochistan which is known as the ‘fruit basket’ of Pakistan. Its climate zones comprising highlands and plateaus, dependent on rain water, are more suited for the production of deciduous fruit. The vagaries of weather and more so the persistent droughts prompted the government to announce flat rates of electricity bills per month irrespective of the size of the farm, quantum or water used and the category of crop group. That has, no doubt, resulted in increased cropping intensity, more production and employment but has enormously burdened the official exchequer which picks up the residual bills in certain percentages amongst the federal and provincial governments and the local electricity company. This study tests the desirability of continuing the subsidy in the present mode. Optimization of social welfare of farmers has been the focus of the study with emphasis on the contribution of positive externalities and distortions caused in terms of negative externalities. By using the optimization technique with due allowance for distortions, it has been established that the subsidy calls for limiting policy distortions as they cause sub-optimal utilization of the tube well subsidy and improved policy programming. The sensitivity analysis with changed rankings of contributing variables towards social welfare does not significantly change the result. Therefore it leads to the net findings and policy recommendations of significantly reducing the subsidy size, correcting and curtailing policy distortions and targeting the subsidy grant more towards small farmers to generate more welfare by saving a sizeable amount from the subsidy for investment in the wellbeing of the farmers in rural Balochistan.

Keywords: distortion, policy distortion, socio-cultural distortion, social welfare, subsidy

Procedia PDF Downloads 282
1101 Tehran Province Water and Wastewater Company Approach on Energy Efficiency by the Development of Renewable Energy to Achieving the Sustainable Development Legal Principle

Authors: Mohammad Parvaresh, Mahdi Babaee, Bahareh Arghand, Roushanak Fahimi Hanzaee, Davood Nourmohammadi

Abstract:

Today, the intelligent water and wastewater network is one of the key steps toward realizing the smart city worldwide. Pressure relief valves are necessary in Tehran's urban water networks to reduce pressure, but their use leads to wasted water, higher power consumption, and environmental pollution, given that Tehran Province Water and Wastewater Co. consumes a quarter of the industry's electricity. In this regard, Tehran Province Water and Wastewater Co. identified solutions to reduce the direct and indirect costs of energy use in the production, transmission and distribution of water, because the company has extensive facilities and high capacity to realize a green economy and industry. The aim of this study is to analyze this new project in the water and wastewater industry with a view to reaching sustainable development.

Keywords: Tehran Province Water and Wastewater Company, water network efficiency, sustainable development, International Environmental Law

Procedia PDF Downloads 285
1100 Images Selection and Best Descriptor Combination for Multi-Shot Person Re-Identification

Authors: Yousra Hadj Hassen, Walid Ayedi, Tarek Ouni, Mohamed Jallouli

Abstract:

To re-identify a person is to check whether he/she has already been seen over a camera network. Recently, re-identifying people over large public camera networks has become a crucial task of great importance to ensure public security. The vision community has deeply investigated this area of research. Most existing research relies only on spatial appearance information from either one or multiple person images, whereas the real person re-id setting is a multi-shot scenario. Efficiently modeling a person’s appearance and choosing the best samples remain challenging problems. In this work, an extensive comparison of state-of-the-art descriptors combined with the proposed frame selection method is studied. Specifically, we evaluate the sample selection approach using multiple proposed descriptors. We show the effectiveness and advantages of the proposed method through extensive comparisons with related state-of-the-art approaches on two standard datasets, PRID2011 and iLIDS-VID.

Keywords: camera network, descriptor, model, multi-shot, person re-identification, selection

Procedia PDF Downloads 270