Search results for: physics-informed neural network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5283

2163 A Social Network Analysis for Formulating Construction Defect Generation Mechanisms

Authors: Hamad Aljassmi, Sangwon Han

Abstract:

Various solutions for preventing construction defects have been suggested. However, a construction company may have difficulty adopting all these suggestions due to financial and practical constraints. Based on this recognition, this paper aims to identify the most significant defect causes and formulate their defect generation mechanism in order to help a construction company set priorities for its defect prevention strategies. To this end, we conducted a questionnaire survey of 106 industry professionals and identified the five most significant causes: (1) organizational culture, (2) time pressure and constraints, (3) workplace quality system, (4) financial constraints upon operational expenses and (5) inadequate employee training or learning opportunities.

Keywords: defect, quality, failure, risk

Procedia PDF Downloads 627
2162 A Method for False Alarm Recognition Based on Multi-Classification Support Vector Machine

Authors: Weiwei Cui, Dejian Lin, Leigang Zhang, Yao Wang, Zheng Sun, Lianfeng Li

Abstract:

Built-in test (BIT) is an important technology in the testability field, and it is widely used in state monitoring and fault diagnosis. With the improvement of modern equipment performance and complexity, the scope of BIT becomes larger, which leads to the emergence of the false alarm problem. False alarms make health assessment unstable and reduce the effectiveness of BIT. Conventional false alarm suppression methods such as repeated testing and majority voting cannot meet the requirements of a complicated system, so intelligent algorithms such as artificial neural networks (ANNs) are widely studied and used. However, false alarms occur at a very low frequency and yield only small samples, whereas an ANN-based method requires a large training set. To recognize false alarms, we propose a method based on a multi-classification support vector machine (SVM) in this paper. Firstly, we divide the state of a system into three classes: healthy, false-alarm, and faulty. Then we use multi-class classification with a '1 vs 1' policy to train and recognize the state of a system. Finally, an example of a fault injection system is used to verify the effectiveness of the proposed method by comparison with an ANN. The results show that the method is reasonable and effective.
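
As an illustration of the '1 vs 1' multi-class scheme the abstract describes, the sketch below trains a three-class SVM on synthetic data; the features, class sizes, and kernel are placeholder assumptions, not the paper's BIT indicators (scikit-learn's SVC handles multi-class problems with a one-vs-one scheme internally).

```python
# Illustrative sketch (not the authors' code): a three-class SVM with a
# one-vs-one decision scheme. Feature values and labels are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic BIT indicator vectors for three system states:
# 0 = healthy, 1 = false alarm, 2 = faulty
X = np.vstack([
    rng.normal(0.0, 0.3, size=(60, 5)),   # healthy
    rng.normal(0.8, 0.3, size=(20, 5)),   # false alarm (small sample)
    rng.normal(2.0, 0.3, size=(40, 5)),   # faulty
])
y = np.array([0] * 60 + [1] * 20 + [2] * 40)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# scikit-learn's SVC trains one binary SVM per class pair ('1 vs 1' policy)
clf = SVC(kernel="rbf", decision_function_shape="ovo")
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```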

Keywords: false alarm, fault diagnosis, SVM, k-means, BIT

Procedia PDF Downloads 155
2161 A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine

Authors: Xiaodong Li, Peng Gao, Chao-Jung Huang, Shiying Hao, Xuefeng B. Ling, Yongxia Han, Yaqi Zhang, Le Zheng, Chengyin Ye, Modi Liu, Minjie Xia, Changlin Fu, Bo Jin, Karl G. Sylvester, Eric Widen

Abstract:

Predicting the risk of Pancreatic Adenocarcinoma (PA) in advance can benefit the quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new incident PA as early as 3 months before the onset of PA in a statewide, general population in Maine. The PA prediction model was developed using Deep Neural Networks, a deep learning algorithm, with a 2-year electronic-health-record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true-alert rate of 67.62%. The risk assessment tool attained improved discriminative ability. It can be immediately deployed in the health system to provide automatic early warnings to adults at risk of PA, and it has the potential to identify personalized risk factors to facilitate customized PA interventions.
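
A minimal sketch of a feed-forward risk model in the spirit of the abstract is given below; it is not the authors' implementation, and the features, network size, and data are invented stand-ins for the EHR cohort.

```python
# Sketch of a feed-forward risk scorer (not the authors' model).
# Features and data are synthetic placeholders for EHR-derived variables.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 1000
# Hypothetical EHR features: age, lab values, prior diagnoses, visit counts...
X = rng.normal(size=(n, 20))
y = (rng.random(n) < 0.05).astype(int)  # rare positive class (incident PA)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=1),
)
model.fit(X, y)

# Risk scores for new patients; thresholding the score yields early alerts
risk = model.predict_proba(X[:5])[:, 1]
print(risk)
```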

Keywords: cancer prediction, deep learning, electronic health records, pancreatic adenocarcinoma

Procedia PDF Downloads 155
2160 Survey: Topology Hiding in Multipath Routing Protocol in MANET

Authors: Akshay Suhas Phalke, Manohar S. Chaudhari

Abstract:

In this paper, we discuss multipath routing and its variants. Our purpose is to review the different types of multipath routing mechanisms and to present a taxonomy of multipath routing. Multipath routing is used for alternate-path routing, reliable data transmission, and better utilization of network resources. We also discuss multipath routing for topology hiding, such as TOHIP. In multipath routing, parameters such as energy efficiency, packet delivery ratio, shortest-path routing, and fault tolerance play an important role. Finally, we compare a number of multipath routing protocols based on these parameters.

Keywords: multi-path routing, WSN, topology, fault detection, trust

Procedia PDF Downloads 354
2159 A Local Tensor Clustering Algorithm to Annotate Uncharacterized Genes with Many Biological Networks

Authors: Paul Shize Li, Frank Alber

Abstract:

A fundamental task of clinical genomics is to unravel the functions of genes and their associations with disorders. Although experimental biology has made efforts to discover and elucidate the molecular mechanisms of individual genes in the past decades, still about 40% of human genes have unknown functions, not to mention the diseases they may be related to. For biologists who are interested in a particular gene with unknown functions, a powerful computational method tailored for inferring the functions and disease relevance of uncharacterized genes is strongly needed. Studies have shown that genes strongly linked to each other in multiple biological networks are more likely to have similar functions. This indicates that the densely connected subgraphs in multiple biological networks are useful in the functional and phenotypic annotation of uncharacterized genes. Therefore, in this work, we have developed an integrative network approach to identify the frequent local clusters, which are defined as those densely connected subgraphs that frequently occur in multiple biological networks and contain the query gene that has few or no disease or function annotations. This is a local clustering algorithm that models multiple biological networks sharing the same gene set as a three-dimensional matrix, the so-called tensor, and employs a tensor-based optimization method to efficiently find the frequent local clusters. Specifically, massive public gene expression data sets that comprehensively cover dynamic, physiological, and environmental conditions are used to generate hundreds of gene co-expression networks. By integrating these gene co-expression networks, for a given uncharacterized gene of interest to a biologist, the proposed method can be applied to identify the frequent local clusters that contain this uncharacterized gene. Finally, those frequent local clusters are used for function and disease annotation of this uncharacterized gene. This local tensor clustering algorithm outperformed the competing tensor-based algorithm in both module discovery and running time. We also demonstrated the use of the proposed method on real data comprising hundreds of gene co-expression networks and showed that it can comprehensively characterize the query gene. Therefore, this study provides a new tool for annotating uncharacterized genes and has great potential to assist clinical genomic diagnostics.
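
The sketch below illustrates, in a highly simplified form, the tensor view described in the abstract: co-expression networks over a shared gene set stacked into a gene x gene x network array, with the genes most frequently linked to a query gene used as a crude proxy for a frequent local cluster. It is not the authors' tensor-based optimization, and all data are synthetic.

```python
# Toy illustration of the tensor view (not the authors' algorithm).
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_networks = 50, 30
genes = [f"g{i}" for i in range(n_genes)]

# Synthetic adjacency tensor: [i, j, k] = co-expression of genes i, j in network k
T = rng.random((n_genes, n_genes, n_networks))
T = (T + T.transpose(1, 0, 2)) / 2          # keep each network symmetric

query = 0                                    # index of the uncharacterized query gene
edge_threshold = 0.8

# For each gene, count in how many networks it is strongly linked to the query
freq = (T[query] > edge_threshold).sum(axis=1)
freq[query] = 0

candidates = np.argsort(freq)[::-1][:5]
for g in candidates:
    print(genes[g], "linked to the query in", freq[g], "of", n_networks, "networks")
```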

Keywords: local tensor clustering, query gene, gene co-expression network, gene annotation

Procedia PDF Downloads 168
2158 Collaborative and Experimental Cultures in Virtual Reality Journalism: From the Perspective of Content Creators

Authors: Radwa Mabrook

Abstract:

Virtual Reality (VR) content creation is a complex and expensive process, which requires multi-disciplinary teams of content creators. Grant schemes from technology companies help media organisations to explore the VR potential in journalism and factual storytelling. Media organisations try to do as much as they can in-house, but they may outsource due to time constraints and skill availability. Journalists, game developers, sound designers and creative artists work together and bring in new cultures of work. This study explores the collaborative, experimental nature of VR content creation, through tracing every actor involved in the process and examining their perceptions of the VR work. The study builds on Actor Network Theory (ANT), which decomposes phenomena into their basic elements and traces the interrelations among them. Therefore, the researcher conducted 22 semi-structured interviews with VR content creators between November 2017 and April 2018. Purposive and snowball sampling techniques allowed the researcher to recruit fact-based VR content creators from production studios and media organisations, as well as freelancers. Interviews lasted up to three hours, and they were a mix of Skype calls and in-person interviews. Participants consented to their interviews being recorded and to their names being revealed in the study. The researcher coded the interview transcripts in NVivo software, looking for key themes that correspond with the research questions. The study revealed that VR content creators must be adaptive to change, open to learning and comfortable with mistakes. The VR content creation process is very iterative because VR has no established workflow or visual grammar. Multi-disciplinary VR team members often speak different languages, making it hard to communicate. However, adaptive content creators perceive VR work as a fun experience and an opportunity to learn. The traditional sense of competition and the striving for information exclusivity are now replaced by a strong drive for knowledge sharing. VR content creators are open to sharing their methods of work and their experiences. They aim to build a collaborative network that harnesses VR technology for journalism and factual storytelling. Indeed, VR is instilling collaborative and experimental cultures in journalism.

Keywords: collaborative culture, content creation, experimental culture, virtual reality

Procedia PDF Downloads 127
2157 Extracting Attributes for Twitter Hashtag Communities

Authors: Ashwaq Alsulami, Jianhua Shao

Abstract:

Various organisations often need to understand discussions on social media, such as what the trending topics are and the characteristics of the people engaged in the discussion. A number of approaches have been proposed to extract attributes that would characterise a discussion group. However, these approaches are largely based on supervised learning, and as such they require a large amount of labelled data. In this paper, we propose an approach that does not require labelled data but relies on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.
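
A toy sketch of lexicon-based (label-free) attribute detection is shown below; the lexicon, tweets, and attribute categories are invented placeholders rather than the lexical sources used in the paper.

```python
# Illustrative sketch of unsupervised, lexicon-based attribute detection for a
# hashtag community. Lexicon, tweets, and categories are invented examples.
from collections import Counter

# Hypothetical lexical source mapping terms to candidate attributes
lexicon = {
    "marathon": "sports", "training": "sports",
    "vegan": "diet", "recipe": "diet",
    "startup": "business", "funding": "business",
}

community_tweets = [
    "Early morning training run before the marathon this weekend",
    "New vegan recipe thread, feedback welcome",
    "Funding round closed for our startup",
]

counts = Counter()
for tweet in community_tweets:
    for token in tweet.lower().split():
        if token in lexicon:
            counts[lexicon[token]] += 1

# The most frequent lexicon categories serve as detected community attributes
print(counts.most_common())
```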

Keywords: attributed community, attribute detection, community, social network

Procedia PDF Downloads 162
2156 Using Deep Learning in Lyme Disease Diagnosis

Authors: Teja Koduru

Abstract:

Untreated Lyme disease can lead to neurological, cardiac, and dermatological complications. Rapid diagnosis of the erythema migrans (EM) rash, a characteristic symptom of Lyme disease, is therefore crucial to early diagnosis and treatment. In this study, we aim to utilize deep learning frameworks, including TensorFlow and Keras, to create deep convolutional neural networks (DCNN) to detect acute Lyme disease from images of erythema migrans. This study uses a custom database of erythema migrans images of varying quality to train a DCNN capable of classifying images of EM rashes vs. non-EM rashes. Images from publicly available sources were mined to create an initial database. Machine-based removal of duplicate images was then performed, followed by a thorough examination of all images by a clinician. The resulting database was combined with images of confounding rashes and regular skin, resulting in a total of 683 images. This database was then used to create a DCNN with an accuracy of 93% when classifying images of rashes as EM vs. non-EM. Finally, this model was converted into a web and mobile application to allow for rapid diagnosis of EM rashes by both patients and clinicians. This tool could be used for patient prescreening prior to treatment and could lead to a lower mortality rate from Lyme disease.
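
The sketch below shows a minimal tf.keras binary classifier of the kind the abstract describes; the architecture, image size, and dataset path are illustrative assumptions, not the authors' DCNN.

```python
# Minimal EM / non-EM image classifier sketch with tf.keras (TensorFlow, Keras
# are the frameworks named in the abstract). Architecture and sizes are assumed.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # EM vs. non-EM
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# With a directory of labelled images (hypothetical path), training could be:
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "em_rash_dataset/", image_size=(224, 224), batch_size=32)
# model.fit(train_ds, epochs=10)
```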

Keywords: Lyme, untreated Lyme, erythema migrans rash, EM rash

Procedia PDF Downloads 241
2155 Minimizing Fresh and Wastewater Using Water Pinch Technique in Petrochemical Industries

Authors: Wasif Mughees, Malik Al-Ahmad, Muhammad Naeem

Abstract:

This research involves the design and analysis of pinch-based water/wastewater networks to minimize water utility in the petrochemical and petroleum industries. A study was carried out on the Tehran Oil Refinery to analyze the feasibility of regeneration, reuse and recycling within the water network. COD is considered as the single key contaminant. The amount of freshwater was reduced by about 149 m³/h (43.8%) with respect to COD. Re-design (or retrofitting) of water allocation in the networks was undertaken. The results were analyzed through a graphical method and a mathematical programming technique, which clearly demonstrated that the amount of required water is determined by the mass transfer of COD.
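
For orientation, the toy calculation below illustrates single-contaminant water-pinch targeting with COD as the key contaminant; the operations, loads, and concentration limits are invented numbers, not the Tehran refinery data.

```python
# Toy single-contaminant water-pinch illustration (Wang & Smith style).
# All operations and limits below are invented for illustration only.

# Each water-using operation: COD mass load (kg/h) and max inlet/outlet COD (ppm)
operations = [
    {"name": "desalter wash",  "load_kg_h": 2.0,  "cin_max": 0,  "cout_max": 100},
    {"name": "stripper",       "load_kg_h": 5.0,  "cin_max": 50, "cout_max": 400},
    {"name": "cooling makeup", "load_kg_h": 30.0, "cin_max": 50, "cout_max": 800},
]

for op in operations:
    # Limiting water flowrate (t/h): mass load / allowable concentration rise
    # (1 kg/h picked up over a 100 ppm rise corresponds to 10 t/h of water)
    f_lim = op["load_kg_h"] / (op["cout_max"] - op["cin_max"]) * 1000
    # Minimum freshwater (0 ppm feed) to carry the same load: load / cout_max
    f_fresh = op["load_kg_h"] / op["cout_max"] * 1000
    print(f"{op['name']}: limiting flow {f_lim:.1f} t/h, "
          f"freshwater-only requirement {f_fresh:.1f} t/h")
```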

Keywords: minimization, water pinch, water management, pollution prevention

Procedia PDF Downloads 448
2154 Results of Three-Year Operation of 220kV Pilot Superconducting Fault Current Limiter in Moscow Power Grid

Authors: M. Moyzykh, I. Klichuk, L. Sabirov, D. Kolomentseva, E. Magommedov

Abstract:

Modern city electrical grids are forced to increase their density due to the increasing number of customers and requirements for reliability and resiliency. However, progress in this direction is often limited by the capabilities of existing network equipment. New energy sources or grid connections increase the level of short-circuit currents in the adjacent network, which can exceed the maximum ratings of equipment: the breaking capacity of circuit breakers and the thermal and dynamic current withstand capabilities of disconnectors, cables, and transformers. The superconducting fault current limiter (SFCL) is a modern solution designed to deal with the increasing fault current levels in power grids. The key feature of this device is its instant (less than 2 ms) limitation of the current level due to the nature of the superconductor. In 2019, Moscow utilities installed a SuperOx SFCL in the city power grid to test the capabilities of this novel technology. It became the first SFCL in the Russian energy system and is currently the most powerful SFCL in the world. The modern SFCL uses second-generation high-temperature superconductor (2G HTS). Despite its name, HTS still requires the low temperature of liquid nitrogen for operation. As a result, the Moscow SFCL is built with a cryogenic system to provide cooling to the superconductor. The cryogenic system consists of three cryostats that contain the superconductor parts and are filled with liquid nitrogen (one for each of the three phases), three cryocoolers, one water chiller, three cryopumps, and pressure builders. All these components are controlled by an automatic control system. The SFCL has been continuously operating on the city grid for over three years. During that period of operation, numerous faults occurred, including cryocooler failure, chiller failure, pump failure, and others (such as a cryogenic system power outage). All these faults were resolved without shutting down the SFCL, thanks to the specially designed cryogenic system backups and the quick responses of the grid operator and the SuperOx crew. The paper will describe in detail the results of SFCL operation and cryogenic system maintenance, and what measures were taken to solve and prevent similar faults in the future.

Keywords: superconductivity, current limiter, SFCL, HTS, utilities, cryogenics

Procedia PDF Downloads 80
2153 Ant-Tracking Attribute: A Model for Understanding Production Response

Authors: Prince Suka Neekia Momta, Rita Iheoma Achonyeulo

Abstract:

The Ant Tracking seismic attribute, applied over a 4-second seismic volume, revealed structural features triggered by clay diapirism, growth-fault development, rapid deltaic sedimentation and intense drilling. The attribute was extracted on vertical seismic sections and time slices. Mega-scale tectonic structures such as growth faults and clay diapirs are visible on vertical sections, while minor lineaments or fractures are obscured. Fractures are distinctly visible on time slices, yielding recognizable patterns that corroborate established geologic models. This seismic attribute model enabled an understanding of fluid-flow characteristics and production responses. Three structural patterns are recognized in the field: major growth faults, minor faults or lineaments, and networks of fractures. Three growth faults mapped on the seismic section form major deformation bands delimiting the area into three blocks or depocenters. The growth faults trend E-W, dip down-to-south in the basin direction, and cut across the study area. The faults initiate from about 2000 ms, extend up to 500 ms, and tend to progress parallel and opposite to the growth direction of an upsurging diapiric structure. The diapiric structures form the major deformational bands originating from great depths (below 2000 ms) and rising to about 1200 ms, where series of sedimentary layers onlap and pinch out stratigraphically against the diapir. Several secondary faults or lineaments that form parallel streaks to one another also accompany the growth faults. The fracture networks have no particular trend but surround the well area. The faults identified in the study area have potential as structural hydrocarbon traps, whereas the presence of fractures created a fractured-reservoir condition that enhanced rapid fluid flow, especially of water. High aquifer flow potential, aided by possible fracture permeability, resulted in a rapid decline in oil rate. Through the application of the Ant Tracking attribute, it is possible to obtain a detailed interpretation of structures that can have a direct influence on oil and gas production.

Keywords: seismic, attributes, production, structural

Procedia PDF Downloads 70
2152 Is There a Group of "Digital Natives" at Secondary Schools?

Authors: L. Janská, J. Kubrický

Abstract:

The article describes research focused on the influence of information and communication technology (ICT) on pupils' learning. The investigation examines the characteristics that distinguish the group of pupils influenced by ICT from the group of pupils not influenced by ICT. The group influenced by ICT should evince a different approach in a number of areas (managing two or more activities at once, quick orientation and searching for information on the Internet, the ability to quickly and effectively assess data sources, the assessment of attitudes and opinions of other network users, critical thinking, a preference for working in teams, the sharing of information and personal data via virtual social networks, insistence on an immediate reaction to their every action, etc.).

Keywords: ICT influence, digital natives, pupils' learning

Procedia PDF Downloads 291
2151 Power Management in Wireless Combustible Gas Sensors

Authors: Denis Spirjakin, Alexander Baranov, Saba Akbari, Natalia Kalenova, Vladimir Sleptsov

Abstract:

In this paper, we propose an approach to power management in wireless combustible gas sensors. This approach makes it possible to drastically prolong the autonomous lifetime of sensor nodes. This is necessary to tie battery replacement to the annual technical service procedures required by safety standards. Using this approach, the current consumption of the wireless combustible gas sensor node was decreased from 80 mA to less than 2 mA, and the power consumption from more than 220 mW to 4.6 mW. These values provide a node autonomous lifetime of more than one year.
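
A back-of-the-envelope sketch of the duty-cycle power budgeting behind such figures is given below; the duty cycle, sleep current, supply voltage, and battery capacity are assumed values for illustration only.

```python
# Duty-cycle power budget sketch (assumed numbers, not the paper's design).
V_SUPPLY = 3.0          # V (assumed)
I_ACTIVE_MA = 80.0      # mA while the sensor is heated and measuring
I_SLEEP_MA = 0.05       # mA in deep sleep between measurements (assumed)
T_ACTIVE_S = 1.0        # s per measurement (assumed)
PERIOD_S = 60.0         # one measurement per minute (assumed)
BATTERY_MAH = 19000.0   # e.g. a D-size lithium cell (assumed)

duty = T_ACTIVE_S / PERIOD_S
i_avg_ma = duty * I_ACTIVE_MA + (1 - duty) * I_SLEEP_MA
p_avg_mw = i_avg_ma * V_SUPPLY

lifetime_h = BATTERY_MAH / i_avg_ma
print(f"average current: {i_avg_ma:.2f} mA, average power: {p_avg_mw:.2f} mW")
print(f"estimated lifetime: {lifetime_h / 24 / 365:.1f} years")
```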

Keywords: gas sensors, power management, wireless sensor network

Procedia PDF Downloads 724
2150 AIR SAFE: an Internet of Things System for Air Quality Management Leveraging Artificial Intelligence Algorithms

Authors: Mariangela Viviani, Daniele Germano, Simone Colace, Agostino Forestiero, Giuseppe Papuzzo, Sara Laurita

Abstract:

Nowadays, people spend most of their time in closed environments, in offices, or at home. Therefore, secure and highly livable environmental conditions are needed to reduce the probability of aerial viruses spreading. Also, to lower the human impact on the planet, it is important to reduce energy consumption. Heating, Ventilation, and Air Conditioning (HVAC) systems account for the major part of energy consumption in buildings [1]. Devising systems to control and regulate the airflow is, therefore, essential for energy efficiency. Moreover, an optimal setting for thermal comfort and air quality is essential for people’s well-being, at home or in offices, and increases productivity. Thanks to the features of Artificial Intelligence (AI) tools and techniques, it is possible to design innovative systems with: (i) Improved monitoring and prediction accuracy; (ii) Enhanced decision-making and mitigation strategies; (iii) Real-time air quality information; (iv) Increased efficiency in data analysis and processing; (v) Advanced early warning systems for air pollution events; (vi) Automated and cost-effective monitoring network; and (vii) A better understanding of air quality patterns and trends. We propose AIR SAFE, an IoT-based infrastructure designed to optimize air quality and thermal comfort in indoor environments leveraging AI tools. AIR SAFE employs a network of smart sensors collecting indoor and outdoor data to be analyzed in order to take any corrective measures to ensure the occupants’ wellness. The data are analyzed through AI algorithms able to predict the future levels of temperature, relative humidity, and CO₂ concentration [2]. Based on these predictions, AIR SAFE takes actions, such as opening/closing the window or the air conditioner, to guarantee a high level of thermal comfort and air quality in the environment. In this contribution, we present the results from the AI algorithm we have implemented on the first set of data collected in a real environment. The results were compared with other models from the literature to validate our approach.
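
The sketch below illustrates the prediction-plus-actuation idea conceptually: a simple autoregressive model forecasts the next CO₂ reading and a threshold rule decides whether to ventilate. It is not the AIR SAFE implementation; the sensor readings and the 1000 ppm threshold are assumptions.

```python
# Conceptual forecast-and-act loop (not the AIR SAFE code). Data are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

co2_ppm = np.array([520, 540, 575, 610, 640, 665, 700, 735, 760, 790,
                    820, 845, 880, 910, 940, 975], dtype=float)

LAGS = 3
X = np.array([co2_ppm[i:i + LAGS] for i in range(len(co2_ppm) - LAGS)])
y = co2_ppm[LAGS:]

model = LinearRegression().fit(X, y)
next_co2 = model.predict(co2_ppm[-LAGS:].reshape(1, -1))[0]

CO2_LIMIT = 1000  # ppm comfort threshold (assumed)
action = "open window / increase ventilation" if next_co2 > CO2_LIMIT else "no action"
print(f"predicted next CO2: {next_co2:.0f} ppm -> {action}")
```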

Keywords: air quality, internet of things, artificial intelligence, smart home

Procedia PDF Downloads 93
2149 Feedback of Using Set-Up Candid Clips as New Media

Authors: Miss Suparada Prapawong

Abstract:

The objectives were to analyze the use of new media in the form of set-up candid clips and their effect on the product and presenter, to study the effectiveness of set-up candid clips in increasing circulation and audience satisfaction, and to use the knowledge gained to develop communication for publicizing and advertising via new media. This is qualitative research based on questionnaires and in-depth interviews with experts. The findings showed the advantages and disadvantages of communication for publicizing and advertising via new media in the form of set-up candid clips, including the specific target group for this kind of advertising. The results will be useful for the fields of publicizing and advertising in current new media forms.

Keywords: candid clip, communication, new media, social network

Procedia PDF Downloads 308
2148 Politics in Academia: How the Diffusion of Innovation Relates to Professional Capital

Authors: Autumn Rooms Cypres, Barbara Driver

Abstract:

The purpose of this study is to extend discussions about innovations and career politics. Research questions that grounded this effort were: How does an academic learn the unspoken rules of the academy? What happens politically to an academic’s career when their research speaks against the grain of society? Do professors perceive signals that it is time to move on to another institution or even to another career? Epistemology and Methods: This qualitative investigation focused on examining the perceptions of academics. Therefore, an open-ended field study, based on Grounded Theory, was used. This naturalistic paradigm (Lincoln & Guba, 1985) was selected because it tends to understand information in terms of wholes, of patterns, and in relation to the context of the environment. The technique for gathering data was semi-structured, in-depth interviewing. Twenty-five academics across the United States were interviewed about their career trajectories and the politics and opportunities they have encountered in relation to their research efforts. Findings: The analysis of interviews revealed four themes: Academics are beholden to two specific networks of power that influence their sense of job security: the local network based on their employing university and the national network of scholars who share the same field of research. The fights over what counts as research can and do drift from the intellectual to the political, and personal. Academics were able to identify specific instances of shunning and/or punishment from their colleagues related directly to the dissemination of research that spoke against the grain of the local or national networks. Academics identified specific signals from both of these networks indicating that their career was flourishing or withering. Implications: This research examined insights from those who persevered when the fights over what and who counts drifted from the intellectual to the political, and the personal. Considerations of why such drifts happen were offered in the form of a socio-political construct called Fit, which included thoughts on hegemony, discourse, and identity. This effort reveals the importance of understanding what professional capital is relative to job security. It also reveals that fear is an enmeshed and often unspoken part of the culture of academia. Further research to triangulate these findings would be helpful within international contexts.

Keywords: politics, academia, job security, context

Procedia PDF Downloads 321
2147 Development of Geo-computational Model for Analysis of Lassa Fever Dynamics and Lassa Fever Outbreak Prediction

Authors: Adekunle Taiwo Adenike, I. K. Ogundoyin

Abstract:

Lassa fever is a neglected tropical disease that has become a significant public health issue in Nigeria, the country with the greatest burden in Africa. This paper presents a geo-computational model for the analysis and prediction of Lassa fever dynamics and outbreaks in Nigeria. The model investigates the dynamics of the virus with respect to environmental factors and human populations. It confirms the role of the rodent host in virus transmission and identifies how climate and human population are affected. The proposed methodology is carried out on a Linux operating system using the OSGeoLive virtual machine for geographical computing, which serves as a base for spatial ecology computing. The model design uses the Unified Modeling Language (UML), and the performance evaluation uses machine learning algorithms such as random forest, fuzzy logic, and neural networks. The study aims to contribute to the control of Lassa fever, which is achievable through the combined efforts of public health professionals and geo-computational and machine learning tools. The research findings will potentially be more readily accepted and utilized by decision-makers for the attainment of Lassa fever elimination.
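
As an illustration of one of the evaluation models named above (random forest), the sketch below fits a classifier to hypothetical environmental and population features; the features, data, and labels are invented stand-ins for the study's inputs.

```python
# Illustrative random-forest outbreak classifier (not the study's model or data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 400
# Hypothetical features per district-month: rainfall (mm), temperature (C),
# rodent-habitat index, population density, past-case count
X = np.column_stack([
    rng.gamma(5, 20, n),        # rainfall
    rng.normal(27, 3, n),       # temperature
    rng.random(n),              # rodent-habitat index
    rng.lognormal(5, 1, n),     # population density
    rng.poisson(2, n),          # past cases
])
# Synthetic outbreak label driven by rodent habitat and past cases
y = (X[:, 4] + 3 * X[:, 2] + rng.normal(0, 1, n) > 4).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=3)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```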

Keywords: geo-computational model, Lassa fever dynamics, Lassa fever, outbreak prediction, Nigeria

Procedia PDF Downloads 94
2146 Set Up Candid Clips Effectiveness

Authors: P. Suparada, D. Eakapotch

Abstract:

The objectives were to analyze the use of new media in the form of set-up candid clips and their effect on the product and presenter, to study the effectiveness of set-up candid clips in increasing circulation and audience satisfaction, and to use the knowledge gained to develop communication for publicizing and advertising via new media. This is qualitative research based on questionnaires and in-depth interviews with experts. The findings showed the advantages and disadvantages of communication for publicizing and advertising via new media in the form of set-up candid clips, including the specific target group for this kind of advertising. The results will be useful for the fields of publicizing and advertising in current new media forms.

Keywords: candid clip, communication, new media, social network

Procedia PDF Downloads 248
2145 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition

Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can

Abstract:

To effectively combat climate change, many countries around the world have committed to decarbonising their electricity supply, along with promoting large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity to effectively combat climate change, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty in power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially for application to realistic-sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems, such as the TNEP. In particular, the use of AI along with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method. One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of a mixed-integer nature, and therefore solving them requires significant amounts of time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that the binary classifier is integrated into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% for estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than its integer-programming counterpart, the integration of the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required for solving the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models.
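
The sketch below illustrates the key idea conceptually: a classifier trained on linear-relaxation values predicts the binary investment decisions, and only the variables it is unsure about are left to an exact solve so optimality can still be guaranteed. The features, data, and confidence threshold are invented; this is not the authors' formulation or code.

```python
# Conceptual sketch: learn binary subproblem decisions from the LP relaxation
# (stand-in features and data, not the authors' TNEP model).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n_samples, n_candidates = 500, 10

# Features: fractional values of candidate-line variables in the LP relaxation
X_relaxed = rng.random((n_samples, n_candidates))
# Labels: binary values those variables took in the integer solution
# (synthesized here by thresholding plus noise)
y_binary = (X_relaxed + rng.normal(0, 0.15, X_relaxed.shape) > 0.5).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=4)
clf.fit(X_relaxed.reshape(-1, 1), y_binary.reshape(-1))

def classify_subproblem(lp_solution, confidence=0.97):
    """Fix binary variables the classifier is confident about; flag the rest
    for an exact MIP solve so optimality can still be guaranteed."""
    proba = clf.predict_proba(lp_solution.reshape(-1, 1))[:, 1]
    fixed = {i: int(p > 0.5) for i, p in enumerate(proba)
             if max(p, 1 - p) >= confidence}
    undecided = [i for i in range(len(proba)) if i not in fixed]
    return fixed, undecided

fixed, undecided = classify_subproblem(rng.random(n_candidates))
print("fixed by classifier:", fixed, "| left to the MIP solver:", undecided)
```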

Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning

Procedia PDF Downloads 85
2144 A Horn Antenna Loaded with FSS of Crossed Dipoles

Authors: Ibrahim Mostafa El-Mongy, Abdelmegid Allam

Abstract:

In this article, an analysis and investigation of the effect of loading a horn antenna with a finite-size frequency selective surface (FSS) of crossed dipoles is presented. The FSS is fabricated on Rogers RO4350 (lossy) with a relative permittivity of 3.33, a thickness of 1.524 mm and a loss tangent of 0.004. It is applied for filtering and for minimizing interference and noise in the desired band. The filtration is carried out using a finite FSS of crossed dipoles with overall dimensions of 98 × 58 mm². The filtration is shown by limiting the transmission bandwidth from 4 GHz (8–12 GHz) to 0.25 GHz (10.75–11 GHz). The structure is simulated using CST MWS and measured using a network analyzer. There is good agreement between the simulated and measured results.

Keywords: antenna, filtenna, frequency selective surface (FSS), horn

Procedia PDF Downloads 458
2143 A Survey and Theory of the Effects of Various Hamlet Videos on Viewers’ Brains

Authors: Mark Pizzato

Abstract:

How do ideas, images, and emotions in stage-plays and videos affect us? Do they evoke a greater awareness (or cognitive reappraisal of emotions) through possible shifts between left-cortical, right-cortical, and subcortical networks? To address these questions, this presentation summarizes the research of various neuroscientists, especially Bernard Baars and others involved in Global Workspace Theory, Matthew Lieberman in social neuroscience, Iain McGilchrist on left and right cortical functions, and Jaak Panksepp on the subcortical circuits of primal emotions. Through such research, this presentation offers an ‘inner theatre’ model of the brain, regarding major hubs of neural networks and our animal ancestry. It also considers recent experiments, by Mario Beauregard, on the cognitive reappraisal of sad, erotic, and aversive film clips. Finally, it applies the inner-theatre model and related research to survey results of theatre students who read and then watched the ‘To be or not to be’ speech in 8 different video versions (from stage and screen productions) of William Shakespeare’s Hamlet. Findings show that students become aware of left-cortical, right-cortical, and subcortical brain functions—and shifts between them—through staging and movie-making choices in each of the different videos.

Keywords: cognitive reappraisal, Hamlet, neuroscience, Shakespeare, theatre

Procedia PDF Downloads 315
2142 Calculating Non-Unique Sliding Modes for Switched Dynamical Systems

Authors: Eugene Stepanov, Arkadi Ponossov

Abstract:

Ordinary differential equations with switching nonlinearities constitute a very useful tool in many applications. The solutions of such equations can usually be calculated analytically if they cross the discontinuities transversally. Otherwise, one has trajectories that slide along the discontinuity, and the calculations become less straightforward in this case. For instance, one of the problems one faces is the non-uniqueness of the sliding modes. In the presentation, it is proposed to apply the theory of hybrid dynamical systems to calculate the solutions that are ‘hidden’ in the discontinuities. Roughly, one equips the underlying switched system with an explicitly designed discrete dynamical system (‘automaton’), which governs the dynamics of the switched system. This construction ‘splits’ the dynamics, which, as is shown in the presentation, gives uniqueness of the resulting hybrid trajectories and at the same time provides explicit formulae for them. Projecting the hybrid trajectories back onto the original continuous system explains the non-uniqueness of its trajectories. The automaton is designed with the help of the attractors of a specially constructed adjoint dynamical system. Several examples are provided in the presentation, which support the efficiency of the suggested scheme. The method can be of interest in control theory, gene regulatory networks, neural field models and other fields where switched dynamics is a part of the analysis.
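
For background, the standard Filippov setting for a single switching surface shows where sliding modes and their possible non-uniqueness come from; the equations below are textbook material, not the authors' hybrid-automaton construction.

```latex
% Switched system with one switching surface \Sigma = \{x : h(x) = 0\}
\[
\dot{x} =
\begin{cases}
f^{+}(x), & h(x) > 0,\\
f^{-}(x), & h(x) < 0,
\end{cases}
\qquad
\dot{x} \in \overline{\operatorname{co}}\,\{f^{+}(x),\, f^{-}(x)\}
\quad \text{on } \Sigma .
\]
% A sliding mode on \Sigma is a tangent convex combination of the two fields:
\[
f_{s}(x) = \alpha(x)\, f^{+}(x) + \bigl(1-\alpha(x)\bigr) f^{-}(x),
\qquad
\nabla h(x)\cdot f_{s}(x) = 0,
\qquad
\alpha(x) \in [0,1].
\]
% Non-uniqueness arises when more than one admissible \alpha(x) satisfies the
% tangency condition, e.g. at intersections of several switching surfaces.
```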

Keywords: hybrid dynamical systems, singular perturbation analysis, sliding modes, switched dynamics

Procedia PDF Downloads 163
2141 Applying Transformative Service Design to Develop Brand Community Service in Women, Children and Infants Retailing

Authors: Shian Wan, Yi-Chang Wang, Yu-Chien Lin

Abstract:

This research discussed the various theories of service design, the importance of service design methodology, and the development of a transformative service design framework. In this study, transformative service design is applied to building a new brand community service for a women, children and infants retailing business. The goal is to enhance brand recognition and customer loyalty, effectively increase brand community engagement by embedding the brand community in social networks, and ultimately strengthen the impact and value of the company brand.

Keywords: service design, transformative service design, brand community, innovation

Procedia PDF Downloads 498
2140 Topological Analyses of Unstructured Peer to Peer Systems: A Survey

Authors: Hend Alrasheed

Abstract:

Because their properties avoid several limitations of classic client/server systems, there has been great interest in the development and improvement of different peer to peer systems. Understanding the properties of complex peer to peer networks is essential for their future improvement. It has been shown that the performance of peer to peer protocols is directly related to their underlying topologies. Therefore, multiple efforts have analyzed the topologies of different peer to peer systems. This study presents an overview of the major findings of experimental analyses of the topologies of three unstructured peer to peer systems: BitTorrent, Gnutella, and FreeNet.
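
The sketch below shows the kind of topological measurements such analyses report (degree, clustering coefficient, diameter, average path length), computed here on a synthetic random graph rather than a crawled BitTorrent, Gnutella, or FreeNet overlay.

```python
# Topology metrics sketch on a synthetic graph (stand-in for a crawled overlay).
import networkx as nx

G = nx.erdos_renyi_graph(n=1000, p=0.01, seed=42)

degrees = [d for _, d in G.degree()]
print("average degree:", sum(degrees) / len(degrees))
print("average clustering coefficient:", nx.average_clustering(G))

# Diameter is only defined on a connected graph, so measure the giant component
giant = G.subgraph(max(nx.connected_components(G), key=len))
print("giant component size:", giant.number_of_nodes())
print("diameter of giant component:", nx.diameter(giant))
print("average shortest path length:", nx.average_shortest_path_length(giant))
```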

Keywords: peer to peer networks, network topology, graph diameter, clustering coefficient, small-world property, random graph, degree distribution

Procedia PDF Downloads 381
2139 Comparing Image Processing and AI Techniques for Disease Detection in Plants

Authors: Luiz Daniel Garay Trindade, Antonio De Freitas Valle Neto, Fabio Paulo Basso, Elder De Macedo Rodrigues, Maicon Bernardino, Daniel Welfer, Daniel Muller

Abstract:

Agriculture plays an important role in society since it is one of the main sources of food in the world. To help the production and yield of crops, precision agriculture makes use of technologies aimed at improving the productivity and quality of agricultural commodities. One of the problems hampering the quality of agricultural production is disease affecting crops. Failure to detect diseases within a short period of time can result in small or large damage to production, causing financial losses to farmers. In order to provide a map of the contributions devoted to the early detection of plant diseases and a comparison of the accuracy of the selected studies, a systematic review of the literature was performed, covering techniques for digital image processing and neural networks. We found 35 tool-supported approaches for detecting disease in 19 plant species. Our comparison of these studies resulted in an overall average accuracy of 87.45%, with two studies coming very close to 100%.

Keywords: pattern recognition, image processing, deep learning, precision agriculture, smart farming, agricultural automation

Procedia PDF Downloads 379
2138 Targeting Violent Extremist Narratives: Applying Network Targeting Techniques to the Communication Functions of Terrorist Groups

Authors: John Hardy

Abstract:

Over the last decade, the increasing utility of extremist narratives to the operational effectiveness of terrorist organizations has been evidenced by the proliferation of inspired or affiliated attacks across the world. Famous examples such as regional al-Qaeda affiliates and the self-styled “Islamic State” demonstrate the effectiveness of leveraging communication technologies to disseminate propaganda, recruit members, and orchestrate attacks. Terrorist organizations with the capacity to harness the communicative power offered by digital communication technologies and effective political narratives have held an advantage over their targets in recent years. Terrorists have leveraged the perceived legitimacy of grass-roots actors to appeal to a global audience of potential supporters and enemies alike, and have wielded a proficiency in profile-raising which remains unmatched by counter terrorism narratives around the world. In contrast, many attempts at propagating official counter-narratives have been received by target audiences as illegitimate, top-down and impersonally bureaucratic. However, the benefits provided by widespread communication and extremist narratives have come at an operational cost. Terrorist organizations now face a significant challenge in protecting their access to communications technologies and authority over the content they create and endorse. The dissemination of effective narratives has emerged as a core function of terrorist organizations with international reach via inspired or affiliated attacks. As such, it has become a critical function which can be targeted by intelligence and security forces. This study applies network targeting principles which have been used by coalition forces against a range of non-state actors in the Middle East and South Asia to the communicative function of terrorist organizations. This illustrates both a conceptual link between functional targeting and operational disruption in the abstract and a tangible impact on the operational effectiveness of terrorists by degrading communicative ability and legitimacy. Two case studies highlight the utility of applying functional targeting against terrorist organizations. The first case is the targeted killing of Anwar al-Awlaki, an al-Qaeda propagandist who crafted a permissive narrative and effective propaganda videos to attract recruits who committed inspired terrorist attacks in the US and overseas. The second is a series of operations against Islamic State propagandists in Syria, including the capture or deaths of a cadre of high-profile Islamic State members, such as Junaid Hussain, Abu Mohammad al-Adnani, Neil Prakash, and Rachid Kassim. This group of Islamic State propagandists was linked to a significant rise in affiliated and enabled terrorist attacks and was subsequently targeted by law enforcement and military agencies. In both cases, the disruption of communication between the terrorist organization and recruits degraded both communicative and operational functions. The effectiveness of functional targeting against member recruitment and operational tempo suggests that narratives are a critical function which can be leveraged against terrorist organizations. Further application of network targeting methods to terrorist narratives may enhance the efficacy of a range of counter terrorism techniques employed by security and intelligence agencies.

Keywords: countering violent extremism, counter terrorism, intelligence, terrorism, violent extremism

Procedia PDF Downloads 291
2137 Improve of Power Quality in Electrical Network Using STATCOM

Authors: A. R. Alesaadi

Abstract:

Flexible AC transmission system (FACTS) devices play an important role in expanded electrical transmission networks. These devices can provide control of one or more AC transmission system parameters to enhance controllability and increase power transfer capability. In this paper, the effect of these devices on the reliability of electrical networks is studied, and it is shown that the use of FACTS devices can significantly improve the reliability of power networks and power quality in electrical networks.

Keywords: FACTS devices, power networks, power quality, STATCOM

Procedia PDF Downloads 668
2136 A Unified Approach for Digital Forensics Analysis

Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles

Abstract:

Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the subsequent digital footprints that exist, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT), databases, drones, cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and many data sources have little to no specialist forensic tool support. It is also increasingly essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely manner without having to trawl through the data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to investigate cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies what forensic analyses could be used. For example, in a child abduction, an investigation team might have evidence from a range of sources including computing devices (mobile phone, PC), CCTV (potentially a large number), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
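
A toy sketch of the cross-source correlation U-FAT is meant to automate is given below: heterogeneous records mapped to a common event schema and matched within a time window. The schema, field names, and records are invented examples, not the U-FAT design.

```python
# Toy cross-source timeline correlation (invented schema and records).
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    source: str
    timestamp: datetime
    entity: str          # e.g. a person, device, or plate number
    description: str

events = [
    Event("mobile", datetime(2024, 5, 1, 14, 2), "device-A", "cell tower 17 attach"),
    Event("cctv", datetime(2024, 5, 1, 14, 5), "camera-3", "vehicle XYZ123 northbound"),
    Event("isp", datetime(2024, 5, 1, 18, 40), "account-B", "login from new IP"),
]

def correlate(events, window=timedelta(minutes=10)):
    """Return pairs of events from different sources within `window` of each other."""
    hits = []
    ordered = sorted(events, key=lambda e: e.timestamp)
    for i, a in enumerate(ordered):
        for b in ordered[i + 1:]:
            if b.timestamp - a.timestamp > window:
                break
            if a.source != b.source:
                hits.append((a, b))
    return hits

for a, b in correlate(events):
    print(f"{a.source} @ {a.timestamp} <-> {b.source} @ {b.timestamp}")
```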

Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool

Procedia PDF Downloads 196
2135 Efficacy of a Wiener Filter Based Technique for Speech Enhancement in Hearing Aids

Authors: Ajish K. Abraham

Abstract:

The hearing aid is the most fundamental technology employed in the rehabilitation of persons with sensorineural hearing impairment. Hearing in noise is still a matter of major concern for many hearing aid users and thus continues to be a challenging issue for hearing aid designers. Several techniques are currently used to enhance the speech at the hearing aid output. Most of these techniques, when implemented, reduce the intelligibility of the speech signal. Thus, hearing aid users remain dissatisfied with their ability to comprehend the desired speech amidst noise. The multichannel Wiener filter is widely implemented in binaural hearing aid technology for noise reduction. In this study, a Wiener filter based noise reduction approach is evaluated for a single-microphone hearing aid setup. This method checks the status of the input speech signal in each frequency band and then selects the relevant noise reduction procedure. Results showed that the Wiener filter based algorithm is capable of enhancing speech even when the input acoustic signal has a very low Signal to Noise Ratio (SNR). Performance of the algorithm was compared with other similar algorithms on the basis of improvement in intelligibility and SNR of the output, at different SNR levels of the input speech. The Wiener filter based algorithm provided significant improvement in SNR and intelligibility compared to other techniques.
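
The sketch below implements a simplified single-channel Wiener-gain noise reducer in the STFT domain, illustrating the kind of processing evaluated in the abstract; it is not the authors' hearing-aid algorithm, and the noise-estimation strategy (noise-only leading frames) is an assumption of this sketch.

```python
# Simplified STFT-domain Wiener-gain noise reduction (not the authors' method).
# Assumes the first frames contain noise only, used to estimate the noise PSD.
import numpy as np
from scipy.signal import stft, istft

def wiener_enhance(noisy, fs, noise_frames=10, nperseg=256, floor=0.1):
    f, t, X = stft(noisy, fs=fs, nperseg=nperseg)
    power = np.abs(X) ** 2
    noise_psd = power[:, :noise_frames].mean(axis=1, keepdims=True)

    # Wiener-type gain: estimated speech power / (speech + noise) power
    gain = np.maximum(1.0 - noise_psd / np.maximum(power, 1e-12), floor)
    _, enhanced = istft(gain * X, fs=fs, nperseg=nperseg)
    return enhanced

# Example: 1 s of noise followed by a noisy tone at 16 kHz (synthetic signal)
fs = 16000
rng = np.random.default_rng(5)
noise = 0.1 * rng.standard_normal(2 * fs)
speech = np.concatenate([np.zeros(fs), np.sin(2 * np.pi * 440 * np.arange(fs) / fs)])
noisy = speech + noise

clean_estimate = wiener_enhance(noisy, fs)
```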

Keywords: hearing aid output speech, noise reduction, SNR improvement, Wiener filter, speech enhancement

Procedia PDF Downloads 247
2134 Cluster-Based Multi-Path Routing Algorithm in Wireless Sensor Networks

Authors: Si-Gwan Kim

Abstract:

Small, low-power sensors with sensing, signal processing and wireless communication capabilities are well suited to wireless sensor networks. Due to limited resources and battery constraints, the complex routing algorithms used for ad-hoc networks cannot be employed in sensor networks. In this paper, we propose node-disjoint, hexagon-based multi-path routing algorithms for wireless sensor networks. We present the details of the algorithm and compare it with other works. Simulation results show that the proposed scheme achieves better performance in terms of efficiency and message delivery ratio.
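
The sketch below illustrates node-disjoint multi-path discovery on a toy topology with a greedy shortest-path heuristic; the paper's hexagon-based clustering and protocol details are not reproduced here.

```python
# Greedy node-disjoint path discovery on a toy sensor field (illustration only;
# not the paper's hexagon-based protocol).
import networkx as nx

def node_disjoint_paths(G, src, dst, k=2):
    H = G.copy()
    paths = []
    for _ in range(k):
        try:
            path = nx.shortest_path(H, src, dst)
        except nx.NetworkXNoPath:
            break
        paths.append(path)
        H.remove_nodes_from(path[1:-1])   # keep source and sink available
    return paths

# Toy sensor field: a grid where nodes are sensors and edges are radio links
G = nx.grid_2d_graph(5, 5)
for p in node_disjoint_paths(G, (0, 0), (4, 4), k=3):
    print(p)
```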

Keywords: clustering, multi-path, routing protocol, sensor network

Procedia PDF Downloads 404