Search results for: vector network analyzer
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5911

1231 Earthquake Relocations and Constraints on the Lateral Velocity Variations along the Gulf of Suez, Using the Modified Joint Hypocenter Method Determination

Authors: Abu Bakr Ahmed Shater

Abstract:

Hypocenters of 250 earthquakes recorded by more than 5 stations of the Egyptian seismic network around the Gulf of Suez were relocated, and the station corrections for the P-wave were estimated using the modified joint hypocenter determination method. Five stations, TR1, SHR, GRB, ZAF and ZET, have negative P-wave travel-time station corrections of -0.235, -0.366, -0.288, -0.366 and -0.058, respectively; it can therefore be assumed that the underground model beneath these stations is characterized by a high-velocity structure. The other stations, TR2, RDS, SUZ, HRG and ZNM, have positive corrections of 0.024, 0.187, 0.314, 0.645 and 0.145, respectively, suggesting that the underground model in their vicinity is characterized by a low-velocity structure. The hypocentral locations determined by the modified joint hypocenter determination method are more precise than those determined by routine location programs, since the method solves for the earthquake locations and the station corrections simultaneously. The station corrections reflect not only the different crustal conditions in the vicinity of the stations, but also the difference between the actual and modeled seismic velocities along each earthquake-station ray path. The station corrections obtained correlate with the major surface geological features in the study area. As a result of the relocation, low-velocity areas appear on the northeastern and southwestern sides of the Gulf of Suez, while the southeastern and northwestern parts are high-velocity areas.

Keywords: gulf of Suez, seismicity, relocation of hypocenter, joint hypocenter determination

Procedia PDF Downloads 354
1230 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT

Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez

Abstract:

Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved by using state-of-the-art deep learning methods. Another major challenge that ecologists face is counting a single animal multiple times because it reappears in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps that share the captured images along with timestamps, cumulative counts, and dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. The method has been validated in the field and can easily be extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.
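
A minimal sketch of the cross-camera de-duplication idea described above, assuming that each trap shares a species label, timestamp and an estimated animal dimension over the mesh; the field names, time window and size tolerance are illustrative assumptions, not the authors' implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Detection:
    trap_id: str          # camera trap that produced the image
    species: str          # label predicted by the deep learning model
    timestamp: datetime
    body_length_m: float  # estimated animal dimension shared over the mesh

def unique_count(detections, window=timedelta(minutes=10), size_tol=0.2):
    """Estimate the number of distinct animals, merging detections of the
    same species that occur close in time with similar dimensions."""
    merged = []
    for d in sorted(detections, key=lambda d: d.timestamp):
        duplicate = any(
            d.species == m.species
            and abs(d.timestamp - m.timestamp) <= window
            and abs(d.body_length_m - m.body_length_m) <= size_tol * m.body_length_m
            for m in merged
        )
        if not duplicate:
            merged.append(d)
    return len(merged)

if __name__ == "__main__":
    t0 = datetime(2020, 1, 1, 6, 0)
    obs = [
        Detection("trap_A", "deer", t0, 1.40),
        Detection("trap_B", "deer", t0 + timedelta(minutes=3), 1.45),  # same animal re-seen
        Detection("trap_C", "deer", t0 + timedelta(hours=2), 1.10),    # different animal
    ]
    print(unique_count(obs))  # -> 2
```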

Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management

Procedia PDF Downloads 136
1229 Living Lab as a Service: Developing Context Induced, Co-creational Innovation Routines as a Process Tool for Nature Based Solutions

Authors: Immanuel Darkwa

Abstract:

Climate change and environmental degradation are existential threats requiring urgent transnational action. The SDGs, as well as regional initiatives like the European Green Deal, ambitious as they are, place an emphasis on innovatively tackling the threats posed by climate change at the regional level. While co-creational approaches are being propagated, there is no reference blueprint for how potential solutions, particularly nature-based solutions, may be developed and implemented within urban settings. Using a single case study in Zagreb, Croatia, this paper proposes a workshop tool for a Living Lab as a Service model for sustainable Nature-Based Thinking, Nature-Centred Design and nature-based solutions. The approach rests on a co-creational methodology developed through literature synthesis, expert interviews, focus group discussions and surveys, and synthesized through rigorous research analysis and participatory observation. The ensuing tool involves workshop processes tested with stakeholders identified throughout the process, each with distinctive roles and functions. The resulting framework proposes a Nature-Based-Centred-Thinking process tool involving ‘green’ routines supported by a focal unit and a collaborative network, which allows for the development of nature-based solutions.

Keywords: living labs, nature-based solutions, nature-based design, innovation processes, innovation routines and tools

Procedia PDF Downloads 69
1228 Anthropogenic Impact on Migration Process of River Yamuna in Delhi-NCR Using Geospatial Techniques

Authors: Mohd Asim, K. Nageswara Rao

Abstract:

The present work was carried out on the River Yamuna passing through the Delhi-National Capital Region (Delhi-NCR) of India, over a stretch of about 130 km, to assess the anthropogenic impact on the channel migration process over a period of 200 years with the help of satellite data and topographical maps integrated in a geographic information system environment. The Digital Shoreline Analysis System (DSAS) application was used to quantify river channel migration in the ArcGIS environment. The average channel migration was calculated to be 22.8 m/year for the entire study area. The channel was found to migrate in both westward and eastward directions: the maximum westward migration is more than 4 km in length, while the eastward migration is about 4.19 km. In total, the river has migrated over an area of 32.26 sq. km. The results reveal that the river is being impacted by various human activities, with impact indicators including engineering structures, sand mining, embankments, urbanization, land use/land cover, and the canal network. The DSAS application was also used to predict the position of the river channel in 2032 and 2042 by analyzing the past and present rates and directions of movement. The length of the channel in 2032 and 2042 will be 132.5 and 141.6 km, respectively. The channel will migrate the most after crossing the Okhla Barrage near Faridabad, shifting from west to east over about 3.84 sq. km between 2022 and 2042.

Keywords: river migration, remote sensing, river Yamuna, anthropogenic impacts, DSAS, Delhi-NCR

Procedia PDF Downloads 119
1227 Regenerative City Regions: Exploring the Connections between Regenerative Development, Collaborative Governance and Progressive Regionalism

Authors: Lorena F. Axinte

Abstract:

Territorial rescaling is a universal practice in the UK, following a logic of agglomeration and competition as the only chance for cities to thrive. Cardiff Capital Region is one of the latest examples, and its governance structures and developmental narratives are currently being shaped. Its evolution must be compatible with the Wellbeing of Future Generations Act, Welsh legislation that requires public bodies to put sustainability at the core of all actions. Departing from this case study, the project follows the evolution of Cardiff Capital Region and assesses it based on a new conceptual framework that connects the notions of regenerative development, collaborative governance, and progressive regionalism. The hypothetical synergies between these different theoretical perspectives are demonstrated, inferring that if regenerative development is the aim, it must necessarily start with collaborative modes of governance. The objective is to explore (a) whether expanding the network of active stakeholders who get to intervene in the governance structure can contribute to a more progressive definition and development of the city region and (b) whether this can be considered a pathway towards regenerative development. The exploratory fieldwork conducted during the initial phase of the project used qualitative methods, which will be complemented by different participatory research approaches as well as a quantitative analysis. Despite being in its early days, the study is showing that a wider range of voices can indeed change priorities and reconcile and balance the economic drivers with the wider social, economic, cultural and environmental aspects.

Keywords: Cardiff Capital Region, collaborative governance, progressive regionalism, regenerative development

Procedia PDF Downloads 306
1226 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays

Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín

Abstract:

Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip-control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Efficient hardware alternatives are now used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile sensing systems except in the force reconstruction process, a stage in which they have been applied less often. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. Starting from the analysis of a software implementation of this model, the proposed implementation parallelizes the tasks that facilitate the execution of the matrix operations and of a two-dimensional optimization function used to obtain a force vector for each taxel in the array. The work takes advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate algorithm parallelization techniques, guided by the rules of generalization, efficiency, and scalability in the tactile decoding process and considering low latency, low power consumption, and real-time execution as the main design parameters. The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulations of Hertzian and non-Hertzian contact events by the Finite Element Modeling (FEM) technique, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48×48 taxels using various transduction technologies. The proposed implementation reduces the estimation time to roughly 1/180 of that of software implementations. Despite the relatively high estimation errors, the information provided by this implementation on the tangential and normal tractions and on the triaxial reconstruction of forces makes it possible to adequately reconstruct the tactile properties of the touched object, which are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced further, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.
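
A minimal sketch of the kind of per-taxel force reconstruction described above, posed as a regularized least-squares problem mapping normal-stress readings to triaxial force vectors; the transfer matrix here is random purely to make the sketch executable and stands in for the contact model used in the paper, which is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

n_taxels = 10 * 10        # 10x10 array, as in the reported experiments
n_forces = 3 * n_taxels   # (Fx, Fy, Fz) per taxel

# Hypothetical linear model: measured normal stresses s = A @ f.
# In the paper A comes from contact mechanics; here it is random so the
# sketch runs. The system is underdetermined, hence the regularization
# (the physical model constrains the solution in the real method).
A = rng.normal(size=(n_taxels, n_forces))
f_true = rng.normal(size=n_forces)
s_meas = A @ f_true + 0.01 * rng.normal(size=n_taxels)

lam = 1e-2  # Tikhonov/ridge weight
f_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_forces), A.T @ s_meas)

forces_per_taxel = f_hat.reshape(n_taxels, 3)  # one (Fx, Fy, Fz) vector per taxel
print(forces_per_taxel.shape)                  # (100, 3)
```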

Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation

Procedia PDF Downloads 191
1225 Deep Reinforcement Learning Approach for Optimal Control of Industrial Smart Grids

Authors: Niklas Panten, Eberhard Abele

Abstract:

This paper presents a novel approach for real-time, near-optimal control of industrial smart grids by deep reinforcement learning (DRL). To achieve highly energy-efficient factory systems, the energetic linkage of machines, technical building equipment and the building itself is desirable. However, the increased complexity of the interacting sub-systems, multiple time-variant target values and stochastic influences from the production environment, weather and energy markets make it difficult to efficiently control energy production, storage and consumption in hybrid industrial smart grids. The studied deep reinforcement learning approach explores the solution space for control policies that minimize a cost function. The deep neural network of the DRL agent is based on a multilayer perceptron (MLP), Long Short-Term Memory (LSTM) and convolutional layers. The agent is trained within multiple Modelica-based factory simulation environments using the Advantage Actor Critic algorithm (A2C). The DRL controller is evaluated by means of the simulation and then compared to a conventional, rule-based approach. The results indicate that the DRL approach is able to improve the control performance and significantly reduce the energy and operating costs of industrial smart grids.
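
A hedged sketch of an actor-critic network that mixes convolutional, LSTM and MLP layers as described above, written in PyTorch; all layer sizes, input dimensions and the action space are illustrative assumptions, not the authors' architecture, and the A2C training loop is omitted:

```python
import torch
import torch.nn as nn

class A2CAgent(nn.Module):
    """Actor-critic network combining Conv1d, LSTM and MLP blocks.
    Input: forecast time series (e.g. prices, weather, load) plus static
    state features; outputs policy logits and a state-value estimate."""

    def __init__(self, n_series=4, seq_len=96, n_static=8, n_actions=5):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_series, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(24),
        )
        self.lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(32 + n_static, 64), nn.ReLU())
        self.actor = nn.Linear(64, n_actions)   # logits over control actions
        self.critic = nn.Linear(64, 1)          # state-value estimate

    def forward(self, series, static):
        # series: (batch, n_series, seq_len), static: (batch, n_static)
        z = self.conv(series)                   # (batch, 16, 24)
        z, _ = self.lstm(z.transpose(1, 2))     # (batch, 24, 32)
        h = torch.cat([z[:, -1, :], static], dim=1)
        h = self.mlp(h)
        return self.actor(h), self.critic(h)

logits, value = A2CAgent()(torch.randn(2, 4, 96), torch.randn(2, 8))
print(logits.shape, value.shape)  # torch.Size([2, 5]) torch.Size([2, 1])
```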

Keywords: industrial smart grids, energy efficiency, deep reinforcement learning, optimal control

Procedia PDF Downloads 188
1224 Dynamics of Museum Visitors’ Experiences Studies: A Bibliometric Analysis

Authors: Tesfaye Fentaw Nigatu, Alexander Trupp, Teh Pek Yen

Abstract:

Research on museums and the experiences of visitors has flourished in recent years, especially after museums became centers of edutainment beyond preserving heritage resources. This paper aims to comprehensively understand the changes, continuities, and future research directions of museum visitors’ experiences. To identify current research trends, the paper summarizes and analyses research article publications on museum visitors’ experiences from 1986 to 2023. The bibliometric analysis software VOSviewer and Harzing’s Publish or Perish (PoP) were used to analyze 407 academic articles generated from the Scopus database. The study attempts to map new insights that help scholars and academics expand the scope of museum visitors’ experience studies by analyzing keywords, citation patterns, influential articles in the field, publication trends, collaborations between authors and institutions, and clusters of highly cited articles. Accessibility of museums, social media usage within museums, aesthetics in museum settings, mixed-reality experiences, sustainability issues, and emotions have emerged as key research areas in the study of museum visitors’ experiences. The results help stakeholders and researchers stay informed about the latest developments and breakthroughs in the global academic landscape and in the development of visitors’ experiences in museums.

Keywords: bibliometric analysis, museum, network analysis, visitors’ experiences, visual analysis

Procedia PDF Downloads 63
1223 Seismic Perimeter Surveillance System (Virtual Fence) for Threat Detection and Characterization Using Multiple ML Based Trained Models in Weighted Ensemble Voting

Authors: Vivek Mahadev, Manoj Kumar, Neelu Mathur, Brahm Dutt Pandey

Abstract:

Perimeter guarding and protection of critical installations require prompt intrusion detection and assessment so that effective countermeasures can be taken. Currently, visual and electronic surveillance are the primary methods used for perimeter guarding. These methods can be costly and complicated, requiring careful planning according to the location and terrain, and they often struggle to detect stealthy and camouflaged insurgents. The objective of the present work is to devise a surveillance technique using seismic sensors that overcomes the limitations of existing systems and improves intrusion detection, assessment, and characterization. Most similar systems have only two intrusion detection categories, viz. human or vehicle; in our work we categorize further and identify types of intrusion activity such as walking, running, group walking, fence jumping, tunnel digging and vehicular movements. A virtual fence of 60 meters at GCNEP, Bahadurgarh, Haryana, India, was created by installing four underground geophones at a distance of 15 meters from each other. The signals received from these geophones are processed to find unique seismic signatures called features. Various feature optimization and selection methodologies, such as LightGBM, Boruta, Random Forest, logistic regression, Recursive Feature Elimination, chi-squared and the Pearson ratio, were used to identify the best features for training the machine learning models. The trained models were developed using algorithms such as the supervised support vector machine (SVM) classifier, kNN, Decision Tree, Logistic Regression, Naïve Bayes, and Artificial Neural Networks. These models were then used to predict the category of events, employing weighted ensemble voting to analyze and combine their results. The models were trained with 1940 training events, and the results were evaluated with 831 test events. It was observed that using weighted ensemble voting increased the efficiency of the predictions. In this study we successfully developed and deployed the virtual fence using geophones. Since these sensors are passive, do not radiate any energy and are installed underground, it is impossible for intruders to locate and nullify them. Their flexibility, quick and easy installation, low cost, hidden deployment and unattended surveillance make such systems especially suitable for critical installations and remote facilities with difficult terrain. This work demonstrates the potential of utilizing seismic sensors to build better perimeter guarding and protection systems using multiple machine learning models in weighted ensemble voting. In this study the virtual fence achieved an intruder detection efficiency of over 97%.
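
A minimal illustration of weighted soft voting over several of the classifier types listed above, using scikit-learn's VotingClassifier; the synthetic features and the voting weights are placeholders for the real geophone features and for weights that would typically be derived from validation performance:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Stand-in for seismic feature vectors (walking, running, digging, vehicle ...):
# 6 activity classes, 40 features per event. Real features come from geophones.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=20,
                           n_classes=6, n_clusters_per_class=1, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)

ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(probability=True)),   # probability=True enables soft voting
        ("knn", KNeighborsClassifier()),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=7)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",
    weights=[2, 1, 2, 1],  # illustrative per-model weights
)
ensemble.fit(X_tr, y_tr)
print("ensemble accuracy:", ensemble.score(X_te, y_te))
```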

Keywords: geophone, seismic perimeter surveillance, machine learning, weighted ensemble method

Procedia PDF Downloads 76
1222 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran

Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh

Abstract:

Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and the results of comparing the units under investigation to obtain an estimate of their performance. The Malmquist Productivity Index (MPI) is an important index for evaluating overall productivity, as it considers technological development and technical efficiency at the same time. This article proposes a model based on a multistage MPI that accounts for limited data (Grey's theory). The model can evaluate the performance of units in a multistage process using limited and uncertain data. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which contains uncertain data, to evaluate the performance of its actors. The results of solving the model showed an improvement in the accuracy of estimating the future performance of the units under investigation when Grey's system theory is used. The model can be used in any case study in which the MPI is applied and the data are limited or uncertain.
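
For reference, the standard output-oriented Malmquist Productivity Index between periods t and t+1 (a value above 1 indicates productivity growth) and its usual decomposition are reproduced below from the general DEA literature; the multistage, Grey-number variant proposed in the abstract extends this crisp formulation and is not shown here.

```latex
% Standard MPI with distance functions D^t(.); requires amsmath.
\[
  \mathrm{MPI}^{t,t+1}
  = \left[
      \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2}
  = \underbrace{\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}}_{\text{efficiency change}}
    \cdot
    \underbrace{\left[
      \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}
      \cdot
      \frac{D^{t}\!\left(x^{t}, y^{t}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2}}_{\text{technical change}}
\]
```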

Keywords: Malmquist Index, Grey's Theory, CCR Model, network data envelopment analysis, Iran electricity power chain

Procedia PDF Downloads 161
1221 Cloud Shield: Model to Secure User Data While Using Content Delivery Network Services

Authors: Rachna Jain, Sushila Madan, Bindu Garg

Abstract:

Cloud computing has become a key powerhouse in numerous organizations as they shift their data to the cloud environment. In recent years, cloud-based services have been used on a large scale for content storage, distribution and processing. Various issues in the cloud computing environment still need to be addressed, and security and privacy are found to be the topmost areas of concern. In this paper, a novel security model is proposed to secure data while utilizing CDN services such as image-to-icon conversion. Here, a CDN service is a content delivery service that performs conversions such as image to icon, Word to PDF or LaTeX to PDF. The presented model is used to convert an image into an icon while keeping the image secret: security is imparted so that the image can be encrypted and decrypted by the data owner only. The paper also discusses how the server performs multiplication and selection on encrypted data without decryption. The data can be an image, Word, audio or video file. Moreover, the proposed model is capable of multiplying images, encrypting them and sending them to a server application for conversion. Ultimately, the prime objective is to encrypt an image and convert the encrypted image into an image icon by utilizing homomorphic encryption.
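
A minimal demonstration of the idea of multiplying data under encryption, using textbook (unpadded) RSA, which is multiplicatively homomorphic; this is only an illustration with toy, insecure parameters and is not the encryption scheme of the proposed model:

```python
# Textbook RSA satisfies E(m1) * E(m2) mod n = E(m1 * m2), so a server can
# multiply ciphertexts without ever decrypting them (Python 3.8+ for pow(e, -1, phi)).
p, q = 61, 53
n = p * q                  # 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17
d = pow(e, -1, phi)        # private exponent

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

m1, m2 = 7, 11                         # e.g. two pixel values of an image
c_prod = (enc(m1) * enc(m2)) % n       # server multiplies ciphertexts only
assert dec(c_prod) == (m1 * m2) % n    # owner decrypts and obtains the product
print(dec(c_prod))                     # 77
```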

Keywords: cloud computing, user data security, homomorphic encryption, image multiplication, CDN service

Procedia PDF Downloads 329
1220 A Brief Review on Doping in Sports and Performance-Enhancing Drugs

Authors: Zahra Mohajer, Afsaneh Soltani

Abstract:

Doping is a major issue in competitive sports and is favored by large groups of athletes. The desire to outrank others and gain fame has caused many athletes to misuse drugs. Doping is defined as the use of prohibited substances and/or methods that enhance physical or mental performance, or both. It covers the illegal use of chemical substances or drugs, the use of excessive amounts of physiological substances to increase performance in or out of competition, or even the use of inappropriate medications to treat an injury in order to be able to participate in a competition. The International Olympic Committee (IOC) and the World Anti-Doping Agency (WADA) have forbidden these substances to ensure fair and equal competition as well as the health of the competitors. Since 2004, WADA has published an international list of substances prohibited for doping, which is updated annually. In the course of the Genome Project, scientists have gained the ability to treat numerous diseases by gene therapy, which may also increase bodily performance and therefore creates a potential opportunity for misuse by some athletes. Gene doping is defined as non-therapeutic, direct or indirect genetic modification using genetic materials that can improve performance in sports events. Biosynthetic drugs are a form of indirect genetic engineering. The method can be performed in three ways: injecting the DNA directly into the muscle, inserting genetically engineered cells, or transferring the DNA using a virus as a vector. Erythropoietin (EPO) is a hormone released mainly by the kidney and in small amounts by the liver. Its function is to stimulate erythropoiesis and thus the production of red blood cells (RBCs), which causes an increase in hemoglobin (Hb). In this process, the oxygen delivery to muscles increases, which improves athletic performance and postpones exhaustion. There are several ways to increase the oxygen transferred to muscles, such as blood transfusion, stimulating the production of red blood cells with EPO, and using allosteric effectors of hemoglobin. EPO can either be injected as a protein or introduced into cells as the gene that encodes it; adeno-associated viruses have been employed to deliver the EPO gene to the cells. Employing genes that naturally exist in the human body, such as the EPO gene, reduces the risk of detecting gene doping. The first research on blood doping was conducted in 1947. That study showed that an increase in hematocrit (HCT) up to 55% following homologous transfusion makes it easier for the body to perform exercise at altitude, and athletes' interest in blood infusion escalated thereafter. Another study demonstrated that three men who were reinfused with their own blood four weeks after it was drawn showed a rise in Hb level, improved oxygen uptake, and delayed exhaustion. The list of performance-enhancing drugs published annually by WADA includes anabolic agents, hormones, beta-2 agonists, beta-blockers, diuretics, stimulants, narcotics, cannabinoids, and corticosteroids.

Keywords: doping, PEDs, sports, WADA

Procedia PDF Downloads 105
1219 The Developing of Knowledge-Based System for the Medical Treatment with Herbs

Authors: Rujijan Vichivanives

Abstract:

This research aims to create a knowledge-based system serving as a database for self-healthcare analysis, the diagnosis of simple illnesses, and the use of Thai herbs instead of modern medicine, based on the principles of Thai traditional medication theory. The system was disseminated through website network programs within Suan Sunandha Rajabhat University. The population used in this study was divided into two groups: the first group consisted of four experts in Thai traditional medication, and the second group comprised 300 website users. The methods used for collecting data were paper questionnaires and poll questionnaires on the website, and the data were analyzed using averages. The results fall into three parts: the first part was the development of the knowledge-based system and the second part was the applied programs on the website; both were accomplished according to the set goals. The third part was the evaluation of the study: the experts rated the website design at a good level of 4.20, and user satisfaction was also at a good level, with an average of 4.24. It was found that young people under the age of 16 cared less about their health than older teenagers, working-age adults and the elderly. The research findings should be extended in order to encourage lifestyle modification among people of all ages based on self-healthcare principles.

Keywords: developing, herbs, knowledge-based system, medical treatment

Procedia PDF Downloads 325
1218 Optimal Simultaneous Sizing and Siting of DGs and Smart Meters Considering Voltage Profile Improvement in Active Distribution Networks

Authors: T. Sattarpour, D. Nazarpour

Abstract:

This paper investigates the effect of the simultaneous placement of DGs and smart meters (SMs) on voltage profile improvement in active distribution networks (ADNs). Considerable attention has recently been given to responsive loads in power system studies alongside distributed generations (DGs). The existence of responsive loads in ADNs has an undeniable effect on the sizing and siting of DGs. For this reason, an optimal framework is proposed for the sizing and siting of DGs and SMs in ADNs. SMs are taken into consideration to enable the successful implementation of demand response programs (DRPs), such as direct load control (DLC), with end-side consumers. Targeting voltage profile improvement, the optimization procedure is solved by a genetic algorithm (GA) and tested on the IEEE 33-bus distribution test system. Different scenarios have been established, with variations in the number of DG units, individual or simultaneous placement of DGs and SMs, and an adaptive power factor (APF) mode in which DGs support reactive power. The obtained results confirm the significant effect of DRPs and the APF mode in determining the optimal size and site of the DGs to be connected to the ADN, resulting in an improved voltage profile as well.

Keywords: active distribution network (ADN), distributed generations (DGs), smart meters (SMs), demand response programs (DRPs), adaptive power factor (APF)

Procedia PDF Downloads 299
1217 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction

Authors: Qais M. Yousef, Yasmeen A. Alshaer

Abstract:

Over the last few years, the amount of data available around the globe has increased rapidly. This came with the emergence of recent concepts, such as big data and the Internet of Things, which have made data available all over the world. However, managing this massive amount of data remains a challenge due to the large variety of its types and its distribution. Therefore, locating the required file, particularly on the first attempt, is not an easy task because of the large similarity of names among different files distributed on the web; consequently, the accuracy and speed of search have been negatively affected. This work presents a method that uses electroencephalography (EEG) signals to locate files based on their contents. Building on the concept of natural brain-wave processing, this work analyzes the EEG signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and to retrieve them as a first choice for the user.

Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization

Procedia PDF Downloads 173
1216 Pinch Technology for Minimization of Water Consumption at a Refinery

Authors: W. Mughees, M. Alahmad

Abstract:

Water is the most significant entity that controls local and global development. For the Gulf region, and especially Saudi Arabia with its limited potable water resources, the fresh water problem is highly significant. This research involves the design and analysis of pinch-based water/wastewater networks. Multiple water/wastewater networks were developed using pinch analysis with the direct recycle/material recycle method, and a property-integration technique was adopted to carry out the direct recycle method. A petroleum refinery was considered as a case study. In the direct recycle methodology, the minimum water discharge and minimum fresh water resource targets were estimated, and the water allocation in the networks was re-designed (retrofitted). Chemical Oxygen Demand (COD) and hardness were taken as the pollutants. Using the single-contaminant approach for COD and hardness, the amount of fresh water was reduced from 340.0 m3/h to 149.0 m3/h (43.8%) and 208.0 m3/h (61.18%), respectively, while with the double-contaminant approach the reduction in fresh water demand was 132.0 m3/h (38.8%). The required analysis was also carried out using a mathematical programming technique; software such as LINGO was used for these studies, which verified the graphical method results in a valuable and accurate way. Among the multiple water networks, one possible water allocation network was developed based on mass exchange.

Keywords: minimization, water pinch, water management, pollution prevention

Procedia PDF Downloads 472
1215 Manipulation of Ideological Items in the Audiovisual Translation of Voiced-Over Documentaries in the Arab World

Authors: S. Chabbak

Abstract:

In a widely globalized world, the influence of audiovisual translation on the culture and identity of audiences is unmistakable. However, in the Arab World, there is a noticeable disproportion between this growing influence and the research carried out in the field. As a matter of fact, the voiced-over documentary is one of the most abundantly translated genres in the Arab World, and it carries many ideological elements which are in many cases rendered by manipulation. Yet voiced-over documentaries have hardly received any focused attention from researchers in the Arab World. This paper attempts to scrutinize the process of translating voiced-over documentaries in the Arab World, from French into Arabic in the present case study, by sub-categorizing the ideological items subject to manipulation, identifying the techniques utilized in their translation and exploring the potential extra-linguistic factors that prompt translation agents to opt for manipulative translation. The investigation is based on a corpus of 94 episodes taken from a series entitled 360° GEO Reports, produced in French by the Franco-German network ARTE, and acquired, translated and aired by Al Jazeera Documentary Channel for Arab audiences. The results yielded 124 cases of manipulation across four sub-categories of ideological items, and the use of 10 different oblique procedures in the process of manipulative translation. The study also revealed that manipulation is in most instances dictated by the editorial line of the broadcasting channel, in addition to the religious, geopolitical and socio-cultural peculiarities of the target culture.

Keywords: audiovisual translation, ideological items, manipulation, voiced-over documentaries

Procedia PDF Downloads 210
1214 Severity Index Level in Effectively Managing Medium Voltage Underground Power Cable

Authors: Mohd Azraei Pangah Pa'at, Mohd Ruzlin Mohd Mokhtar, Norhidayu Rameli, Tashia Marie Anthony, Huzainie Shafi Abd Halim

Abstract:

Partial discharge (PD) mapping is one of the main diagnostic testing techniques widely used for field or on-site testing of medium-voltage underground power cables. The existence of PD activity is an early indication of insulation weakness; hence, early detection of PD activity provides an initial prediction of the condition of the cable. To effectively manage the results of PD mapping tests, it is important to have acceptable criteria that facilitate the prioritization of mitigation actions. Tenaga Nasional Berhad (TNB), through its Distribution Network (DN) division, has developed a PD severity model named the Severity Index (SI) for offline PD mapping tests, used since 2007 and based on on-site testing experience. However, the recommended actions of this severity index have never been revised since its establishment. At present, PD measurement data have increased extensively, so the severity level indications and the effectiveness of the recommended actions can be analyzed and verified again. Based on the new revision, the recommended actions will better reflect the actual defect condition, and hence the preventive action plan can be accurately prioritized and maintenance expenditure minimized.

Keywords: partial discharge, severity index, diagnostic testing, medium voltage, power cable

Procedia PDF Downloads 177
1213 An Internet of Things Smart Washroom Framework

Authors: Robin Ratnasingham, Maher Elshakankiri

Abstract:

This research report will look at how to make a smart washroom to increase public hygiene and cleanliness. The system would use IoT devices to pick up various activities in the washroom and notify the appropriate stakeholders or devices to regulate the condition of the washroom. As more people are required to physically go back to the office or school, ensuring a clean and sanitized washroom is even more important now than before. It would help prevent virus outbreaks and safeguard the organization from shutdowns or slowdowns in their business. A framework of the suggested smart washroom was introduced to help reduce the chances of a virus outbreak. Most organizations outsource renovation or implementation to an external party. Using the smart washroom framework, we looked at vendors that provide smart washroom solutions. There are IoT vendors that cannot match the framework, and there are vendors that can support the framework design. This segment is a niche market, and most of the devices are similar in their basic functions. However, all the vendors have unique characteristics to give them a competitive advantage over the rest of the IoT washroom companies. Ultimately, the organization would need to decide if they want to add IoT devices to enable smart capability or renovate the washroom to create a fluid IoT smart washroom design. The report would introduce an IoT smart washroom framework to help organizations design a cohesive preventive measure network for the daily maintenance routine. The framework is designed to help understand how to manage washroom cleanliness more efficiently and to provide guidance in achieving this goal. The leading result is eliminating potential viral outbreaks that could jeopardize the organization.

Keywords: IoT, smart washroom, public hygiene, cleanliness, virus outbreaks, safeguard

Procedia PDF Downloads 91
1212 A Hybrid Traffic Model for Smoothing Traffic Near Merges

Authors: Shiri Elisheva Decktor, Sharon Hornstein

Abstract:

Highway merges and unmarked junctions are key components of any urban road network that can act as bottlenecks and create traffic disruption. Inefficient highway merges may trigger traffic instabilities such as stop-and-go waves, pose safety risks and lead to longer journey times. These phenomena occur spontaneously if the average vehicle density exceeds a certain critical value. This study focuses on modeling the traffic using a microscopic traffic flow model. A hybrid traffic model, which combines human-driven and controlled vehicles, is assumed. The controlled vehicles obey different driving policies when approaching the merge or in the vicinity of other vehicles. We developed a co-simulation model in SUMO (Simulation of Urban Mobility), in which the human-driven cars are modeled using the IDM model and the controlled cars are modeled using a dedicated controller. The scenario chosen for this study is a closed track with one merge and one exit, which could later be implemented on a scaled infrastructure in our lab setup. This will enable us to benchmark the simulation results of this study against comparable results obtained under similar conditions in the lab. The metrics chosen for comparing the effect of our algorithm on the overall traffic conditions include the average speed, the wait time near the merge, and the throughput after the merge, measured under different travel demand conditions (low, medium, and heavy traffic).
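
A minimal implementation of the IDM car-following acceleration used for the human-driven vehicles (SUMO also ships IDM as a built-in car-following model); the parameter values are typical textbook defaults, not necessarily those of the study:

```python
import math

def idm_acceleration(v, v_lead, gap,
                     v0=30.0,    # desired speed [m/s]
                     T=1.5,      # desired time headway [s]
                     a_max=1.0,  # maximum acceleration [m/s^2]
                     b=1.5,      # comfortable deceleration [m/s^2]
                     s0=2.0,     # minimum gap [m]
                     delta=4):
    """Intelligent Driver Model acceleration of a follower with speed v,
    leader speed v_lead and bumper-to-bumper gap 'gap'."""
    dv = v - v_lead                                   # approaching rate
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a_max * b)))
    return a_max * (1 - (v / v0) ** delta - (s_star / gap) ** 2)

# Follower at 25 m/s closing on a leader at 20 m/s, 30 m ahead -> strong braking:
print(idm_acceleration(v=25.0, v_lead=20.0, gap=30.0))
```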

Keywords: highway merges, traffic modeling, SUMO, driving policy

Procedia PDF Downloads 103
1211 The Importance of Downstream Supply Chain in Supply Chain Risk Management: Multi-Objective Optimization

Authors: Zohreh Khojasteh-Ghamari, Takashi Irohara

Abstract:

One of the most efficient ways to manage supply chain risk is to avoid interruptions in the supply chain (SC) before they occur. Although the majority of organizations focus on their first-tier suppliers to avoid risk in the SC, studies show that first-tier suppliers are the cause in only 60 percent of disruption cases. In the remaining 40 percent of SC disruptions, the cause lies in the downstream SC, that is, the second tier and lower. Due to the increasing complexity and interrelation of modern supply chains, the SC elements have become difficult to trace. Moreover, studies show that there is a vital need to better understand the integration of risk and visibility, especially in the context of multiple objectives. In this study, we propose a multi-objective programming model to avoid disruption in the SC. The objective of this study is to evaluate the effect of downstream supply chain visibility (SCV) on managing supply chain risk. We propose a multi-objective mathematical programming model with the objective functions of minimizing the total cost and maximizing the downstream SCV, with supplier selection as the decision variable. We assume there are several manufacturers and several candidate suppliers; for each manufacturer, our model proposes the suppliers with the lowest cost and maximum visibility in the downstream supply chain. We examine the applicability of the model with numerical examples, define several scenarios for the datasets and observe the tendencies. The results show that a minimum level of visibility in the downstream SC is needed to have a safe SC network.
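
A hedged sketch of the supplier-selection decision as a weighted-sum scalarization of the two objectives (cost and downstream visibility), written with PuLP; the data, weights and single-sourcing constraint are illustrative assumptions rather than the paper's exact formulation:

```python
import pulp

manufacturers = ["M1", "M2"]
suppliers = ["S1", "S2", "S3"]
cost = {("M1", "S1"): 10, ("M1", "S2"): 8,  ("M1", "S3"): 12,
        ("M2", "S1"): 9,  ("M2", "S2"): 11, ("M2", "S3"): 7}
# Downstream supply chain visibility score of each supplier (higher is better)
visibility = {"S1": 0.9, "S2": 0.5, "S3": 0.7}

w_cost, w_vis = 1.0, 10.0   # illustrative weights of the scalarized objective

prob = pulp.LpProblem("supplier_selection", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", (manufacturers, suppliers), cat="Binary")

# Weighted sum: minimize cost and (negated) downstream visibility
prob += pulp.lpSum(w_cost * cost[m, s] * x[m][s]
                   - w_vis * visibility[s] * x[m][s]
                   for m in manufacturers for s in suppliers)

# Each manufacturer selects exactly one supplier (illustrative assumption)
for m in manufacturers:
    prob += pulp.lpSum(x[m][s] for s in suppliers) == 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for m in manufacturers:
    for s in suppliers:
        if x[m][s].value() == 1:
            print(m, "->", s)
```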

Keywords: downstream supply chain, optimization, supply chain risk, supply chain visibility

Procedia PDF Downloads 241
1210 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation

Authors: Abdal-Hafeez Alhussein

Abstract:

Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.

Keywords: artificial intelligence, information technology, automation, scalability

Procedia PDF Downloads 3
1209 Fuzzy Inference-Assisted Saliency-Aware Convolution Neural Networks for Multi-View Summarization

Authors: Tanveer Hussain, Khan Muhammad, Amin Ullah, Mi Young Lee, Sung Wook Baik

Abstract:

The big data generated by distributed vision sensors installed on a large scale in smart cities create hurdles for their efficient and beneficial exploration in browsing, retrieval, and indexing. This paper presents a three-fold framework for effective video summarization of such data, providing a compact and representative format of big video data. In the first fold, the framework acquires input video data from the installed cameras and collects clues, such as the type and count of objects and the clarity of the view, from a chunk of a pre-defined number of frames of each view. In the second fold, the decision on representative view selection for a particular interval is made by a fuzzy inference system, yielding a precise, human-like decision reinforced by the known clues. In the third fold, the selected view frames are forwarded to the summary generation mechanism, which is supported by a saliency-aware convolutional neural network (CNN) model. The combination of fuzzy rules for view selection followed by a CNN architecture for saliency computation makes the multi-view video summarization (MVS) framework a suitable candidate for real-world practice in smart cities.
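
A hand-rolled sketch of the fuzzy view-selection step, scoring each view from two of the clues mentioned above (object count and view clarity) with triangular memberships and Mamdani-style rules; the membership functions and rules are illustrative, not the paper's fuzzy inference system:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def view_score(object_count, clarity):
    """Fuzzy score of a camera view from two clues: object_count in [0, 20]
    and clarity in [0, 1]; both ranges are assumptions."""
    many_objects = tri(object_count, 2, 10, 20)
    few_objects = tri(object_count, -1, 0, 4)
    clear = tri(clarity, 0.4, 1.0, 1.6)
    blurry = tri(clarity, -0.6, 0.0, 0.6)

    # Rules combined by a weighted average of rule outputs
    rules = [
        (min(many_objects, clear), 1.0),   # many objects, clear view -> keep
        (min(few_objects, blurry), 0.0),   # empty, blurry view -> discard
        (min(many_objects, blurry), 0.5),  # busy but blurry -> medium
    ]
    num = sum(strength * out for strength, out in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den > 0 else 0.0

views = {"cam_1": (8, 0.9), "cam_2": (1, 0.2), "cam_3": (12, 0.5)}
best = max(views, key=lambda v: view_score(*views[v]))
print(best)  # representative view forwarded to the saliency-aware CNN
```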

Keywords: big video data analysis, fuzzy logic, multi-view video summarization, saliency detection

Procedia PDF Downloads 183
1208 Technologic Information about Photovoltaic Applied in Urban Residences

Authors: Stephanie Fabris Russo, Daiane Costa Guimarães, Jonas Pedro Fabris, Maria Emilia Camargo, Suzana Leitão Russo, José Augusto Andrade Filho

Abstract:

Among renewable energy sources, solar energy is the one that has stood out. Solar radiation can be used as a thermal energy source and can also be converted into electricity through its effect on certain materials, as in thermoelectric and photovoltaic panels. These panels are often used to generate energy in homes, buildings, arenas, etc., and have low pollution emissions. Thus, a technological prospecting study was performed to find patents related to the use of photovoltaic panels in urban residences. The patent search was carried out on ESPACENET using the keywords photovoltaic and home in the title and abstract fields, where 136 patent documents were found for the period 1994-2015. The years 2009, 2010, 2011, 2012, 2013 and 2014 had the highest numbers of applications, with 11, 13, 23, 29, 15 and 21, respectively. Regarding the countries in which this technology was deposited, China clearly leads with 67 patent filings, followed by Japan with 38 patent applications. It is worth noting that 50% of the depositors are companies, 44% are individual inventors and only 6% are universities. Regarding the International Patent Classification (IPC) codes, the most frequent classification in the results was H02J3/38, which covers arrangements for the parallel feeding of a single network by two or more generators, converters or transformers. Among all categories, section H, which means Electricity, stands out with 70% of the patents.

Keywords: photovoltaic, urban residences, technology forecasting, prospecting

Procedia PDF Downloads 294
1207 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a process for managing energy consumption with a view to energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes: it identifies individual appliances by analyzing the whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection and feature extraction, and ends with general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature; LPG is a numerical tool that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect and facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW; to our best knowledge, few unsupervised techniques have been employed with low-sampling-rate data in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features, and those signatures are then used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
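
A minimal pure-Python DTW distance of the kind used to match a detected event against general appliance signatures; the toy 1/60 Hz profiles below are illustrative, and no claim is made about the exact DTW variant used in the work:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between 1-D sequences a and b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy 1/60 Hz power profiles (W): a detected event vs. two appliance signatures
event = [0, 0, 1200, 1250, 1230, 0, 0]
kettle_signature = [0, 1210, 1240, 1220, 0]
fridge_signature = [0, 90, 95, 92, 90, 0]

for name, sig in [("kettle", kettle_signature), ("fridge", fridge_signature)]:
    print(name, dtw_distance(event, sig))
# The smaller DTW distance identifies the appliance most similar to the event.
```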

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 73
1206 Timing and Probability of Presurgical Teledermatology: Survival Analysis

Authors: Felipa de Mello-Sampayo

Abstract:

The aim of this study is to examine, from the patient's perspective, the timing and probability of using teledermatology, comparing it with a conventional referral system. The dynamic stochastic model's main added value lies in its concrete application to patients waiting for dermatological surgical intervention. Patients with low health level uncertainty must use teledermatology treatment as soon as possible, which is precisely when teledermatology is least valuable. The results of the model were then tested empirically with the teledermatology network covering the area served by the Hospital Garcia da Horta, Portugal, which links the primary care centers of 24 health districts with the hospital's dermatology department via the corporate intranet of the Portuguese healthcare system. Health level volatility can be understood as the hazard of developing skin cancer, and the trend of the health level as the bias towards developing skin lesions. The results of the survival analysis suggest that the theoretical model can explain the use of teledermatology: it depends negatively on the volatility of patients' health and positively on the trend of health, i.e., the lower the risk of developing skin cancer and the younger the patients, the more presurgical teledermatology one expects to occur. Presurgical teledermatology also depends positively on out-of-pocket expenses and negatively on the opportunity costs of teledermatology, i.e., the lower the benefit missed by using teledermatology, the more presurgical teledermatology one expects to occur.
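
A hedged sketch of the kind of survival model such a study might fit (a Cox proportional-hazards model of the waiting time until teledermatology is used), on synthetic data whose covariates are named after the drivers discussed above; the data-generating assumptions and the use of the lifelines package are illustrative only:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300

# Synthetic patients: covariate names echo the model's drivers (illustrative)
df = pd.DataFrame({
    "health_volatility": rng.uniform(0.1, 1.0, n),  # hazard of developing skin cancer
    "health_trend": rng.uniform(-0.5, 0.5, n),      # drift of the health level
    "out_of_pocket": rng.uniform(0, 50, n),
})
# Synthetic waiting time (days) until teledermatology is used: higher volatility
# delays use and a better trend brings it forward, mirroring the stated signs.
scale = 60 * np.exp(0.8 * df["health_volatility"] - 0.5 * df["health_trend"])
df["wait_days"] = rng.exponential(scale)
df["used_teledermatology"] = (rng.uniform(size=n) < 0.85).astype(int)  # censoring flag

cph = CoxPHFitter()
cph.fit(df, duration_col="wait_days", event_col="used_teledermatology")
cph.print_summary()  # hazard ratios for volatility, trend and out-of-pocket costs
```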

Keywords: teledermatology, wait time, uncertainty, opportunity cost, survival analysis

Procedia PDF Downloads 124
1205 Distributed Coordination of Connected and Automated Vehicles at Multiple Interconnected Intersections

Authors: Zhiyuan Du, Baisravan Hom Chaudhuri, Pierluigi Pisu

Abstract:

In connected vehicle systems, where wireless communication is available among the involved vehicles and intersection controllers, it is possible to design an intersection coordination strategy that lets connected and automated vehicles (CAVs) travel through road intersections without conventional traffic light control. In this paper, we present a distributed coordination strategy for CAVs at multiple interconnected intersections that aims at improving system fuel efficiency and system mobility. We present a distributed control solution in which, at the higher level, the intersection controllers calculate the desired average road velocity and optimally assign a reference velocity to each vehicle, while at the lower level every vehicle uses model predictive control (MPC) to track the reference velocity obtained from the higher-level controller. The proposed method has been implemented in a simulation-based case study with a network of two interconnected intersections, and the effect of mixed vehicle types on the coordination strategy has also been explored. Simulation results indicate that the proposed method improves vehicle fuel efficiency and traffic mobility.
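
A minimal sketch of the lower-level velocity-tracking MPC for a single CAV, written with cvxpy; the simple integrator dynamics, horizon length and cost weights are assumptions for illustration, not the authors' vehicle model:

```python
import cvxpy as cp

dt, N = 0.5, 20      # time step [s] and prediction horizon
v_ref = 12.0         # reference velocity from the intersection controller [m/s]
v0 = 8.0             # current vehicle speed [m/s]
a_max = 2.0          # acceleration limits [m/s^2]

v = cp.Variable(N + 1)   # predicted speeds
a = cp.Variable(N)       # control input: accelerations

cost = cp.sum_squares(v[1:] - v_ref) + 0.1 * cp.sum_squares(a)
constraints = [v[0] == v0, v >= 0, cp.abs(a) <= a_max]
constraints += [v[k + 1] == v[k] + dt * a[k] for k in range(N)]

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()

print("first acceleration command:", round(float(a.value[0]), 3), "m/s^2")
# In a receding-horizon loop only a[0] is applied; the problem is re-solved at
# the next step with the updated speed and (possibly) a new reference velocity.
```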

Keywords: connected vehicles, automated vehicles, intersection coordination systems, multiple interconnected intersections, model predictive control

Procedia PDF Downloads 353
1204 An Approach to Control Electric Automotive Water Pumps Deploying Artificial Neural Networks

Authors: Gabriel S. Adesina, Ruixue Cheng, Geetika Aggarwal, Michael Short

Abstract:

With the global shift towards sustainability and technological advancement, electric hybrid vehicles (EHVs) are increasingly seen as viable alternatives to traditional internal combustion (IC) engine vehicles, and they also require efficient cooling systems. The electric automotive water pump (AWP) has been introduced as an alternative to IC-engine belt-driven pump systems. However, current control methods for AWPs typically employ fixed gain settings, which are not ideal for the varying conditions of dynamic vehicle environments and can potentially lead to overheating. To overcome the limitations of fixed-gain control, this paper proposes implementing an artificial neural network (ANN) for managing the AWP in EHVs. The proposed ANN provides an intelligent, adaptive control strategy that enhances the AWP's performance, as supported by the MATLAB simulation work illustrated in this paper. Comparative analysis demonstrates that the ANN-based controller surpasses conventional PID and fuzzy logic-based controllers (FLC), exhibiting no overshoot, a rapid response of 0.1 s, and an IAE of 0.0696. Consequently, the findings suggest that ANNs can be effectively utilized in EHVs.
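
A hedged sketch of learning a pump-control mapping with a small neural network, here a scikit-learn MLP trained to imitate a simple hand-written duty-cycle rule on synthetic data; the inputs, targets and network size are illustrative, since the paper's MATLAB ANN and training data are not reproduced:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic operating points: coolant temperature error [K] and engine load [0-1]
temp_error = rng.uniform(-10, 30, 2000)
load = rng.uniform(0, 1, 2000)
X = np.column_stack([temp_error, load])

# Illustrative target pump duty cycle in [0, 1]: a simple rule standing in for
# the reference controller / plant data used to train the real ANN
y = np.clip(0.03 * temp_error + 0.4 * load, 0, 1)

ann = MLPRegressor(hidden_layer_sizes=(16, 16), activation="relu",
                   max_iter=2000, random_state=0)
ann.fit(X, y)

# Inference: duty-cycle command for a hot coolant condition at medium load
print(ann.predict([[15.0, 0.5]]))
```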

Keywords: automotive water pump, cooling system, electric hybrid vehicles, artificial neural networks, PID control, fuzzy logic control, IAE, MATLAB

Procedia PDF Downloads 28
1203 Development of Partial Discharge Defect Recognition and Status Diagnosis System with Adaptive Deep Learning

Authors: Chien-kuo Chang, Bo-wei Wu, Yi-yun Tang, Min-chiu Wu

Abstract:

This paper proposes a power equipment diagnosis system based on partial discharge (PD), characterized by improved readability of the experimental data and convenience of operation. The system integrates a variety of analysis programs with different data formats and programming languages and establishes a set of interfaces whose structure can be followed and extended, which also helps subsequent maintenance and innovation. This study shows a case of integrating a convolutional neural network (CNN) with the system, using the designed model architecture to simplify the complex training process. It is expected that the simplified training process can be used to establish an adaptive deep learning experimental structure: by selecting different test data for repeated training, the accuracy of the identification system can be enhanced. On this platform, the measurement status and partial discharge pattern of each piece of equipment can be checked in real time, the real-time identification function can be enabled, and various trained models can be used to carry out real-time PD insulation defect identification and insulation state diagnosis. When electric power equipment enters a dangerous period, it can be replaced early to avoid unexpected electrical accidents.
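
A hedged sketch of a small CNN classifier for PD pattern images in Keras; the input size, number of defect classes and layer choices are assumptions, not the architecture developed in the study:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Small CNN for classifying PD pattern images (e.g. phase-resolved PD maps)
# into defect classes; the 64x64 input and the 4 classes are illustrative.
num_classes = 4
model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Adaptive retraining as described above would call model.fit(...) repeatedly
# on newly selected measurement data to refine the identification accuracy.
```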

Keywords: partial discharge, convolutional neural network, partial discharge analysis platform, adaptive deep learning

Procedia PDF Downloads 72
1202 Intelligent Minimal Allocation of Capacitors in Distribution Networks Using Genetic Algorithm

Authors: S. Neelima, P. S. Subramanyam

Abstract:

A distribution system is the interface between the bulk power system and the consumers. Among these systems, the radial distribution system is popular because of its low cost and simple design. In distribution systems, the bus voltages decrease with distance from the substation and the losses are high. The reason for the voltage decrease and high losses is an insufficient amount of reactive power, which can be provided by shunt capacitors; however, placing capacitors of appropriate size is always a challenge. Thus, the optimal capacitor placement problem is to determine the locations and sizes of the capacitors to be placed in the distribution network so as to efficiently reduce the power losses and improve the voltage profile of the system. For this purpose, a two-stage methodology is used in this paper. In the first stage, the load flow of the pre-compensated distribution system is carried out using the 'dimension reducing distribution load flow algorithm' (DRDLFA), and on the basis of this load flow the potential locations for compensation are computed. In the second stage, a genetic algorithm (GA) technique is used to determine the optimal locations and sizes of the capacitors such that the cost of the energy losses and the capacitor cost are minimized. The method is tested on the IEEE 9-bus and 34-bus systems and compared with other methods in the literature.
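
A toy genetic algorithm for choosing capacitor locations and sizes, with a placeholder fitness function standing in for the DRDLFA load-flow evaluation used in the paper; the candidate buses, capacitor sizes and GA settings are illustrative:

```python
import random

random.seed(0)

BUSES = list(range(2, 10))          # candidate buses of a small radial feeder
SIZES_KVAR = [0, 150, 300, 450]     # 0 means "no capacitor at this bus"

def fitness(plan):
    """Surrogate cost = energy-loss cost + capacitor cost. In the paper this
    evaluation comes from the DRDLFA load flow; here it is a simple placeholder
    in which losses shrink as total compensation approaches 1500 kvar."""
    total_kvar = sum(plan.values())
    loss_cost = (1500 - total_kvar) ** 2 / 1000.0
    cap_cost = 3.0 * total_kvar + 200.0 * sum(1 for s in plan.values() if s > 0)
    return loss_cost + cap_cost

def random_plan():
    return {b: random.choice(SIZES_KVAR) for b in BUSES}

def crossover(p1, p2):
    return {b: random.choice((p1[b], p2[b])) for b in BUSES}

def mutate(plan, rate=0.1):
    return {b: (random.choice(SIZES_KVAR) if random.random() < rate else s)
            for b, s in plan.items()}

population = [random_plan() for _ in range(40)]
for _ in range(60):                      # generations
    population.sort(key=fitness)
    parents = population[:10]            # elitist selection of the best plans
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(30)]

best = min(population, key=fitness)
print("best placement (bus -> kvar):", {b: s for b, s in best.items() if s})
```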

Keywords: dimension reducing distribution load flow algorithm, DRDLFA, genetic algorithm, electrical distribution network, optimal capacitors placement, voltage profile improvement, loss reduction

Procedia PDF Downloads 388