Search results for: embedded network
1049 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning
Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar
Abstract:
As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system achieves high-performance computing through collaboration between different computing resources. Without efficient resource scheduling strategies, the misuse of distributed computing may cause resource waste and high costs. Resource scheduling is usually an NP-hard problem, so no general solution exists; however, optimization algorithms such as the genetic algorithm and ant colony optimization are available. The large scale of distributed systems makes these traditional optimization algorithms difficult to apply, so heuristic and machine learning algorithms are usually employed to ease the computing load. We therefore review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that combines the perceptual ability of neural networks with the decision-making ability of reinforcement learning. Using this machine learning method, we try to identify important factors that influence the performance of distributed system computing and help the distributed system perform efficient computing resource scheduling. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize resource scheduling, and discusses the challenges and improvement directions for DRL-based resource scheduling algorithms.
Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence
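A minimal sketch of how reinforcement learning can be applied to a scheduling decision, assuming a toy setting of independent tasks placed on identical machines to minimize makespan; the tabular Monte Carlo control agent below is a deliberately simplified stand-in for the recurrent-network DRL approach discussed in the abstract, and the task durations are made up for illustration.

```python
import random

# Toy setting: independent tasks with known durations, assigned to identical machines
# so as to minimize the makespan (time at which the last machine finishes).
TASKS = [4, 2, 7, 3, 5, 1, 6]          # hypothetical task durations
N_MACHINES = 3

def makespan(assignment):
    loads = [0.0] * N_MACHINES
    for duration, machine in zip(TASKS, assignment):
        loads[machine] += duration
    return max(loads)

# State = index of the task being placed, action = machine choice.
Q = [[0.0] * N_MACHINES for _ in TASKS]
alpha, gamma, eps = 0.1, 0.95, 0.2

for episode in range(5000):
    # epsilon-greedy rollout over all tasks
    assignment = [random.randrange(N_MACHINES) if random.random() < eps
                  else max(range(N_MACHINES), key=lambda m: Q[t][m])
                  for t in range(len(TASKS))]
    G = -makespan(assignment)          # terminal reward: shorter makespan is better
    for t in reversed(range(len(TASKS))):
        a = assignment[t]
        Q[t][a] += alpha * (G - Q[t][a])
        G *= gamma                     # discount the return for earlier decisions

greedy = [max(range(N_MACHINES), key=lambda m: Q[t][m]) for t in range(len(TASKS))]
print("learned assignment:", greedy, "makespan:", makespan(greedy))
```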
Procedia PDF Downloads 111
1048 An Application of Path Planning Algorithms for Autonomous Inspection of Buried Pipes with Swarm Robots
Authors: Richard Molyneux, Christopher Parrott, Kirill Horoshenkov
Abstract:
This paper aims to demonstrate how various algorithms can be implemented within swarms of autonomous robots to provide continuous inspection within underground pipeline networks. Current methods of fault detection within pipes are costly, time consuming and inefficient. As such, solutions tend toward a more reactive approach, repairing faults, as opposed to proactively seeking leaks and blockages. The paper presents an efficient inspection method, showing that autonomous swarm robotics is a viable way of monitoring underground infrastructure. Tailored adaptations of various Vehicle Routing Problems (VRP) and path-planning algorithms provide a customised inspection procedure for complicated networks of underground pipes. The performance of multiple algorithms is compared to determine their effectiveness and feasibility. Notable inspirations come from ant colonies and stigmergy, graph theory, the k-Chinese Postman Problem (k-CPP) and traffic theory. Unlike most swarm behaviours which rely on fast communication between agents, underground pipe networks are a highly challenging communication environment with extremely limited communication ranges. This is due to the extreme variability in the pipe conditions and relatively high attenuation of acoustic and radio waves with which robots would usually communicate. This paper illustrates how to optimise the inspection process and how to increase the frequency with which the robots pass each other, without compromising the routes they are able to take to cover the whole network.
Keywords: autonomous inspection, buried pipes, stigmergy, swarm intelligence, vehicle routing problem
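A minimal sketch of one building block of such route planning, assuming a toy pipe graph in which every junction already has an even number of pipes: Hierholzer's algorithm then yields a closed route that traverses every pipe exactly once, i.e. the Chinese Postman tour on an Eulerian network. Splitting the tour among k robots and the communication-aware adaptations described above are not shown; the pipe network below is hypothetical.

```python
from collections import defaultdict

def eulerian_circuit(edges, start):
    """Hierholzer's algorithm: visit every undirected edge (pipe) exactly once."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    stack, circuit = [start], []
    while stack:
        u = stack[-1]
        if adj[u]:
            v = adj[u].pop()
            adj[v].remove(u)      # consume the edge in both directions
            stack.append(v)
        else:
            circuit.append(stack.pop())
    return circuit[::-1]

# Hypothetical pipe network in which every junction has an even number of pipes.
pipes = [("A", "B"), ("B", "C"), ("C", "A"), ("A", "D"),
         ("D", "C"), ("C", "E"), ("E", "A")]
route = eulerian_circuit(pipes, "A")
print(" -> ".join(route))
```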
Procedia PDF Downloads 166
1047 Automatic Classification of Lung Diseases from CT Images
Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari
Abstract:
Pneumonia is a kind of lung disease that creates congestion in the chest. Such pneumonic conditions can lead to loss of life because of the severity of the congestion. Pneumonic lung disease is caused by viral pneumonia, bacterial pneumonia, or COVID-19 induced pneumonia. The early prediction and classification of such lung diseases help to reduce the mortality rate. We propose an automatic Computer-Aided Diagnosis (CAD) system in this paper using a deep learning approach. The proposed CAD system takes as input raw computerized tomography (CT) scans of the patient's chest and automatically predicts the disease classification. We designed the Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are pre-processed first to enhance their quality for further analysis. We then applied a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D Convolutional Neural Network (CNN) model to extract the automatic features from the pre-processed CT image. This CNN model ensures feature learning with effective 1D feature extraction for each input CT image. The outcome of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model is related to training and classification using different classifiers. The simulation outcomes using the publicly available dataset prove the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
Keywords: CT scan, COVID-19, deep learning, image processing, lung disease classification
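A minimal sketch of the kind of hybrid pipeline the abstract describes, assuming grayscale CT slices resized to 128×128: a small 2D CNN produces a 1D feature vector per image, the features are Min-Max normalized, and a conventional classifier is trained on top. The layer sizes, the random data, and the choice of a random-forest classifier are illustrative assumptions, not the HDLA configuration.

```python
import numpy as np
from tensorflow import keras
from sklearn.preprocessing import MinMaxScaler
from sklearn.ensemble import RandomForestClassifier

# Hypothetical data: 200 grayscale CT slices, 3 classes (viral/bacterial/COVID-19 pneumonia).
X = np.random.rand(200, 128, 128, 1).astype("float32")
y = np.random.randint(0, 3, size=200)

# Small 2D CNN used purely as a feature extractor (one 1D feature vector per image).
cnn = keras.Sequential([
    keras.layers.Input(shape=(128, 128, 1)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.GlobalAveragePooling2D(),   # 32-dimensional feature vector
])
features = cnn.predict(X, verbose=0)

# Min-Max normalization of the CNN features, then a separate classifier on top.
features = MinMaxScaler().fit_transform(features)
clf = RandomForestClassifier(n_estimators=100).fit(features[:150], y[:150])
print("held-out accuracy:", clf.score(features[150:], y[150:]))
```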
Procedia PDF Downloads 155
1046 A Program Evaluation of TALMA Full-Year Fellowship Teacher Preparation
Authors: Emilee M. Cruz
Abstract:
Teachers take part in short-term teaching fellowships abroad, and their preparation before, during, and after the experience is critical to teachers' feelings of success in the international classroom. A program evaluation of the teacher preparation within TALMA: The Israel Program for Excellence in English (TALMA) full-year teaching fellowship was conducted. A questionnaire was developed that examined the professional development, deliberate reflection, and cultural and language immersion offered before, during, and after the short-term experience. The evaluation also surveyed teachers' feelings of preparedness for the Israeli classroom and any recommendations they had for future teacher preparation within the fellowship program. The review suggests the TALMA program should include integrated professional learning communities between fellows and Israeli co-teachers, more opportunities for immersive Hebrew language learning, a broader professional network with Israelis, opportunities for guided discussion with the TALMA community, and continued participation in TALMA events and learning following the full-year fellowship. Similar short-term international programs should consider the findings in the design of their participant preparation programs. The review also offers direction for future program evaluation of short-term participant preparation, including the need for frequent response item updates to match current offerings and evaluation of participant feelings of preparedness before, during, and after the full-year fellowship.
Keywords: educational program evaluation, international teaching, short-term teaching, teacher beliefs, teaching fellowship, teacher preparation
Procedia PDF Downloads 182
1045 SIP Flooding Attacks Detection and Prevention Using Shannon, Renyi and Tsallis Entropy
Authors: Neda Seyyedi, Reza Berangi
Abstract:
Voice over IP (VOIP), also known as Internet telephony, is growing rapidly and now occupies a large part of the communications market. With the growth of any technology, the related security issues become of particular importance. As this technology is adopted in different environments and offers numerous features, there is an increasing need to address its security threats. Being IP-based and playing a signaling role in VOIP networks, the Session Initiation Protocol (SIP) lets attackers exploit weaknesses of the protocol to disable VOIP service. One of the most important threats is the denial of service attack, one form of which, flooding attacks, is discussed in this article. These attacks waste server resources and deprive the server of delivering service to authorized users. Distributed denial of service attacks and low-rate attacks can mislead many attack detection mechanisms. In this paper, we introduce a mechanism which not only detects distributed denial of service attacks and low-rate attacks but can also identify the attackers accurately. We detect and prevent flooding attacks in the SIP protocol using Shannon (FDP-S), Renyi (FDP-R) and Tsallis (FDP-T) entropy. We conducted an experiment to compare the percentage of detection and the rate of false alarm messages using each of the Shannon, Renyi and Tsallis entropies as a measure of disorder. Implementation results show that, owing to the parametric nature of the Renyi and Tsallis entropies, different detection percentages and false alarm rates can be obtained by changing the parameters, making it possible to adjust the sensitivity of the detection mechanism.
Keywords: VOIP networks, flooding attacks, entropy, computer networks
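A minimal sketch of the entropy measures behind FDP-S/R/T, assuming the detector observes the distribution of SIP INVITE requests per source address within a time window: a flood dominated by few sources pushes the entropy away from its baseline, and the shift can be thresholded to raise an alarm. The window counts and the 0.5 threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi(p, alpha=2.0):
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, q=2.0):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def window_entropies(invite_counts_per_source):
    counts = np.asarray(invite_counts_per_source, dtype=float)
    p = counts / counts.sum()
    return shannon(p), renyi(p), tsallis(p)

baseline = window_entropies([12, 9, 11, 10, 8, 10])   # normal traffic, roughly uniform
attack   = window_entropies([500, 3, 2, 4, 1, 2])     # flood dominated by one source

for name, b, a in zip(("Shannon", "Renyi", "Tsallis"), baseline, attack):
    drop = b - a
    print(f"{name}: baseline={b:.3f} attack={a:.3f} drop={drop:.3f}",
          "ALARM" if drop > 0.5 else "ok")             # 0.5 is an illustrative threshold
```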
Procedia PDF Downloads 405
1044 The BETA Module in Action: An Empirical Study on Enhancing Entrepreneurial Skills through Kearney's and Bloom's Guiding Principles
Authors: Yen Yen Tan, Lynn Lam, Cynthia Lam, Angela Koh, Edwin Seng
Abstract:
Entrepreneurial education plays a crucial role in nurturing future innovators and change-makers. Over time, significant progress has been made in refining instructional approaches to develop the necessary skills among learners effectively. Two highly valuable frameworks, Kearney's "4 Principles of Entrepreneurial Pedagogy" and Bloom's "Three Domains of Learning," serve as guiding principles in entrepreneurial education. Kearney's principles align with experiential and student-centric learning, which are crucial for cultivating an entrepreneurial mindset. The potential synergies between these frameworks hold great promise for enhancing entrepreneurial acumen among students. However, despite this potential, their integration remains largely unexplored. This study aims to bridge this gap by building upon the Business Essentials through Action (BETA) module and investigating its contributions to nurturing the entrepreneurial mindset. This study employs a quasi-experimental mixed-methods approach, combining quantitative and qualitative elements to ensure comprehensive and insightful data. A cohort of 235 students participated, with 118 enrolled in the BETA module and 117 in a traditional curriculum. Their Personal Entrepreneurial Competencies (PECs) were assessed before admission (pre-Y1) and one year into the course (post-Y1) using a comprehensive 55-item PEC questionnaire, enabling measurement of critical traits such as opportunity-seeking, persistence, and risk-taking. Rigorous computations of individual entrepreneurial competencies and overall PEC scores were performed, including a correction factor to mitigate potential self-assessment bias. The orchestration of Kearney's principles and Bloom's domains within the BETA module necessitates a granular examination. Here, qualitative revelations surface, courtesy of structured interviews aligned with contemporary research methodologies. These interviews act as a portal, ushering us into the transformative journey undertaken by students. Meanwhile, the study pivots to explore the BETA module's influence on students' entrepreneurial competencies from the vantage point of faculty members. A symphony of insights emanates from intimate focus group discussions featuring six dedicated lecturers, who share their perceptions, experiences, and reflective narratives, illuminating the profound impact of pedagogical practices embedded within the BETA module. Preliminary findings from ongoing data analysis indicate promising results, showcasing a substantial improvement in entrepreneurial skills among students participating in the BETA module. This study promises not only to elevate students' entrepreneurial competencies but also to illuminate the broader canvas of applicability for Kearney's principles and Bloom's domains. The dynamic interplay of quantitative analyses, proffering precise competency metrics, and qualitative revelations, delving into the nuanced narratives of transformative journeys, engenders a holistic understanding of this educational endeavour. Through a rigorous quasi-experimental mixed-methods approach, this research aims to establish the BETA module's effectiveness in fostering entrepreneurial acumen among students at Singapore Polytechnic, thereby contributing valuable insights to the broader discourse on educational methodologies.
Keywords: entrepreneurial education, experiential learning, pedagogical frameworks, innovative competencies
Procedia PDF Downloads 64
1043 Internet of Things for Smart Dedicated Outdoor Air System in Buildings
Authors: Dararat Tongdee, Surapong Chirarattananon, Somchai Maneewan, Chantana Punlek
Abstract:
Recently, the Internet of Things (IoT) has become an important technology that connects devices to the network and gives people access to real-time communication. This technology is used to report, collect, and analyze big data in pursuit of a given purpose. For a smart building, there are many IoT technologies that enable management and building operators to improve occupant thermal comfort, indoor air quality, and building energy efficiency. In this research, we propose monitoring and controlling the performance of a smart dedicated outdoor air system (SDOAS) based on an IoT platform. The SDOAS was specifically designed with a desiccant unit and a thermoelectric module. The designed system was intended to monitor, notify, and control indoor environmental factors such as temperature, humidity, and carbon dioxide (CO₂) level. The SDOAS was tested against the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE 62.2) ventilation and indoor air quality standard. The system notifies the user through a Blynk notification when the status of the building is uncomfortable or when the tolerable limits that were set are reached. The user can then control the system via the Blynk application on a smartphone. The experimental results indicate that the temperature and humidity of indoor fresh air in the comfort zone are approximately 26 degrees Celsius and 58%, respectively. Furthermore, the CO₂ level was kept below 1000 ppm, as required by the indoor air quality standard. Therefore, the proposed system works efficiently and is easy to use in buildings.
Keywords: internet of things, indoor air quality, smart dedicated outdoor air system, thermal comfort
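A minimal sketch of the monitoring-and-notification logic described above, assuming sensor readings arrive as a dictionary and that alerts are pushed to the user's phone; `read_sensors` and `notify_user` are hypothetical stand-ins for the real sensor drivers and the Blynk notification call, and the limit values loosely follow the figures quoted in the abstract.

```python
import random
import time

# Setpoints loosely based on the abstract: ~26 degrees C, ~58 % RH, CO2 below 1000 ppm.
LIMITS = {"temperature": (24.0, 28.0), "humidity": (40.0, 65.0), "co2": (0.0, 1000.0)}

def read_sensors():
    """Hypothetical sensor read; replace with the real SDOAS drivers."""
    return {"temperature": random.uniform(22, 32),
            "humidity": random.uniform(35, 75),
            "co2": random.uniform(400, 1400)}

def notify_user(message):
    """Hypothetical stand-in for a Blynk push notification."""
    print("NOTIFY:", message)

def control_step(readings):
    for name, value in readings.items():
        low, high = LIMITS[name]
        if not (low <= value <= high):
            notify_user(f"{name} out of range: {value:.1f}")
            return True          # caller would switch on the desiccant/thermoelectric unit
    return False

for _ in range(5):               # a few iterations of the monitoring loop
    control_step(read_sensors())
    time.sleep(0.1)
```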
Procedia PDF Downloads 199
1042 The Prevalence and Impact of Anxiety Among Medical Students in the MENA Region: A Systematic Review, Meta-Analysis, and Meta-Regression
Authors: Kawthar F. Albasri, Abdullah M. AlHudaithi, Dana B. AlTurairi, Abdullaziz S. AlQuraini, Adoub Y. AlDerazi, Reem A. Hubail, Haitham A. Jahrami
Abstract:
Several studies have found that medical students have a significant prevalence of anxiety. The purpose of this review paper is to carefully evaluate the current research on anxiety among medical students in the MENA region and, as a result, estimate the prevalence of these disturbances. Multiple databases, including CINAHL (Cumulative Index to Nursing and Allied Health Literature), the Cochrane Library, Embase, MEDLINE (Medical Literature Analysis and Retrieval System Online), PubMed, PsycINFO (Psychological Information Database), Scopus, Web of Science, UpToDate, ClinicalTrials.gov, the WHO Global Health Library, EbscoHost, ProQuest, JAMA Network, and ScienceDirect, were searched. The reference lists of the retrieved articles were rigorously searched and rated for quality. A random effects meta-analysis was performed to compute estimates. The current meta-analysis revealed an alarming estimated pooled prevalence of anxiety (K = 46, N = 27023) of 52.5% [95% CI: 43.3%–61.6%]. A total of 62.0% [95% CI 42.9%; 78.0%] of the students (K = 18, N = 16466) suffered from anxiety during the COVID-19 pandemic, while 52.5% [95% CI 43.3%; 61.6%] had anxiety before COVID-19. Based on the GAD-7 measure, a total of 55.7% [95% CI 30.5%; 78.3%] of the students (K = 10, N = 5830) had anxiety, and a total of 54.7% of the students (K = 18, N = 12154) [95% CI 42.8%; 66.0%] had anxiety using the DASS-21 or 42 measure. Anxiety is a common and genuine problem among medical students. Further research should be conducted post-COVID-19, with a focus on anxiety prevention and intervention initiatives for medical students.
Keywords: anxiety, medical students, MENA, meta-analysis, prevalence
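A minimal sketch of a DerSimonian-Laird random-effects pooling of prevalence estimates, the kind of computation behind the pooled figures reported above; the three study proportions and sample sizes below are made up for illustration, and published meta-analyses typically apply a variance-stabilizing transformation to the proportions first.

```python
import numpy as np

# Hypothetical studies: (cases, sample size)
studies = [(220, 400), (310, 650), (150, 250)]

p = np.array([c / n for c, n in studies])
v = np.array([pi * (1 - pi) / n for pi, (_, n) in zip(p, studies)])   # within-study variance

# DerSimonian-Laird estimate of the between-study variance tau^2
w_fixed = 1.0 / v
p_fixed = np.sum(w_fixed * p) / np.sum(w_fixed)
Q = np.sum(w_fixed * (p - p_fixed) ** 2)
df = len(studies) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

# Random-effects pooled prevalence and 95% confidence interval
w = 1.0 / (v + tau2)
pooled = np.sum(w * p) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled prevalence = {pooled:.3f} "
      f"[{pooled - 1.96 * se:.3f}; {pooled + 1.96 * se:.3f}]")
```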
Procedia PDF Downloads 73
1041 Optimized Techniques for Reducing the Reactive Power Generation in Offshore Wind Farms in India
Authors: Pardhasaradhi Gudla, Imanual A.
Abstract:
The electrical power generated offshore needs to be transmitted to the onshore grid using subsea cables. Long subsea cables produce reactive power, which should be compensated in order to limit transmission losses, to optimize the transmission capacity, and to keep the grid voltage within safe operational limits. The installation cost of a wind farm includes the structure design cost and the electrical system cost. India has targeted achieving 175 GW of renewable energy capacity by 2022, including offshore wind power generation. Because the sea depth is greater in India, the installation cost will be higher than in European countries, where offshore wind energy is already being generated successfully. Innovations are therefore required to reduce the cost of offshore wind power projects. This paper presents optimized techniques to reduce the installation cost of an offshore wind farm with respect to the electrical transmission system. It describes techniques for increasing the current carrying capacity of the subsea cable by decreasing its reactive power generation (capacitance effect). Many methods for reactive power compensation in wind power plants are already in use, and the main reason compensation is needed is the capacitance effect of the subsea cable. If the cable capacitance is reduced, the reactive power compensation requirement can be reduced or optimized by avoiding the intermediate substation at the midpoint of the transmission network.
Keywords: offshore wind power, optimized techniques, power system, subsea cable
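A minimal sketch of why long subsea export cables demand compensation: the three-phase charging reactive power grows linearly with cable length and capacitance, Q_c = 2·π·f·C·U²·L (with U the line-to-line voltage and C the per-phase capacitance per km). The cable parameters below are illustrative assumptions, not data from an Indian project.

```python
import math

f = 50.0           # grid frequency, Hz
U = 220e3          # line-to-line export voltage, V (illustrative)
C = 0.19e-6        # cable capacitance per phase, F/km (illustrative HVAC cable value)
omega = 2 * math.pi * f

for length_km in (20, 50, 100):
    q_mvar = omega * C * length_km * U ** 2 / 1e6   # three-phase charging reactive power
    print(f"{length_km:>4} km cable: ~{q_mvar:6.1f} MVAr of charging reactive power")
```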
Procedia PDF Downloads 193
1040 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture
Authors: Sajjad Akbar, Rabia Bashir
Abstract:
With the growth of the number of users, Internet usage has evolved, and owing to its key design principles the Internet has expanded enormously in size. This tremendous growth has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, and trust. Users are more interested in content than in the peer nodes they communicate with. The current Internet architecture is a host-centric networking approach, which is not well suited to these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient because it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than a sender-oriented approach and introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, it has attracted much criticism, mainly concerning how ICN will manage the most relevant content. Web Content Mining (WCM) approaches can help with appropriate data management in ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.
Keywords: agent based web content mining, content centric networking, information centric networking
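A minimal sketch of the receiver-driven, name-based retrieval idea underlying CCN, on which the proposed architecture builds: consumers send Interests for named content, and any node holding a matching Data packet in its content store answers from the cache. The node names and content name are made up, and the agent-based content-mining component that would rank the "most appropriate" data is not modeled here.

```python
class CCNNode:
    """Toy Content-Centric Networking node: a name-based cache plus an upstream provider."""

    def __init__(self, name, upstream=None):
        self.name = name
        self.content_store = {}          # content name -> data
        self.upstream = upstream         # next node toward the producer

    def publish(self, content_name, data):
        self.content_store[content_name] = data

    def interest(self, content_name):
        """Receiver-oriented lookup: satisfy from cache, else forward the Interest upstream."""
        if content_name in self.content_store:
            print(f"{self.name}: cache hit for {content_name}")
            return self.content_store[content_name]
        if self.upstream is None:
            return None
        data = self.upstream.interest(content_name)
        if data is not None:
            self.content_store[content_name] = data    # cache on the way back
        return data

producer = CCNNode("producer")
edge = CCNNode("edge-router", upstream=producer)
producer.publish("/news/video/segment1", b"...")
edge.interest("/news/video/segment1")   # fetched from the producer, cached at the edge
edge.interest("/news/video/segment1")   # second request served from the edge cache
```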
Procedia PDF Downloads 475
1039 Multi-Objective Optimization of Intersections
Authors: Xiang Li, Jian-Qiao Sun
Abstract:
As a crucial component of the city traffic network, intersections have a significant impact on urban traffic performance. Despite the rapid development of transportation systems, increasing traffic volumes result in severe congestion, especially at intersections in urban areas. Effective regulation of vehicle flows at intersections has always been an important issue in traffic control systems. This study presents a multi-objective optimization method for intersections using cellular automata to achieve better traffic performance. Vehicle conflicts and pedestrian interference are considered. Three categories of traffic performance are studied: transportation efficiency, energy consumption, and road safety. The left-turn signal type, signal timing, and lane assignment are optimized for different traffic flows. The multi-objective optimization problem is solved with the cell mapping method. The optimization results show the conflicting nature of the different traffic performance measures. The influence of different traffic variables on intersection performance is investigated. It is observed that the proposed optimization method is effective in regulating traffic at the intersection to meet multiple objectives. Transportation efficiency can usually be improved by the permissive left-turn signal, which sacrifices safety. Right-turn traffic suffers significantly when the right-turn lanes are shared with through vehicles. The effect of vehicle flow on intersection performance is significant, and the pattern of the optimization results can change remarkably with variations in traffic volume. Pedestrians strongly interfere with the traffic system.
Keywords: cellular automata, intersection, multi-objective optimization, traffic system
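A minimal sketch of the multi-objective bookkeeping involved: given candidate signal/lane configurations evaluated on the three performance measures (all expressed so that smaller is better), only the non-dominated ones are kept; they form the Pareto set from which a trade-off is chosen. The candidate values are illustrative, and the cell mapping solver itself is not reproduced.

```python
import numpy as np

# Rows: candidate intersection configurations.
# Columns: [average delay (s), energy consumption (kJ/veh), conflict index], smaller is better.
candidates = np.array([
    [32.0, 110.0, 0.20],
    [28.0, 130.0, 0.35],
    [40.0,  95.0, 0.15],
    [33.0, 115.0, 0.25],   # dominated by the first candidate
    [27.0, 140.0, 0.22],
])

def pareto_mask(points):
    """True for points not dominated by any other point (minimization)."""
    n = len(points)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(points[j] <= points[i]) and np.any(points[j] < points[i]):
                keep[i] = False
                break
    return keep

print(candidates[pareto_mask(candidates)])
```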
Procedia PDF Downloads 580
1038 Vascularized Adipose Tissue Engineering by Using Adipose ECM/Fibroin Hydrogel
Authors: Alisan Kayabolen, Dilek Keskin, Ferit Avcu, Andac Aykan, Fatih Zor, Aysen Tezcaner
Abstract:
Adipose tissue engineering is a promising field for the regeneration of soft tissue defects. However, only very thin implants can be used in vivo, since vascularization is still a problem for thick implants. Another problem is finding a biocompatible scaffold with good mechanical properties. The aim of this study is to develop a thick vascularized adipose tissue that will integrate with the host and to perform its in vitro and in vivo characterization. For this purpose, a hydrogel of decellularized adipose tissue (DAT) and fibroin was produced, and both endothelial cells and adipocytes that were differentiated from adipose-derived stem cells were encapsulated in this hydrogel. Mixing DAT with fibroin allowed rapid gel formation by vortexing. It also made it possible to adjust the mechanical strength by changing the fibroin-to-DAT ratio. Based on compression tests, the gel with the DAT/fibroin ratio whose mechanical properties were most similar to adipose tissue was selected for cell culture experiments. In vitro characterization showed that DAT is not cytotoxic; on the contrary, it contains many natural ECM components, which provide biocompatibility and bioactivity. Subcutaneous implantation of the hydrogels resulted in no immunogenic reaction or infection. Moreover, localized empty hydrogels gelled successfully around the host vessel with the required shape. Implantation of cell-encapsulated hydrogels and histological analyses are under study. It is expected that endothelial cells inside the hydrogel will form a capillary network and bind to the host vessel passing through the hydrogel.
Keywords: adipose tissue engineering, decellularization, encapsulation, hydrogel, vascularization
Procedia PDF Downloads 528
1037 Binderless Naturally-extracted Metal-free Electrocatalyst for Efficient NOₓ Reduction
Authors: Hafiz Muhammad Adeel Sharif, Tian Li, Changping Li
Abstract:
Recently, the emission of nitrogen and sulphur oxides (NOₓ, SO₂) has become a global issue, posing serious threats to health and the environment. Catalytic reduction of NOₓ and SOₓ gases into environmentally benign gases is considered one of the best approaches. However, regeneration of the catalyst, the high bond-dissociation energy of NOₓ (150.7 kcal/mol), the escape of an intermediate gas (N₂O, a greenhouse gas) with the treated flue gas, and the limited activity of the catalyst remain great challenges. Here, a cheap, binderless, naturally extracted bass-wood thin carbon electrode (TCE) is presented, which shows excellent catalytic activity towards NOₓ reduction. The bass wood was carbonized at 900 ℃, followed by thermal activation in the presence of CO₂ gas at 750 ℃. The thermal activation resulted in an increase in epoxy groups on the surface of the TCE and an enhancement of the surface area as well as the degree of graphitization. The TCE has a unique, strongly interconnected 3D network of hierarchical micro/meso/macropores that allows a large electrode/electrolyte interface. Owing to these characteristics, the TCE exhibited excellent catalytic efficiency towards NOₓ (~83.3%) under ambient conditions, an enhanced catalytic response under pH and sulphite exposure, and excellent stability up to 168 hours. Moreover, a temperature-dependent activity trend was found, where the highest catalytic activity was achieved at 80 ℃; beyond this, the electrolyte became evaporative and the performance decreased. The designed electrocatalyst shows great potential for effective NOₓ reduction and is highly cost-effective, green, and sustainable.
Keywords: electrocatalyst, NOx-reduction, bass-wood electrode, integrated wet-scrubbing, sustainable
Procedia PDF Downloads 77
1036 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects
Authors: Ma Yuzhe, Burra Venkata Durga Kumar
Abstract:
The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is also rising to handle more complex processing and operation. However, the operating system, which is the soul of the computer, has at times stagnated in its development. Faced with the high price of UNIX (Uniplexed Information and Computing System), many personal computer owners could only give it up. DOS (Disk Operating System) is too simple to support much innovation, so it is not a good choice either, and macOS is a special operating system for Apple computers that cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of these operating systems and is built from many kernel modules, giving it a relatively powerful core architecture. Linux supports all Internet protocols, so it has very good networking functions. Linux supports multiple users, and users cannot interfere with each other's files. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system, and users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is constantly upgraded and improved, and many different versions have been released, suitable for both community and commercial use. The Linux system has good security because it relies on its file permission system. However, due to the constant emergence of vulnerabilities and hazards, the security of the operating system in use also needs more attention. This article focuses on the analysis and discussion of Linux security issues.
Keywords: Linux, operating system, system management, security
Procedia PDF Downloads 108
1035 Stem Cell Fate Decision Depending on TiO2 Nanotubular Geometry
Authors: Jung Park, Anca Mazare, Klaus Von Der Mark, Patrik Schmuki
Abstract:
In the clinical application of TiO2 implants for tooth and hip replacement, the migration, adhesion, and differentiation of neighboring mesenchymal stem cells onto implant surfaces are critical steps for successful bone regeneration. In the last decade, increasing attention has been paid to nanoscale electrochemical surface modifications of the TiO2 layer for improving bone-TiO2 surface integration. We generated, on titanium surfaces, self-assembled layers of vertically oriented TiO2 nanotubes with defined diameters between 15 and 100 nm, and here we show that mesenchymal stem cells finely sense the TiO2 nanotubular geometry and quickly decide their cell fate, either differentiating into osteoblasts or undergoing programmed cell death (apoptosis) on the TiO2 nanotube layers. These cell fate decisions critically depend on nanotube size differences (15-100 nm in diameter) sensed by integrin clustering. We further demonstrate that nanoscale topography sensing occurs not only in mesenchymal stem cells but appears to be a generalized mechanism of nanoscale microenvironment-cell interaction in several cell types composing the bone tissue network, including osteoblasts, osteoclasts, endothelial cells, and hematopoietic stem cells. Additionally, we discuss the synergistic effect of simultaneous stimulation by nanotube-bound growth factors and nanoscale topographic cues on enhanced bone regeneration.
Keywords: TiO2 nanotube, stem cell fate decision, nano-scale microenvironment, bone regeneration
Procedia PDF Downloads 432
1034 Subway Stray Current Effects on Gas Pipelines in the City of Tehran
Authors: Mohammad Derakhshani, Saeed Reza Allahkarama, Michael Isakhani-Zakaria, Masoud Samadian, Hojjat Sharifi Rasaey
Abstract:
In order to investigate the effects of stray current from DC traction systems (the subway) on cathodically protected gas pipelines, the subway and gas network maps of the city of Tehran were superimposed and a comprehensive map was prepared. 213 intersections and about 100,150 meters of gas pipeline sections running parallel to the railway right of way were identified and specified for field measurements. The potential measurement data were logged for one hour at each test point, and 24-hour potential monitoring was carried out at selected test points as well. Results showed that the dynamic stray current from the subway appears as fluctuations superimposed on the pipeline's static potential, which is visible in the diagrams during the night periods. These fluctuations can cause the pipeline potential to exit the safe zone and lead to corrosion or overprotection. In this study, a maximum shift of 100 mV in the pipe-to-soil potential was considered as the criterion for an effective presence of dynamic stray current. Results showed that potential fluctuations ranging from 100 mV to 3 V exist at measured points on the pipelines, which exceed the proposed criterion and need to be investigated. Corrosion rates influenced by stray currents were calculated using coupons. Results showed that a coupon connected to the pipeline at one of the locations in region 1 of the city of Tehran has a corrosion rate of 4.2 mpy (with cathodic protection and under the influence of stray currents), which is about 1.5 times the free corrosion rate of 2.6 mpy.
Keywords: stray current, DC traction, subway, buried pipelines, cathodic protection
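A minimal sketch of applying the 100 mV criterion to a logged pipe-to-soil potential series: the static (median) potential serves as the reference, and samples whose deviation exceeds 100 mV are flagged as affected by dynamic stray current. The synthetic potential trace and the use of the median as the static reference are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 3600, 10)                                  # one hour, 10 s logging interval
potential = -0.95 + 0.005 * rng.standard_normal(t.size)     # pipe-to-soil potential, volts
potential[120:180] += 0.25                                  # a stray-current excursion

static_potential = np.median(potential)                     # reference ("static") potential
shift = np.abs(potential - static_potential)
affected = shift > 0.100                                    # 100 mV criterion from the text

print(f"static potential: {static_potential:.3f} V")
print(f"samples exceeding the 100 mV criterion: {affected.sum()} of {t.size}")
```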
Procedia PDF Downloads 822
1033 Monitoring Synthesis of Biodiesel through Online Density Measurements
Authors: Arnaldo G. de Oliveira, Jr, Matthieu Tubino
Abstract:
The transesterification of triglycerides with alcohols that occurs during biodiesel synthesis causes continuous changes in several physical properties of the reaction mixture, such as refractive index, viscosity, and density. Among them, density can be a useful parameter for monitoring the reaction, in order to predict the composition of the reacting mixture and to verify the conversion of the oil into biodiesel. In this context, a system was constructed to continuously determine changes in the density of the reacting mixture containing soybean oil, methanol, and sodium methoxide (30% w/w solution in methanol), stirred at 620 rpm at room temperature (about 27 °C). A polyethylene pipe network connected to a peristaltic pump was used to collect the mixture and pump it through a coil fixed on the plate of an analytical balance. The collected mass values were used to trace a curve correlating the mass of the system with the reaction time. The density variation profile versus time clearly shows three different steps: 1) the dispersion of methanol in oil causes a decrease in the system mass, due to the lower density of the alcohol, followed by stabilization; 2) the addition of the catalyst (sodium methoxide) causes a larger decrease in mass than the first step because of the conversion of the oil into biodiesel; 3) the final stabilization, denoting the end of the reaction. This density variation profile provides information that was used to predict the composition of the mixture over time and the reaction rate. Precise knowledge of the duration of the synthesis means saving time and resources in a production-scale system. This kind of monitoring provides several interesting features, such as continuous measurements without collecting aliquots.
Keywords: biodiesel, density measurements, online continuous monitoring, synthesis
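A minimal sketch of the measurement principle: with the coil volume fixed and known, the logged balance mass converts directly to mixture density, and the end of the reaction can be flagged when the density trace stops changing. The coil volume, logged masses, and stability tolerance below are assumptions for illustration, not values from the experiment.

```python
import numpy as np

COIL_VOLUME_ML = 25.0                      # assumed internal volume of the coil on the balance

# Hypothetical logged mass of the coil contents (g) at regular intervals during the reaction.
mass_g = np.array([22.9, 22.6, 22.4, 22.3, 22.3, 21.9, 21.6, 21.4, 21.3, 21.3, 21.3, 21.3])
density = mass_g / COIL_VOLUME_ML          # g/mL

# Flag the end of the reaction: density change below tolerance for 3 consecutive readings.
tol = 0.002
stable = np.abs(np.diff(density)) < tol
for i in range(len(stable) - 2):
    if stable[i] and stable[i + 1] and stable[i + 2]:
        print(f"density stabilized at ~{density[i + 1]:.3f} g/mL (reading {i + 1})")
        break
```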
Procedia PDF Downloads 575
1032 Early Depression Detection for Young Adults with a Psychiatric and AI Interdisciplinary Multimodal Framework
Authors: Raymond Xu, Ashley Hua, Andrew Wang, Yuru Lin
Abstract:
During COVID-19, the depression rate has increased dramatically, and young adults are the most vulnerable to the mental health effects of the pandemic. Lower-income families have a higher rate of depression diagnosis than the general population but less access to clinics. This research aims to achieve early depression detection at low cost, on a large scale, and with high accuracy, using an interdisciplinary approach that incorporates clinical practices defined by the American Psychiatric Association (APA) as well as a multimodal AI framework. The proposed approach detects the nine depression symptoms with natural language processing sentiment analysis and a symptom-based lexicon uniquely designed for young adults. The experiments were conducted on multimedia survey results from adolescents and young adults and on unbiased Twitter communications. The result was further aggregated with the facial emotional cues analyzed by a convolutional neural network on the multimedia survey videos. Five experiments, each conducted on 10k data entries, reached consistent results with an average accuracy of 88.31%, higher than existing natural language analysis models. This approach can reach 300+ million daily active Twitter users and is highly accessible to low-income populations. It promotes early depression detection, raises awareness among adolescents and young adults, and reveals complementary cues to assist clinical depression diagnosis.
Keywords: artificial intelligence, COVID-19, depression detection, psychiatric disorder
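A minimal sketch of the text side of such a framework: a small symptom-based lexicon (hypothetical entries, not the study's clinically designed one) is matched against posts, and the per-symptom text evidence is then combined with a facial-emotion score from the video channel by simple weighted averaging; the weights and example posts are illustrative.

```python
import re

# Hypothetical, heavily abridged lexicon keyed by DSM-style symptom areas.
SYMPTOM_LEXICON = {
    "depressed_mood": ["hopeless", "empty", "sad all the time"],
    "anhedonia": ["nothing is fun", "lost interest", "don't enjoy"],
    "sleep": ["can't sleep", "insomnia", "sleeping all day"],
    "fatigue": ["exhausted", "no energy"],
    "worthlessness": ["worthless", "hate myself"],
}

def symptom_scores(posts):
    text = " ".join(posts).lower()
    return {s: sum(bool(re.search(re.escape(k), text)) for k in kws)
            for s, kws in SYMPTOM_LEXICON.items()}

def fuse(text_scores, facial_emotion_score, w_text=0.6, w_face=0.4):
    """Combine text evidence with a CNN-derived facial emotion score in [0, 1]."""
    text_signal = min(1.0, sum(text_scores.values()) / 9.0)   # nine symptoms overall
    return w_text * text_signal + w_face * facial_emotion_score

posts = ["I feel hopeless and exhausted lately", "can't sleep and nothing is fun anymore"]
scores = symptom_scores(posts)
print(scores, "risk index:", round(fuse(scores, facial_emotion_score=0.7), 2))
```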
Procedia PDF Downloads 131
1031 NANCY: Combining Adversarial Networks with Cycle-Consistency for Robust Multi-Modal Image Registration
Authors: Mirjana Ruppel, Rajendra Persad, Amit Bahl, Sanja Dogramadzi, Chris Melhuish, Lyndon Smith
Abstract:
Multimodal image registration is a profoundly complex task, which is why deep learning has been widely used to address it in recent years. However, two main challenges remain: firstly, the lack of ground truth data calls for an unsupervised learning approach, which leads to the second challenge of defining a feasible loss function that can compare two images of different modalities to judge their level of alignment. To avoid this issue altogether, we implement a generative adversarial network consisting of two registration networks G_AB, G_BA and two discrimination networks D_A, D_B connected by spatial transformation layers. G_AB learns to generate a deformation field which registers an image of modality B to an image of modality A. To do so, it uses the feedback of the discriminator D_B, which learns to judge the quality of alignment of the registered image B. G_BA and D_A learn a mapping from modality A to modality B. Additionally, a cycle-consistency loss is implemented. For this, both registration networks are employed twice, resulting in images Â, B̂, which were registered to B̃, Ã, which in turn were registered to the initial image pair A, B. Thus the resulting and initial images of the same modality can easily be compared. A dataset of liver CT and MRI was used to evaluate the quality of our approach and to compare it against learning- and non-learning-based registration algorithms. Our approach achieves Dice scores of up to 0.80 ± 0.01 and is therefore comparable to, and slightly more successful than, algorithms like SimpleElastix and VoxelMorph.
Keywords: cycle consistency, deformable multimodal image registration, deep learning, GAN
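A minimal numpy sketch of two quantities that anchor the approach: the L1 cycle-consistency loss comparing an image to itself after being mapped A→B→A (and B→A→B), and the Dice score used for evaluation. The identity-plus-noise "registration networks" are hypothetical stand-ins for G_AB and G_BA, which in the real system are deep networks producing deformation fields.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((64, 64))
B = rng.random((64, 64))

# Hypothetical stand-ins for the learned mappings (identity plus a little noise).
def g_ab(img_b): return img_b + 0.01 * rng.standard_normal(img_b.shape)
def g_ba(img_a): return img_a + 0.01 * rng.standard_normal(img_a.shape)

def cycle_consistency_loss(A, B):
    A_hat = g_ab(g_ba(A))          # A mapped to B-space and back
    B_hat = g_ba(g_ab(B))          # B mapped to A-space and back
    return np.mean(np.abs(A - A_hat)) + np.mean(np.abs(B - B_hat))   # L1 cycle loss

def dice(mask1, mask2):
    inter = np.logical_and(mask1, mask2).sum()
    return 2.0 * inter / (mask1.sum() + mask2.sum())

print("cycle loss:", round(cycle_consistency_loss(A, B), 4))
print("dice:", round(dice(A > 0.5, B > 0.5), 3))
```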
Procedia PDF Downloads 131
1030 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques
Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet
Abstract:
5G and 6G technology offers enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in 5G/6G architectures. In this paper, we propose the integration of free space optical communication (FSO) with fiber sensor networks for IoT applications. Free-space optical communication is gaining popularity as an effective alternative to the limited availability of radio frequency (RF) spectrum, owing to its flexibility, high achievable optical bandwidth, and low power consumption in several communication applications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to realize the full potential of 5G/6G technology, offering speeds of 100 Gbit/s or more in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to actualize this vision of intelligent processing and operation. In addition, fiber sensors are important for achieving real-time, accurate, and smart monitoring in IoT applications. We therefore propose deep learning techniques to estimate the strain changes and peak wavelengths of multiple Fiber Bragg Grating (FBG) sensors using only the FBG spectra obtained from a real experiment.
Keywords: optical sensor, artificial intelligence, Internet of Things, free-space optics
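A minimal sketch of the quantity the proposed deep-learning model regresses: the Bragg peak wavelength of an FBG reflection spectrum, recovered here with a simple centroid estimate around the spectral maximum as a classical baseline (the point of the paper is that a trained network can do this for several overlapping FBGs at once). The synthetic Gaussian-shaped spectrum and window size are illustrative.

```python
import numpy as np

wavelength_nm = np.linspace(1548.0, 1552.0, 2000)
true_peak = 1550.123
spectrum = np.exp(-((wavelength_nm - true_peak) / 0.08) ** 2)          # synthetic FBG reflection
spectrum += 0.02 * np.random.default_rng(0).standard_normal(spectrum.size)

def peak_wavelength(wl, power, window=30):
    """Centroid of the spectrum around its maximum (classical baseline estimator)."""
    i = int(np.argmax(power))
    sl = slice(max(0, i - window), i + window)
    w = np.clip(power[sl], 0, None)
    return float(np.sum(wl[sl] * w) / np.sum(w))

est = peak_wavelength(wavelength_nm, spectrum)
print(f"true peak {true_peak:.3f} nm, estimated {est:.3f} nm")
```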
Procedia PDF Downloads 63
1029 Iris Cancer Detection System Using Image Processing and Neural Classifier
Authors: Abdulkader Helwan
Abstract:
Iris cancer, so-called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve the diagnosis of the cancer by enhancing the quality of the images, so that physicians can diagnose properly, while neural networks can help in making the decision of whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma, the so-called iris cancer. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles, and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs for a neural network, which is capable of deciding whether the eye is cancerous or not through the experience acquired over many training iterations on different normal and abnormal eye images during the training phase. Normal images are obtained from a public database available on the internet, "Mile Research", while the abnormal ones are obtained from the "eyecancer" database. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
Keywords: iris cancer, intraocular melanoma, cancerous, prewitt edge detection algorithm, sclera
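A minimal sketch of the Prewitt edge-detection step used for segmenting the iris and sclera boundaries, applied here to a synthetic image; the rest of the pipeline (filtering, PCA feature reduction, and the trained neural classifier) is not reproduced, and the threshold is an illustrative choice.

```python
import numpy as np
from scipy import ndimage

# Synthetic grayscale "eye": a bright disc on a dark background.
y, x = np.mgrid[0:128, 0:128]
image = ((x - 64) ** 2 + (y - 64) ** 2 < 30 ** 2).astype(float)

# Prewitt kernels for horizontal and vertical gradients.
kx = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=float)
ky = kx.T

gx = ndimage.convolve(image, kx)
gy = ndimage.convolve(image, ky)
edges = np.hypot(gx, gy)                 # gradient magnitude
edge_mask = edges > 0.5 * edges.max()    # simple threshold to keep strong edges

print("edge pixels found:", int(edge_mask.sum()))
```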
Procedia PDF Downloads 503
1028 Global Historical Distribution Range of Brown Bear (Ursus Arctos)
Authors: Tariq Mahmood, Faiza Lehrasab, Faraz Akrim, Muhammad Sajid nadeem, Muhammad Mushtaq, Unza waqar, Ayesha Sheraz, Shaista Andleeb
Abstract:
The brown bear (Ursus arctos), a member of the family Ursidae, is distributed in a wide range of habitats in North America, Europe, and Asia. The global distribution range of brown bears is suspected to be decreasing at present due to various factors. The carnivore is categorized as 'Least Concern' globally by the IUCN Red List of Threatened Species. However, there are some fragmented, small populations that are on the verge of extinction, as in Pakistan, where the species is listed as 'Critically Endangered' with a declining population trend. Importantly, the global historical distribution range of brown bears is undocumented. Therefore, in the current study, we reconstructed and estimated the historical distribution range of the brown bear using QGIS software and also analyzed the network of protected areas in the past and current ranges of the species. Results showed that the brown bear was more widely distributed in historic times, encompassing 52.6 million km², compared to its current distribution of 38.8 million km², amounting to a total range contraction of approximately 28%. In the past, a total of N = 62,234 Protected Areas, covering approximately 3.89 million km², were present in the distribution range of the species, while now a total of N = 33,313 Protected Areas, covering approximately 2.75 million km², are present in the current distribution range of the brown bear. The brown bear distribution range within protected areas has thus contracted by 1.15 million km², a total reduction of 29%.
Keywords: brown bear, historic distribution, range contraction, protected areas
Procedia PDF Downloads 60
1027 Genetic Diversity and Variation of Nigerian Pigeon (Columba livia domestica) Populations Based on the Mitochondrial Coi Gene
Authors: Foluke E. Sola-Ojo, Ibraheem A. Abubakar, Semiu F. Bello, Isiaka H. Fatima, Sule Bisola, Adesina M. Olusegun, Adeniyi C. Adeola
Abstract:
The domesticated pigeon, Columba livia domestica, has many valuable characteristics, including high nutritional value and a fast growth rate. There is a lack of information on its genetic diversity in Nigeria; thus, the genetic variability in mitochondrial cytochrome oxidase subunit I (COI) sequences of 150 domestic pigeons from four different locations was examined. Three haplotypes (HT) were identified in the Nigerian populations; the most common haplotype, HT1, was shared with wild and domestic pigeons from Europe, America, and Asia, while HT2 and HT3 were unique to Nigeria. The overall haplotype diversity was 0.052 ± 0.025, and the nucleotide diversity was 0.026 ± 0.068 across the four investigated populations. The phylogenetic tree showed significant clustering and a genetic relationship of Nigerian domestic pigeons with other global pigeons. The median-joining network showed a star-like pattern suggesting population expansion. The AMOVA results indicated that genetic variation in Nigerian pigeons occurred mainly within populations (99.93%), while the neutrality test results suggested that the Nigerian domestic pigeon population experienced recent expansion. This study showed low genetic diversity and population differentiation among Nigerian domestic pigeons, consistent with a relatively conserved COI sequence with few polymorphic sites. Furthermore, the COI gene could serve as a candidate molecular marker to investigate the genetic diversity and origin of pigeon species. The current data are insufficient for further conclusions; therefore, more research evidence from multiple molecular markers is required.
Keywords: Nigeria pigeon, COI, genetic diversity, genetic variation, conservation
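A minimal sketch of the two diversity statistics reported above, computed on a handful of made-up aligned fragments (not real pigeon COI sequences): haplotype diversity h = n/(n-1)·(1 − Σp_i²) and nucleotide diversity π, the mean pairwise proportion of differing sites.

```python
from collections import Counter
from itertools import combinations

# Hypothetical aligned COI fragments of equal length, not real pigeon sequences.
seqs = ["ACGTACGTAC", "ACGTACGTAC", "ACGTACGTAT", "ACGTACGTAC", "ACGAACGTAT"]
n = len(seqs)

# Haplotype diversity: h = n/(n-1) * (1 - sum(p_i^2))
freqs = Counter(seqs)
h = n / (n - 1) * (1 - sum((c / n) ** 2 for c in freqs.values()))

# Nucleotide diversity: mean proportion of differing sites over all sequence pairs
def prop_diff(a, b):
    return sum(x != y for x, y in zip(a, b)) / len(a)

pi = sum(prop_diff(a, b) for a, b in combinations(seqs, 2)) / (n * (n - 1) / 2)

print(f"haplotype diversity h = {h:.3f}, nucleotide diversity pi = {pi:.4f}")
```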
Procedia PDF Downloads 195
1026 Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition
Authors: Aisultan Shoiynbek, Darkhan Kuanyshbay, Paulo Menezes, Akbayan Bekarystankyzy, Assylbek Mukhametzhanov, Temirlan Shoiynbek
Abstract:
Speech emotion recognition (SER) has received increasing research interest in recent years. It is a common practice to utilize emotional speech collected under controlled conditions recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues related to that approach: emotions are not natural, meaning that machines are learning to recognize fake emotions; emotions are very limited in quantity and poor in variety of speaking; there is some language dependency in SER; consequently, each time researchers want to start work with SER, they need to find a good emotional database in their language. This paper proposes an approach to create an automatic tool for speech emotion extraction based on facial emotion recognition and describes the sequence of actions involved in the proposed approach. One of the first objectives in the sequence of actions is the speech detection issue. The paper provides a detailed description of the speech detection model based on a fully connected deep neural network for Kazakh and Russian. Despite the high results in speech detection for Kazakh and Russian, the described process is suitable for any language. To investigate the working capacity of the developed model, an analysis of speech detection and extraction from real tasks has been performed.
Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset
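A minimal sketch of the described speech-detection stage, assuming short audio frames labeled speech/non-speech: MFCC features are extracted per frame and fed to a small fully connected network. The random audio, random labels, and layer sizes are placeholders; the real model is trained on labeled Kazakh and Russian recordings.

```python
import numpy as np
import librosa
from tensorflow import keras

sr = 16000
audio = np.random.randn(10 * sr).astype(np.float32)          # placeholder 10 s signal
mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13).T     # (frames, 13)
labels = (np.random.rand(mfcc.shape[0]) > 0.5).astype(int)   # placeholder speech/non-speech

model = keras.Sequential([
    keras.layers.Input(shape=(13,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),              # P(frame contains speech)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(mfcc, labels, epochs=3, batch_size=32, verbose=0)

# A frame is kept for the downstream SER model when P(speech) exceeds 0.5.
speech_frames = model.predict(mfcc, verbose=0).ravel() > 0.5
print("detected speech frames:", int(speech_frames.sum()), "of", len(speech_frames))
```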
Procedia PDF Downloads 26
1025 Graph-Oriented Summary for Optimized Resource Description Framework Graphs Streams Processing
Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais
Abstract:
Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before the entire session finishes, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of the RDF data streams, as the most relevant and pertinent information in the data is often not used in due time and is almost impossible to exploit for further analyses. It would be better to keep the most informative part of the data within the streams while minimizing the memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs through three main approaches: (1) an approach for user queries (SPARQL) that extracts their needs and groups them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and the expected approximate query results on summarized graphs compared to the source ones.
Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query
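A minimal sketch of the centrality-based part of the summarization: compute closeness centrality on the RDF graph (treated here as an undirected networkx graph of subject/object nodes) and retain only the triples incident to the most central nodes. The toy triples are made up, and the combination with the needs extracted from grouped SPARQL queries, which the paper proposes, is not shown.

```python
import networkx as nx

# Toy RDF triples (subject, predicate, object).
triples = [
    ("sensor1", "locatedIn", "zoneA"), ("sensor2", "locatedIn", "zoneA"),
    ("zoneA", "partOf", "network1"), ("sensor1", "measures", "pressure"),
    ("sensor2", "measures", "flow"), ("network1", "managedBy", "operatorX"),
]

G = nx.Graph()
for s, p, o in triples:
    G.add_edge(s, o, predicate=p)

centrality = nx.closeness_centrality(G)
k = 3
kept_nodes = set(sorted(centrality, key=centrality.get, reverse=True)[:k])

summary = [(s, p, o) for s, p, o in triples if s in kept_nodes or o in kept_nodes]
print("kept nodes:", kept_nodes)
print("summarized triples:", summary)
```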
Procedia PDF Downloads 203
1024 Applying Concurrent Development Process for the Web Using Aspect-Oriented Approach
Authors: Hiroaki Fukuda
Abstract:
This paper presents a concurrent development process for modern web applications, called Rich Internet Applications (RIAs), and describes its effect using a non-trivial application development. In recent years, RIAs built with technologies such as Ajax and Flex have become popular, based mainly on high-speed networks. RIAs provide sophisticated interfaces and user experiences; therefore, the development of an RIA requires two kinds of engineer: a developer who implements the business logic, and a designer who designs the interface and experiences. Although collaborative work is becoming important for the development of RIAs, shared resources such as source code make it difficult. For example, if the design of the interface is modified after developers have finished implementing the business logic, they need to repeat the same implementations, as well as the tests that verify the application's behavior. MVC architecture and Object-Oriented Programming (OOP) enable dividing an application into modules such as interfaces and logic; however, developers and/or designers still have to write pieces of code (e.g., event handlers) that make these modules work together as an application. On the other hand, Aspect-Oriented Programming (AOP) is expected to address the complexity of application software development today. AOP provides methods to separate crosscutting concerns, which are scattered pieces of code, from primary concerns. In this paper, we provide a concurrent development process for RIAs by introducing the AOP concept. This process makes it possible to reduce the resources shared between developers and designers, so that they can perform their tasks concurrently. In addition, we describe our experience developing a practical application using the proposed development process to show its availability.
Keywords: aspect-oriented programming, concurrent, development process, rich internet application
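A minimal sketch of the separation AOP provides, using a Python decorator as a stand-in for an aspect: the logging/timing crosscutting concern is written once and woven around primary business-logic functions, so either side can be changed without touching the other. This illustrates the concept only; the paper's process targets RIA platforms such as Flex, and the order-total function is a hypothetical example.

```python
import functools
import time

def logged(func):
    """Aspect: a crosscutting logging/timing concern woven around primary concerns."""
    @functools.wraps(func)
    def advice(*args, **kwargs):            # "around" advice at the function join point
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {(time.perf_counter() - start) * 1e3:.2f} ms")
        return result
    return advice

# Primary concern: business logic, free of any logging code.
@logged
def compute_order_total(prices, tax_rate=0.08):
    return round(sum(prices) * (1 + tax_rate), 2)

print(compute_order_total([19.99, 5.50, 3.25]))
```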
Procedia PDF Downloads 301
1023 Expression Profiling and Immunohistochemical Analysis of Squamous Cell Carcinoma of Head and Neck (Tumor, Transition Zone, Normal) by Whole Genome Scale Sequencing
Authors: Veronika Zivicova, Petr Broz, Zdenek Fik, Alzbeta Mifkova, Jan Plzak, Zdenek Cada, Herbert Kaltner, Jana Fialova Kucerova, Hans-Joachim Gabius, Karel Smetana Jr.
Abstract:
The possibility of determining genome-wide expression profiles of cells and tissues opens a new level of analysis in the quest to define dysregulation in malignancy and thus identify new tumor markers. Toward this long-term aim, we here address two issues at this level for head and neck cancer specimens: i) defining profiles in different regions, i.e., the tumor, the transition zone and the normal control, and ii) comparing complete data sets for seven individual patients. Special focus in the flanking immunohistochemical part is given to adhesion/growth-regulatory galectins, which upregulate chemo- and cytokine expression in an NF-κB-dependent manner, to these regulators, and to markers of differentiation, i.e., keratins. The detailed listing of up- and down-regulations, also available in printed form (1), not only served to unveil new candidates for testing as markers but also made the impact of the tumor on the transition zone apparent. The extent of interindividual variation raises a strong cautionary note against assuming uniformity of regulatory events, to be noted when considering therapeutic implications. Thus, a combination of test targets, together with a network analysis for galectins and their downstream effectors, is advised prior to reaching conclusions on further perspectives.
Keywords: galectins, genome scale sequencing, squamous cell carcinoma, transition zone
Procedia PDF Downloads 239
1022 Digital Adoption of Sales Support Tools for Farmers: A Technology Organization Environment Framework Analysis
Authors: Sylvie Michel, François Cocula
Abstract:
Digital agriculture is an approach that exploits information and communication technologies. These encompass data acquisition tools like mobile applications, satellites, sensors, connected devices, and smartphones. Additionally, it involves transfer and storage technologies such as 3G/4G coverage, low-bandwidth terrestrial or satellite networks, and cloud-based systems. Furthermore, embedded or remote processing technologies, including drones and robots for process automation, along with high-speed communication networks accessible through supercomputers, are integral components of this approach. While farm-level adoption studies regarding digital agricultural technologies have emerged in recent years, they remain relatively limited in comparison to other agricultural practices. To bridge this gap, this study delves into understanding farmers' intention to adopt digital tools, employing the technology-organization-environment (TOE) framework. A qualitative research design encompassed fifteen semi-structured interviews conducted with key stakeholders both prior to and following the 2020-2021 COVID-19 lockdowns in France. Subsequently, the interview transcripts underwent thorough thematic content analysis, and the data and verbatim quotes were triangulated for validation. A coding process aimed to systematically organize the data, ensuring an orderly and structured classification. Our research extends its contribution by delineating sub-dimensions within each primary dimension. A total of nine sub-dimensions were identified, categorized as follows: perceived usefulness for communication, perceived usefulness for productivity, and perceived ease of use constitute the first dimension; technological resources, financial resources, and human capabilities constitute the second dimension; while market pressure, institutional pressure, and the COVID-19 situation constitute the third dimension. Furthermore, this analysis enriches the TOE framework by incorporating entrepreneurial orientation as a moderating variable. Managerial orientation emerges as a pivotal factor influencing adoption intention, with producers acknowledging the significance of utilizing digital sales support tools to combat "greenwashing" and elevate their overall brand image. Specifically, the analysis illustrates that producers recognize the potential of digital tools for saving time and streamlining sales processes, leading to heightened productivity. Moreover, it highlights that the intent to adopt digital sales support tools is influenced by a market mimicry effect. Additionally, it demonstrates a negative association between the intent to adopt these tools and the pressure exerted by institutional partners. Finally, this research establishes a positive link between the intent to adopt digital sales support tools and economic fluctuations, notably during the COVID-19 pandemic. The adoption of sales support tools in agriculture is a multifaceted challenge encompassing three dimensions and nine sub-dimensions. The research delves into the adoption of digital farming technologies at the farm level through the TOE framework. This analysis provides significant insights beneficial for policymakers, stakeholders, and farmers. These insights are instrumental in making informed decisions to facilitate a successful digital transition in agriculture, effectively addressing sector-specific challenges.
Keywords: adoption, digital agriculture, e-commerce, TOE framework
Procedia PDF Downloads 60
1021 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis has been utilized for the numerical simulation. Polynomial and neural-network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been done based on objective functions and optimization techniques. The objective functions are categorized by considering measured data with and without uncertainty in the instruments and are defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro Quebec provided data sets for the measured values of the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. The Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) are compared for the minimization problem; although all these techniques take time to converge to an optimum value, PSO provided the best convergence and soil parameters. Overall, parameter identification analysis can be effectively used for rockfill dam applications and has the potential to become a valuable tool for geotechnical engineers in assessing dam performance and dam safety.
Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS
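A minimal particle swarm optimization sketch for this kind of inverse problem, where the objective is the least-squares misfit between measured displacements and those predicted by the model; here a cheap analytic stand-in replaces the Plaxis finite element run, and the two parameters, bounds, and swarm settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the FE model: displacements as a function of two soil parameters.
def predicted_displacements(params, loads=np.array([1.0, 2.0, 3.0])):
    E, phi = params                        # "stiffness"- and "friction"-style parameters
    return loads / E + 0.01 * phi * loads ** 2

true_params = np.array([50.0, 30.0])
measured = predicted_displacements(true_params)          # pretend field measurements

def objective(params):                                   # least-squares misfit
    return np.sum((predicted_displacements(params) - measured) ** 2)

# Particle swarm optimization
n_particles, n_iter = 30, 200
lo, hi = np.array([10.0, 10.0]), np.array([100.0, 45.0])
x = rng.uniform(lo, hi, (n_particles, 2))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[np.argmin(pbest_f)]

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([objective(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print("identified parameters:", np.round(gbest, 2), "misfit:", objective(gbest))
```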
Procedia PDF Downloads 146
1020 A Research on the Coordinated Development of Chengdu-Chongqing Economic Circle under the Background of New Urbanization
Authors: Deng Tingting
Abstract:
The coordinated and integrated development of regions is an inevitable requirement for China to move towards high-quality, sustainable development. As one of the regions with the best economic foundation and the strongest economic strength in western China, the Chengdu-Chongqing economic circle is an area of national importance with strong network connection characteristics, linking the inland hinterland and connecting the western and national urban networks. The integrated development of the Chengdu-Chongqing economic circle is therefore of great strategic significance for the rapid and high-quality development of the western region. In the context of new urbanization, this paper takes 16 urban units within the economic circle as the research object and, based on 5-year panel data on population, regional economy, and spatial construction and development from 2016 to 2020, uses the entropy method and the Theil index to analyze the three target layers and their underlying causes. The research shows that there are temporal and spatial differences within the Chengdu-Chongqing economic circle, with significant differences between the core cities and the surrounding cities. Therefore, reforming and innovating the regional coordinated development mechanism, breaking administrative barriers, and strengthening the "polar nucleus" radiation function to release the driving force for economic development, especially in the gully areas of the economic development belts, will not only promote the coordinated development of the internal regions but also promote the coordinated and sustainable development of the western region along a high-quality development path.
Keywords: Chengdu-Chongqing economic circle, new urbanization, coordinated regional development, Theil Index
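A minimal sketch of the Theil index used to quantify regional disparity, computed here on hypothetical per-capita GDP figures for a handful of cities; the split into core and surrounding groups, and the corresponding between-group/within-group decomposition, mirrors the kind of analysis described above but uses made-up numbers.

```python
import numpy as np

def theil_index(values):
    """Theil T index: mean of (x/mu) * ln(x/mu); zero means perfect equality."""
    x = np.asarray(values, dtype=float)
    mu = x.mean()
    return float(np.mean((x / mu) * np.log(x / mu)))

# Hypothetical per-capita GDP (10^4 yuan) for core and surrounding cities.
core = np.array([11.0, 9.5])
surrounding = np.array([5.2, 4.8, 6.1, 4.5, 5.5])
all_cities = np.concatenate([core, surrounding])

T_total = theil_index(all_cities)

# Between-group component of the decomposition T = T_between + T_within
mu = all_cities.mean()
T_between = sum(len(g) / len(all_cities) * (g.mean() / mu) * np.log(g.mean() / mu)
                for g in (core, surrounding))
T_within = T_total - T_between
print(f"Theil total={T_total:.4f} between={T_between:.4f} within={T_within:.4f}")
```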
Procedia PDF Downloads 117