Search results for: computer usage
2967 Introducing Design Principles for Clinical Decision Support Systems
Authors: Luca Martignoni
Abstract:
The increasing usage of clinical decision support systems in healthcare and the demand for software that enables doctors to make informed decisions are changing everyday clinical practice. However, as technology advances, not only are the benefits of technology growing, but so are the potential risks. A growing danger is doctors' over-reliance on the decision proposed by the clinical decision support system, leading to deskilling and rash decisions. In that regard, identifying doctors' requirements for software and developing approaches to prevent technological over-reliance are of utmost importance. In this paper, we report the results of a design science research study focusing on the requirements and design principles of ultrasound software. We conducted a total of 15 interviews with experts about potential ultrasound software functions. Subsequently, we developed meta-requirements and design principles to design future clinical decision support systems efficiently and as free from the occurrence of technological over-reliance as possible.
Keywords: clinical decision support systems, technological over-reliance, design principles, design science research
Procedia PDF Downloads 99

2966 A Resource Optimization Strategy for CPU (Central Processing Unit) Intensive Applications
Authors: Junjie Peng, Jinbao Chen, Shuai Kong, Danxu Liu
Abstract:
Under traditional resource allocation strategies, the usage of resources on physical servers in a cloud data center is highly uncertain: assigning too few tasks wastes resources, while assigning too many causes overload. This is especially obvious when the applications are of the same type, because of their shared resource preferences. Since CPU intensive applications are among the most common types of application in the cloud, we studied the optimization strategy for CPU intensive applications on the same server. We used resource preferences to analyze the case where multiple CPU intensive applications run simultaneously, and put forward a model that can predict the execution time of CPU intensive applications running simultaneously. Based on the prediction model, we proposed a method to select the appropriate number of applications for a machine. Experiments show that the model predicts the execution time accurately for CPU intensive applications. To improve the execution efficiency of applications, we also propose a priority-based scheduling model for CPU intensive applications. Extensive experiments verify the validity of the scheduling model.
Keywords: cloud computing, CPU intensive applications, resource optimization, strategy
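
A minimal sketch of the predict-then-select step the abstract describes, assuming a quadratic contention curve fitted to measured co-location timings; the measurements, model form, and deadline below are illustrative, not the paper's:

```python
import numpy as np

# Hypothetical measurements: execution time (s) of one CPU-intensive
# application while k instances run concurrently on the same server.
k = np.array([1, 2, 3, 4, 5, 6, 7, 8])
t = np.array([10.1, 10.3, 10.9, 12.0, 14.8, 18.9, 23.5, 28.2])

# Fit a simple quadratic contention model t(n) = a*n^2 + b*n + c.
a, b, c = np.polyfit(k, t, deg=2)

def predicted_time(n_apps: int) -> float:
    """Predicted per-application execution time with n_apps co-located."""
    return a * n_apps**2 + b * n_apps + c

def max_apps_within(deadline: float, limit: int = 16) -> int:
    """Largest number of co-located applications whose predicted
    execution time still meets the deadline."""
    best = 0
    for n in range(1, limit + 1):
        if predicted_time(n) <= deadline:
            best = n
    return best

print(max_apps_within(deadline=15.0))
```

Fitting against timings measured on one server keeps the model specific to that machine and application type, which matches the paper's same-server, same-type setting.
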
Procedia PDF Downloads 276

2965 The Usage of Nitrogen Gas and Alum for Sludge Dewatering
Authors: Mamdouh Yousef Saleh, Medhat Hosny El-Zahar, Shymaa El-Dosoky
Abstract:
In most cases, the processing cost of dewatering sludge increases with the solid-particle concentration. All experiments in this study were conducted on biological sludge. Besides helping to reduce greenhouse gases, the technology used was faster and cheaper than other methods. First, bubbling pressure was used to dissolve N₂ gas into the sludge; second, alum was added to accelerate the coagulation of the sludge particles and facilitate their flotation; and third, nitrogen gas was used to help float the sludge particles and reduce the processing time, nitrogen being an inert gas. The conclusions of this experiment were as follows. First, the best conditions were obtained at a bubbling pressure of 0.6 bar. Second, the best alum dose for helping the sludge agglomerate and float was determined to be 80 mg/L, which increased the sludge concentration by 7-8 times. Third, the economic dose of nitrogen gas was 60 mg/L, with a separation efficiency of 85% and a sludge concentration increase of about 8-9 times. This happened because the gas released tiny bubbles that adhered to the suspended matter, causing it to float to the surface of the water, where it could then be removed.
Keywords: nitrogen gas, biological treatment, alum, dewatering sludge, greenhouse gases
Procedia PDF Downloads 216

2964 The Diverse Impact of Internet Addiction on College Students: An Analysis of Behavioral and Academic Consequences
Authors: Mozadded Hossen
Abstract:
This study investigates the varied effects of internet addiction on college students, specifically examining behavioral and academic outcomes. The widespread use of the Internet in academic settings has substantially impacted students' mental well-being and academic achievements. The study investigates the correlation between excessive internet usage and addiction, which manifests through symptoms including social isolation, anxiety, despair, and sleep disruptions. Additionally, the study examines the relationship between internet addiction and academic results, finding that students with more severe addiction levels generally have lower academic performance, diminished focus, and reduced involvement in academic tasks. The study analyzes the many consequences of internet addiction to gain insights into its ramifications. It also urges educational institutions to develop techniques that can reduce the negative impact of internet addiction and encourage healthier internet use among students. The results emphasize the necessity of comprehensive measures to tackle the behavioral and academic difficulties caused by internet addiction among college students.
Keywords: internet addiction, behavioral consequences, college students, social isolation
Procedia PDF Downloads 28

2963 Design of a Solar Water Heating System with Thermal Storage for a Three-Bedroom House in Newfoundland
Authors: Ahmed Aisa, Tariq Iqbal
Abstract:
This paper presents a ready-to-use design for a solar water heating system: in Canada, the average consumption of hot water is approximately 50 to 75 L per person per day, and the average Canadian household uses 225 L. The paper demonstrates the method of designing a solar water heating system with thermal storage. It highlights a renewable hybrid power system that yields a reliable, independent system with optimized component sizes at an improved capital cost. The system can provide hot water for a large building, with the main power coming from solar panels. The System Advisor Model (SAM) and HOMER are used; both are design tools that calculate the hot water consumption and cost for a house. Results obtained through simulation include monthly energy production, annual energy production, after-tax cash flow, the lifetime of the system, and monthly energy usage represented by three types of energy: system energy, electricity load, and net metering credit.
Keywords: water heating, thermal storage, capital cost, solar, consumption
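
As a worked example of the demand such a design must meet, the sketch below estimates the daily thermal energy behind the 225 L/day figure quoted above; the inlet and delivery temperatures are illustrative assumptions, not values from the paper:

```python
# Back-of-the-envelope daily hot-water energy demand for the 225 L/day
# Canadian household cited above. Inlet (10 C) and delivery (55 C)
# temperatures are assumed for illustration.
volume_l = 225.0          # daily hot-water draw (L)
rho = 1.0                 # kg per L of water
c_p = 4186.0              # J/(kg*K), specific heat of water
delta_t = 55.0 - 10.0     # temperature rise (K)

q_joules = volume_l * rho * c_p * delta_t
q_kwh = q_joules / 3.6e6  # convert J to kWh
print(f"Daily demand: {q_kwh:.1f} kWh")  # ~11.8 kWh/day
```
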
Procedia PDF Downloads 427

2962 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics
Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood
Abstract:
We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2), to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
Keywords: digital forensics, cloud computing, cyber security, spark, Kubernetes, Kafka
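
A minimal PySpark sketch (not DFORC2's actual code) of the kind of parallelism described: per-file processing, here SHA-256 hashing, distributed across Spark workers. The input listing and shared-filesystem access by the workers are assumptions:

```python
import hashlib
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("evidence-hashing").getOrCreate()

def sha256_of(path: str) -> tuple:
    """Hash one file; runs on whichever worker receives the task."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return path, h.hexdigest()

# file_list.txt holds one evidence-file path per line (assumed input,
# readable from every worker via a shared filesystem).
paths = spark.sparkContext.textFile("file_list.txt")
digests = paths.map(sha256_of).collect()
for path, digest in digests:
    print(path, digest)
```
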
Procedia PDF Downloads 391

2961 Logistics Information Systems in the Distribution of Flour in Nigeria
Authors: Cornelius Femi Popoola
Abstract:
This study investigated logistics information systems in the distribution of flour in Nigeria. A case study design was used, and 50 staff of Honeywell Flour Mill were sampled for the study. Data generated through a questionnaire were analysed using correlation and regression analysis. The findings revealed that logistics information systems such as e-commerce, interactive telephone systems, and electronic data interchange positively correlated with the distribution of flour in Honeywell Flour Mill. The findings also showed that e-commerce, interactive telephone systems, and electronic data interchange jointly and positively contribute to the distribution of flour in Honeywell Flour Mill in Nigeria (R = .935; Adj. R² = .642; F (3,47) = 14.739; p < .05). The study therefore recommended that Honeywell Flour Mill upgrade their logistics information systems to computer-to-computer communication of business transactions and documents, as well as adopt new technologies such as tracking-and-tracing systems (barcode scanning for packages and pallets), tracking vehicles with the Global Positioning System (GPS), measuring vehicle performance with 'black boxes' (containing logistics data), and Automatic Equipment Identification (AEI).
Keywords: e-commerce, electronic data interchange, flour distribution, information system, interactive telephone systems
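
A minimal sketch, with synthetic data, of the multiple regression reported above: distribution regressed jointly on the three logistics information system predictors. Variable names and data are illustrative, not the study's:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50  # matches the 50 sampled staff
ecommerce = rng.normal(size=n)
telephone = rng.normal(size=n)
edi = rng.normal(size=n)
# Fabricated outcome with a joint positive contribution from all three.
distribution = (0.5 * ecommerce + 0.4 * telephone + 0.3 * edi
                + rng.normal(scale=0.5, size=n))

X = sm.add_constant(np.column_stack([ecommerce, telephone, edi]))
model = sm.OLS(distribution, X).fit()
# The summary reports R, adjusted R-squared, and the joint F-test
# analogous to the paper's F(3,47) statistic.
print(model.summary())
```
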
Procedia PDF Downloads 551

2960 Identification of Failures Occurring on a System on Chip Exposed to a Neutron Beam for Safety Applications
Authors: S. Thomet, S. De-Paoli, F. Ghaffari, J. M. Daveau, P. Roche, O. Romain
Abstract:
In this paper, we present a hardware module dedicated to understanding the fail reason of a System on Chip (SoC) exposed to a particle beam. The impact of Single-Event Effects (SEE) on processor-based SoCs is a concern that has increased in the past decade, particularly for terrestrial applications with increasing automotive safety requirements, as well as in the consumer and industrial domains. The SEE created by the impact of a particle on an SoC may have consequences that end in instability or crashes. Specific hardening techniques for hardware and software have been developed to make such systems more reliable. The SoC is then qualified using cosmic ray Accelerated Soft-Error Rate (ASER) testing to ensure the Soft-Error Rate (SER) remains within mission profiles. Understanding where errors occur is another challenge because of the complexity of the operations performed in an SoC. Common techniques to monitor an SoC running under a beam are based on non-intrusive debug, consisting of recording the program counter and doing some consistency checking on the fly. To detect and understand SEE, we have developed a module embedded within the SoC that provides support for recording probes, hardware watchpoints, and a memory-mapped register bank dedicated to software usage. To identify CPU failure modes and the most important resources to probe, we carried out a fault injection campaign on the RTL model of the SoC. Probes are placed on generic CPU registers and bus accesses. They highlight the propagation of errors and allow identification of the failure modes. Typical resulting errors are bit-flips in resources, creating bad addresses, illegal instructions, longer-than-expected loops, or incorrect bus accesses. Although our module is processor agnostic, it has been interfaced to a RISC-V by probing some of the processor registers. Probes are recorded in a ring buffer. Associated hardware watchpoints allow some control, such as starting or stopping event recording or halting the processor. Finally, the module also provides a bank of registers where the firmware running on the SoC can log information; typical usage is operating system context switch recording. The module is connected to a dedicated debug bus and is interfaced to a remote controller via a debugger link. Thus, a remote controller can interact with the monitoring module without any intrusiveness on the SoC. Moreover, in case of CPU unresponsiveness or system-bus stall, the recorded information can still be recovered, providing the fail reason. A preliminary version of the module has been integrated into a test chip currently being manufactured at ST in 28-nm FDSOI technology. The module has been triplicated to provide reliable information on the SoC behavior. As the primary application domain is automotive and safety, the efficiency of the module will be evaluated by exposing the test chip to a fast-neutron beam by the end of the year. In the meantime, it will be tested with alpha particles and electromagnetic fault injection (EMFI). We will report in the paper on fault-injection results as well as irradiation results.
Keywords: fault injection, SoC fail reason, SoC soft error rate, terrestrial application
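
A minimal software sketch of the probe ring buffer described above: fixed capacity, overwriting the oldest record, so the most recent history survives even if the CPU later hangs. Names and sizes are illustrative, not the module's:

```python
from collections import deque

class ProbeRingBuffer:
    def __init__(self, capacity: int = 256):
        self._buf = deque(maxlen=capacity)  # oldest entries drop out

    def record(self, cycle: int, probe: str, value: int) -> None:
        self._buf.append((cycle, probe, value))

    def dump(self):
        """What a remote controller would recover after a failure."""
        return list(self._buf)

trace = ProbeRingBuffer(capacity=4)
for cycle in range(6):
    trace.record(cycle, "pc", 0x8000_0000 + 4 * cycle)
print(trace.dump())  # only the last 4 records remain
```
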
Procedia PDF Downloads 229

2959 Task Based Language Learning: A Paradigm Shift in ESL/EFL Teaching and Learning: A Case Study Based Approach
Authors: Zehra Sultan
Abstract:
The study is based on the task-based language teaching (TBLT) approach, which has been found to be very effective in the EFL/ESL classroom. This approach engages learners in acquiring authentic language skills by interacting with the real world through a sequence of pedagogical tasks, and the use of technology enhances its effectiveness. This study throws light on the historical background of TBLT and its efficacy in the EFL/ESL classroom. In addition, it describes the implementation of this approach in the General Foundation Programme of Muscat College, Oman, and furnishes the list of pedagogical tasks embedded in the language curriculum of the General Foundation Programme (GFP), which are skillfully aligned with the College Graduate Attributes. Moreover, the study discusses the challenges pertaining to this approach from the point of view of teachers, students, and its classroom application. Additionally, the operational success of this methodology is gauged through the formative assessments of the GFP, which is apparent in the students' progress.
Keywords: task-based language teaching, authentic language, communicative approach, real world activities, ESL/EFL activities
Procedia PDF Downloads 122

2958 Multiplayer Game System for Therapeutic Exercise in Which Players with Different Athletic Abilities Can Participate on an Even Competitive Footing
Authors: Kazumoto Tanaka, Takayuki Fujino
Abstract:
Sports games conducted as a group are a form of therapeutic exercise for aged people with decreased strength and for people suffering permanent damage from stroke and other conditions. However, it is difficult for patients with different athletic abilities to play a game on an equal footing. This study examines a computer video game designed for therapeutic exercise, together with a game system that gives support depending on athletic ability, so that anyone playing the game can participate equally. The video game is a popular variant of balloon volleyball, in which players hit a balloon by hand before it falls to the floor. In this game system, each player plays while watching a monitor on which the system displays tailor-made video-game images adjusted to the person's athletic ability, providing players with player-adaptive assist support. We have developed a multiplayer game system with an image generation technique for the tailor-made video game and conducted tests to evaluate it.
Keywords: therapeutic exercise, computer video game, disability-adaptive assist, tailor-made video-game image
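
A minimal sketch of the player-adaptive assist idea: each player's tailor-made view scales the balloon's apparent motion by an ability factor so weaker players get more reaction time. The scaling rule and numbers are illustrative assumptions, not the authors' technique:

```python
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    ability: float  # 1.0 = reference ability; lower = needs more assist

def rendered_velocity(true_velocity: float, player: Player) -> float:
    """Balloon velocity shown on this player's tailor-made view."""
    return true_velocity * player.ability

# The shared balloon moves at one true speed; each view slows it
# proportionally for the less able player.
players = [Player("A", 1.0), Player("B", 0.6)]
for p in players:
    print(p.name, rendered_velocity(2.0, p))
```
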
Procedia PDF Downloads 560

2957 Multi-Spectral Deep Learning Models for Forest Fire Detection
Authors: Smitha Haridasan, Zelalem Demissie, Atri Dutta, Ajita Rattani
Abstract:
Aided by the wind, all it takes is one ember and a few minutes to create a wildfire. Wildfires are growing in frequency and size due to climate change, and wildfires and their consequences are among the major environmental concerns. Every year, millions of hectares of forest are destroyed around the world, causing mass destruction and human casualties. Early detection of wildfire is thus a critical component in mitigating this threat. Many computer vision-based techniques have been proposed for the early detection of forest fire using video surveillance, predicting and detecting fires in various spectrums, namely RGB, HSV, and YCbCr. The aim of this paper is to propose a multi-spectral deep learning model that combines information from different spectrums at intermediate layers for accurate fire detection. A heterogeneous dataset assembled from publicly available datasets is used for model training and evaluation in this study. The experimental results show that multi-spectral deep learning models obtain an improvement of about 4.68% over those based on a single spectrum for fire detection.
Keywords: deep learning, forest fire detection, multi-spectral learning, natural hazard detection
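
A minimal PyTorch sketch (an assumption, not the authors' exact architecture) of intermediate-layer fusion: separate convolutional branches for the RGB and HSV renderings of a frame, concatenated mid-network before a shared fire/no-fire classifier head:

```python
import torch
import torch.nn as nn

class MultiSpectralFireNet(nn.Module):
    def __init__(self):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
        self.rgb_branch = branch()
        self.hsv_branch = branch()
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, 2),  # 32 + 32 fused channels -> fire / no fire
        )

    def forward(self, rgb: torch.Tensor, hsv: torch.Tensor) -> torch.Tensor:
        # Fusion happens here, at an intermediate layer, not at the input.
        fused = torch.cat([self.rgb_branch(rgb), self.hsv_branch(hsv)], dim=1)
        return self.head(fused)

model = MultiSpectralFireNet()
logits = model(torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64))
print(logits.shape)  # torch.Size([1, 2])
```
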
Procedia PDF Downloads 238

2956 Sentiment Analysis: An Enhancement of Ontological-Based Features Extraction Techniques and Word Equations
Authors: Mohd Ridzwan Yaakub, Muhammad Iqbal Abu Latiffi
Abstract:
Online business has become popular recently due to the massive amount of information and media available on the Internet. This has resulted in a huge number of reviews in which consumers share their opinions, criticisms, and satisfaction regarding the products they have purchased on websites or on social media such as Facebook and Twitter. Analyzing customer behavior has therefore become very important for organizations seeking new market trends and insights. The reviews from websites and social media comprise structured and unstructured data that require a sentiment analysis approach. In this article, the techniques used in sentiment analysis are defined, and the ontology and its possible usage in sentiment analysis are described. This leads to empirical research on mobile phone reviews and the ontology used in the experiment. The researchers also explore the role of data preprocessing and feature selection methodology. As a result, an ontology-based approach to sentiment analysis can help achieve high accuracy in the classification task.
Keywords: feature selection, ontology, opinion, preprocessing data, sentiment analysis
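
A minimal scikit-learn sketch of the pipeline the abstract outlines: preprocessing review text, selecting informative features, and classifying sentiment. The corpus, feature counts, and classifier are illustrative, not the authors':

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

reviews = ["battery life is excellent", "screen cracked after a week",
           "camera quality is superb", "terrible customer service"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english")),
    ("select", SelectKBest(chi2, k=5)),  # feature selection step
    ("clf", LinearSVC()),
])
pipeline.fit(reviews, labels)
print(pipeline.predict(["the camera is excellent"]))
```

In the paper's enhancement, the vectorization stage would additionally draw on ontology concepts rather than raw terms alone.
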
Procedia PDF Downloads 199

2955 Simulation with Uncertainties of Active Controlled Vibration Isolation System for Astronaut's Exercise Platform
Authors: Shield B. Lin, Ziraguen O. Williams
Abstract:
In a task to assist NASA in analyzing the dynamic forces caused by operational countermeasures of an astronaut's exercise platform impacting the spacecraft, an active proportional-integral-derivative (PID) controller commanding a linear actuator is proposed in a vibration isolation system to regulate the movement of the exercise platform. Computer simulation shows promising results: most exciter forces can be reduced or even eliminated. This paper emphasizes parameter uncertainties, parameter variations, and exciter force variations. Drift and variations of system parameters in the vibration isolation system for the astronaut's exercise platform are analyzed. An active control scheme is applied with the goals of reducing the platform displacement and minimizing the force transmitted to the spacecraft structure. The controller must be robust enough to accommodate wide variations of system parameters and exciter forces. Computer simulation for the vibration isolation system was performed via MATLAB/Simulink and Trick. The simulation results demonstrate force reduction with small platform displacement under wide ranges of variation in system parameters.
Keywords: control, counterweight, isolation, vibration
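
A minimal sketch (with assumed parameters, not NASA's model) of a PID controller regulating platform displacement in a one-degree-of-freedom mass-spring-damper under a periodic exciter force:

```python
import numpy as np

m, c, k = 100.0, 50.0, 2000.0      # mass (kg), damping, stiffness
kp, ki, kd = 5e4, 1e4, 5e3         # PID gains (illustrative)
dt, steps = 1e-3, 10_000

x = v = integral = prev_err = 0.0
for i in range(steps):
    t = i * dt
    exciter = 200.0 * np.sin(2 * np.pi * 2.0 * t)  # 2 Hz exercise load
    err = 0.0 - x                                  # regulate to zero
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * derivative # actuator force
    prev_err = err
    a = (exciter + u - c * v - k * x) / m
    v += a * dt                                    # semi-implicit Euler
    x += v * dt

print(f"final displacement: {x:.2e} m")
```

Sweeping m, c, and k over ranges, as the paper does for parameter drift, is a matter of wrapping this loop and checking that displacement stays small for every combination.
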
Procedia PDF Downloads 144

2954 N-Type GaN Thinning for Enhancing Light Extraction Efficiency in GaN-Based Thin-Film Flip-Chip Ultraviolet (UV) Light Emitting Diodes (LED)
Authors: Anil Kawan, Soon Jae Yu, Jong Min Park
Abstract:
GaN-based 365 nm wavelength ultraviolet (UV) light emitting diodes (LED) have various applications: curing, molding, purification, deodorization, disinfection, etc. However, their usage is limited by very low output power because of light absorption in the GaN layers. In this study, we demonstrate a method that removes the buffer GaN layer, which absorbs at 365 nm, and thins the n-type GaN so as to improve the light extraction efficiency of the GaN-based 365 nm UV LED. UV flip-chip LEDs of chip size 1.3 mm x 1.3 mm were fabricated using GaN epilayers on a sapphire substrate. Via-hole n-type contacts and highly reflective Ag metal were used for efficient light extraction. The LED wafer was aligned and bonded to an AlN carrier wafer. To improve the extraction efficiency of the flip-chip LED, the sapphire substrate and the absorbing buffer GaN were removed by laser lift-off and dry etching, respectively. To further increase the extraction efficiency of the LED, the exposed n-type GaN thickness was reduced by inductively coupled plasma etching.
Keywords: extraction efficiency, light emitting diodes, n-GaN thinning, ultraviolet
Procedia PDF Downloads 424

2953 Local Boundary Analysis for Generative Theory of Tonal Music: From the Aspect of Classic Music Melody Analysis
Authors: Po-Chun Wang, Yan-Ru Lai, Sophia I. C. Lin, Alvin W. Y. Su
Abstract:
The Generative Theory of Tonal Music (GTTM) provides systematic approaches to recognizing local boundaries of music, and its rules have been implemented in some automated melody segmentation algorithms. There are also deep learning methods that apply GTTM features to boundary detection tasks. However, such studies face constraints such as a lack of, or inconsistent, label data. The GTTM database is currently the most widely used one; it includes manually labeled GTTM rules and local boundaries. Even so, we found some problems with these labels: they sometimes show discrepancies with the GTTM rules, and since they were labeled at different times by multiple musicians, they are not always within the same scope. Therefore, in this paper, we examine this database with musicians from the perspective of classical music and relabel the scores. The relabeled database - GTTM Database v2.0 - will be released for academic research usage. Although the experimental and statistical results show that the relabeled database is more consistent, the improvement in boundary detection is not substantial. It seems that we will need more clues than the GTTM rules for boundary detection in the future.
Keywords: dataset, GTTM, local boundary, neural network
Procedia PDF Downloads 144

2952 F-VarNet: Fast Variational Network for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic resonance imaging (MRI) is a slow medical scan owing to its long acquisition time, which is mainly imposed by the traditional sampling theorem's lower bound on sampling. However, it is still possible to accelerate the scan with a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two properties must hold: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. Despite the rapid advances in deep learning (DL), which has demonstrated tremendous success in various computer vision tasks, the field of DL-based MRI reconstruction is still at an early stage. In this paper, we present an extension of VarNet, the state-of-the-art model in MRI reconstruction. We extend VarNet with dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplify the sensitivity map estimation (SME), which contains many layers unnecessary for this task. These improvements yield significant decreases in computation cost as well as higher accuracy.
Keywords: MRI, deep learning, variational network, computer vision, compressed sensing
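
A minimal PyTorch sketch (an assumption, not the F-VarNet code) of the multi-scale dilated-convolution idea: parallel 3x3 convolutions with increasing dilation widen the receptive field without extra parameters per branch, and their outputs are fused to capture more contextual information:

```python
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3,
                      padding=d, dilation=d)       # padding=d keeps size
            for d in dilations
        )
        self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))

block = MultiScaleDilatedBlock(channels=8)
print(block(torch.randn(1, 8, 32, 32)).shape)  # torch.Size([1, 8, 32, 32])
```
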
Procedia PDF Downloads 158

2951 Unfolding the Social Clash between Online and Non-Online Transportation Providers in Bandung
Authors: Latifah Putti Tiananda, Sasti Khoirunnisa, Taniadiana Yapwito, Jessica Noviena
Abstract:
Innovations are often met with two responses: acceptance or rejection. In the past few years, Indonesia has been experiencing a revolution in transportation services that use an online platform for their operation. This development has been welcomed by consumers and, simultaneously, challenged by conventional or 'non-online' transportation providers. Conflicts arise as the existence of the online transportation mode results in declining income for non-online transportation workers. Physical confrontations and demonstrations demand policing from the central authority, yet the obscurity of legal measures from the government allows the social instability to persist. Bandung, a city in West Java with the highest rate of online transportation usage, has recently issued a recommendation withholding the operation of online transportation services to maintain peace and order. This paper therefore seeks to elaborate on the social unrest between the two contesting transportation actors in Bandung and to explore community-based approaches to solving the problem. Using a qualitative research method, the paper also features in-depth interviews with directly involved sources from Bandung.
Keywords: Bandung, market competition, online transportation services, social unrest
Procedia PDF Downloads 272

2950 Use of Social Media Among University Students and Its Effect on the Achievement of Students
Authors: Saba Latif
Abstract:
The use of social media among university students is a topic of ongoing debate, with conflicting views on its impact on academic achievement. This study aimed to explore the relationship between social media use and academic achievement among university students and to identify factors that may contribute to positive or negative effects. The study used a mixed-methods design, including a survey of 500 university students and qualitative interviews with a subset of participants. The survey results showed that social media use was prevalent among students, with Facebook and Instagram being the most commonly used platforms. The findings also indicated a positive relationship between social media use and academic achievement, with students who reported higher levels of social media use also reporting higher GPAs. However, the qualitative interviews revealed that excessive use of social media can be a distraction that hinders academic performance, especially when students use it to procrastinate or to stay up late at night. Overall, the findings suggest that social media use can have both positive and negative effects on academic achievement among university students. Responsible and balanced use of social media, such as setting limits on usage and avoiding procrastination, may help students maximize the benefits while minimizing the risks.
Keywords: social media, university, achievement, effective, learning
Procedia PDF Downloads 81

2949 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500
Authors: Mustafa Elfituri, Jonathan Cook
Abstract:
Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties, such as irregularity and poor locality, that make their performance different from that of regular applications. Parallelizing graph algorithms is therefore a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
Keywords: graph computation, graph500 benchmark, parallel architectures, parallel programming, workload characterization
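
A minimal sketch of the level-synchronous breadth-first search at the heart of Graph500. The frontier expansion touches neighbor lists in a data-dependent order, which is exactly the irregular, traversal-dominated access pattern discussed above:

```python
from collections import defaultdict

def bfs(adjacency, root):
    """Level-synchronous BFS returning each reached vertex's parent."""
    parent = {root: root}
    frontier = [root]
    while frontier:
        next_frontier = []
        for u in frontier:             # parallel loop in OpenMP/MPI versions
            for v in adjacency[u]:     # irregular, data-dependent reads
                if v not in parent:
                    parent[v] = u
                    next_frontier.append(v)
        frontier = next_frontier
    return parent

graph = defaultdict(list)
for u, v in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]:
    graph[u].append(v)
    graph[v].append(u)
print(bfs(graph, root=0))
```

The `v not in parent` check is the hot spot: its target is unpredictable, so caches and prefetchers help little, which is one of the factors behind the degradation the paper analyzes.
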
Procedia PDF Downloads 146

2948 Optimization of Multiplier Extraction Digital Filter On FPGA
Authors: Shiksha Jain, Ramesh Mishra
Abstract:
Filtering is one of the most widely used complex signal processing operations, and FIR digital filters are widely used in DSP to alter a spectrum according to given specifications. Power consumption and area complexity in Finite Impulse Response (FIR) filter implementations are mainly caused by multipliers, so we present a multiplier-less technique: distributed arithmetic (DA). In this technique, precomputed inner-product values are stored in a lookup table (LUT), then added and shifted, with the number of iterations equal to the precision of the input sample. However, the exponential growth of the LUT with the order of the FIR filter makes this basic structure prohibitive for many applications. A significant area and power reduction over the traditional distributed arithmetic structure is presented in this paper, achieved by slicing the LUT to the desired length. An architecture of a 16-tap FIR filter is presented, with different lengths of LUT slices. The results of the FIR filter implementation on the Xilinx ISE synthesis tool (XST) for a Virtex-4 FPGA using the proposed method show an increase in the maximum frequency, a decrease in resource usage (area savings with a larger number of slices), and a reduction in dynamic power.
Keywords: multiplier less technique, linear phase symmetric FIR filter, FPGA tool, look up table
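
A minimal bit-serial distributed-arithmetic FIR sketch with LUT slicing, illustrating why slicing tames LUT growth: each slice of taps gets its own small LUT of partial coefficient sums, so storage grows with 2^(slice width) rather than 2^(number of taps). Coefficients and word length are illustrative:

```python
BITS = 8  # unsigned input precision, for simplicity

def build_lut(coeffs):
    """LUT[pattern] = sum of coefficients selected by the bit pattern."""
    n = len(coeffs)
    return [sum(c for i, c in enumerate(coeffs) if (pattern >> i) & 1)
            for pattern in range(1 << n)]

def da_fir(samples, coeffs, slice_width=2):
    slices = [coeffs[i:i + slice_width]
              for i in range(0, len(coeffs), slice_width)]
    luts = [build_lut(s) for s in slices]   # one small LUT per slice
    taps = [0] * len(coeffs)                # delay line
    out = []
    for x in samples:
        taps = [x] + taps[:-1]
        acc = 0
        for b in range(BITS):               # one LUT access per input bit
            for s, lut in enumerate(luts):
                pattern = 0
                for i in range(len(slices[s])):
                    pattern |= ((taps[s * slice_width + i] >> b) & 1) << i
                acc += lut[pattern] << b     # shift-accumulate
        out.append(acc)
    return out

coeffs = [3, 1, 2, 5]
samples = [10, 0, 0, 0]  # scaled impulse -> output reveals coefficients
print(da_fir(samples, coeffs))  # [30, 10, 20, 50]
```

For the paper's 16-tap filter, one monolithic LUT would need 2^16 entries, whereas four 4-tap slices need only 4 x 2^4, at the cost of three extra additions per bit.
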
Procedia PDF Downloads 389

2947 Study of the Late Phase of Core Degradation during Reflooding by Safety Injection System for VVER1000 with ASTECv2 Computer Code
Authors: Antoaneta Stefanova, Rositsa Gencheva, Pavlin Groudev
Abstract:
This paper presents the modeling approach for an SBO sequence in VVER 1000 reactors, describes the reactor core behavior in the late in-vessel phase in case of late reflooding by the high pressure injection system (HPIS), and gives preliminary results for the ASTECv2 validation. The work focuses on investigating plant behavior during total loss of power and the associated operator actions. The main goal of these analyses is to assess the phenomena arising during a Station Blackout (SBO) followed by primary-side HPIS reflooding of an already damaged reactor core at a very late in-vessel phase. The purpose of the analysis is to determine how late HPIS actuation can delay the time of vessel failure or possibly avoid vessel failure altogether. To this end, an SBO scenario was simulated with injection of cold water by a high pressure pump (HPP) into the cold leg at different stages of core degradation. The times for HPP injection were chosen based on previously performed investigations.
Keywords: VVER, operator action validation, reflooding of overheated reactor core, ASTEC computer code
Procedia PDF Downloads 412

2946 Development of Polymeric Fluorescence Sensor for the Determination of Bisphenol-A
Authors: Neşe Taşci, Soner Çubuk, Ece Kök Yetimoğlu, M. Vezir Kahraman
Abstract:
Bisphenol-A (BPA), 2,2-bis(4-hydroxyphenyl)propane, is one of the highest-volume chemicals in the world. Studies have shown that BPA may have negative effects on the central nervous system and on the immune and endocrine systems. Several analytical methods for the analysis of BPA have been reported, including electrochemical processes, chemical oxidation, ozonization, and spectrophotometric and chromatographic techniques. Compared with conventional analytical techniques, optical sensors are reliable, quick, low in cost, and easy to use, and they stand out as a much more advantageous method because of their high precision and sensitivity. In this work, a new photocured polymeric fluorescence sensor was prepared and characterized for Bisphenol-A (BPA) analysis. Characterization of the membrane was carried out by Attenuated Total Reflectance Fourier Transform Infrared Spectroscopy (ATR-FTIR) and Scanning Electron Microscopy (SEM). The response characteristics of the sensor, including dynamic range, pH effect, and response time, were systematically investigated. Acknowledgment: This work was supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under Grant 115Y469.
Keywords: bisphenol-a, fluorescence, photopolymerization, polymeric sensor
Procedia PDF Downloads 234

2945 AI-Based Autonomous Plant Health Monitoring and Control System with Visual Health-Scoring Models
Authors: Uvais Qidwai, Amor Moursi, Mohamed Tahar, Malek Hamad, Hamad Alansi
Abstract:
This paper focuses on the development and implementation of an advanced plant health monitoring system with an AI backbone and an IoT sensor network. Our approach addresses the critical environmental factors essential for preserving a plant's well-being, including air temperature, soil moisture, soil temperature, soil conductivity, pH, water levels, and humidity, as well as the presence of essential nutrients like nitrogen, phosphorus, and potassium. Central to our methodology is the use of computer vision technology, particularly a night vision camera. The captured data is compared against a reference database containing different health statuses; this comparative analysis is implemented using an AI deep learning model, which enables us to generate accurate assessments of plant health status. By combining sensing with this AI-based decision-making approach, our system aims to provide precise and timely insights into the overall health and well-being of plants, offering a valuable tool for effective plant care and management.
Keywords: deep learning image model, IoT sensing, cloud-based analysis, remote monitoring app, computer vision, fuzzy control
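
A minimal sketch of the sensor-side check implied above: comparing readings against healthy ranges and flagging what needs action. The ranges and names are illustrative assumptions, not the paper's values:

```python
HEALTHY_RANGES = {
    "air_temp_c": (18.0, 30.0),
    "soil_moisture_pct": (30.0, 70.0),
    "ph": (5.5, 7.0),
}

def flag_issues(readings):
    """Return a human-readable list of out-of-range sensor readings."""
    issues = []
    for key, (lo, hi) in HEALTHY_RANGES.items():
        value = readings[key]
        if not lo <= value <= hi:
            issues.append(f"{key}={value} outside [{lo}, {hi}]")
    return issues

print(flag_issues({"air_temp_c": 35.2, "soil_moisture_pct": 41.0, "ph": 6.3}))
```
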
Procedia PDF Downloads 53

2944 Performance Analysis of Solar Assisted Air Condition Using Carbon Dioxide as Refrigerant
Authors: Olusola Bamisile, Ferdinard Dika, Mustafa Dagbasi, Serkan Abbasoglu
Abstract:
The aim of this study was to model an air conditioning system that provides effective cooling while reducing fossil fuel consumption, with solar energy as an alternative energy source. The objectives are to design a system with a high COP and low electricity usage and to integrate solar energy into AC systems. A hybrid solar-assisted air conditioning system is designed to produce a 30 kW cooling capacity, with R744 (CO₂) used as the refrigerant. The effect of discharge pressure on the performance of the system is studied. The subcool temperature, evaporating temperature (5°C), and suction gas return temperature (12°C) are kept constant for the four discharge pressures considered. The cooling gas temperature is set at 25°C, and the discharge pressures are 80, 85, 90, and 95 bar. Copeland Scroll software is used for the simulation. A pressure-enthalpy diagram is used to deduce each enthalpy point, while numerical methods are used for the other calculations. The results show that a higher COP is achieved with the solar-assisted system: as much as 46% of the electricity requirement can be saved by using solar input at the compressor stage.
Keywords: air conditioning, solar energy, performance, energy saving
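
A minimal sketch of the COP calculation behind such an analysis, with enthalpies read off the R744 pressure-enthalpy diagram at the cycle state points; the numbers below are illustrative, not the paper's:

```python
# State-point enthalpies (kJ/kg), assumed for illustration.
h1 = 435.0  # evaporator exit / compressor suction
h2 = 480.0  # compressor discharge
h4 = 300.0  # expansion valve exit (= h3, isenthalpic throttling)

cooling_effect = h1 - h4        # kJ/kg absorbed in the evaporator
compressor_work = h2 - h1       # kJ/kg supplied by the compressor
cop = cooling_effect / compressor_work
print(f"COP = {cop:.2f}")       # 135/45 = 3.00 here

# With solar covering part of the compressor's electrical input, the
# COP referred to grid electricity alone improves proportionally.
grid_fraction = 1.0 - 0.46      # abstract reports ~46% electricity savings
print(f"COP vs. grid input = {cop / grid_fraction:.2f}")
```
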
Procedia PDF Downloads 144

2943 On the Influence of Sleep Habits for Predicting Preterm Births: A Machine Learning Approach
Authors: C. Fernandez-Plaza, I. Abad, E. Diaz, I. Diaz
Abstract:
Births occurring before the 37th week of gestation are considered preterm births. A threat of preterm labor is defined as the beginning of regular uterine contractions, dilation, and cervical effacement between 23 and 36 gestation weeks. To the authors' best knowledge, the factors that determine the onset of birth are not yet completely defined. In particular, the influence of sleep habits on preterm births is weakly studied. The aim of this study is to develop a model to predict the factors affecting premature delivery in pregnancy, based on the above potential risk factors, including those derived from sleep habits and light exposure at night (introduced as 12 variables obtained by a telephone survey using two questionnaires previously used by other authors). Thus, three groups of variables were included in the study (maternal, fetal, and sleep habits). The study was approved by the Research Ethics Committee of the Principado de Asturias (Spain). An observational, retrospective, and descriptive study was performed on 481 births between January 1, 2015 and May 10, 2016 in the University Central Hospital of Asturias (Spain). A statistical analysis using SPSS was carried out to compare qualitative and quantitative variables between preterm and term delivery; a chi-square test was applied for qualitative variables and a t-test for quantitative variables. Statistically significant differences (p < 0.05) between preterm and term births were found for primiparity, multiparity, kind of conception, place of residence, premature rupture of membranes, and interruptions during the night. In addition to the statistical analysis, machine learning methods were tested in search of a prediction model. In particular, tree-based models were applied, as their trade-off between performance and interpretability is especially suitable for this study. C5.0, recursive partitioning, random forest, and tree bag models were analysed using the caret R package. Ten-fold cross-validation and parameter tuning were applied to optimize the methods. In addition, different noise reduction methods were applied to the initial data using the NoiseFiltersR package. The best performance was obtained by the C5.0 method, with Accuracy 0.91, Sensitivity 0.93, Specificity 0.89, and Precision 0.91. Some well-known preterm birth factors were identified: cervix dilation, maternal BMI, premature rupture of membranes, and nuchal translucency analysis in the first trimester. The model also identifies other new factors related to sleep habits, such as light through the window, bedtime on working days, usage of electronic devices before sleeping from Mondays to Fridays, or a change of sleeping habits reflected in the number of hours, the depth of sleep, or the lighting of the room. "IF dilation <= 2.95 AND usage of electronic devices before sleeping from Mondays to Fridays = YES AND change of sleeping habits = YES, THEN preterm" is one of the predicting rules obtained by C5.0. In this work, a model for predicting preterm births is developed, based on machine learning together with noise reduction techniques; the method maximizing the performance is the one selected. This model shows the influence of variables related to sleep habits on preterm prediction.
Keywords: machine learning, noise reduction, preterm birth, sleep habit
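
A minimal scikit-learn sketch, with fabricated data, of the kind of tree pipeline the study ran in R's caret: 10-fold cross-validation of a decision tree, then inspection of the learned rules. Feature names mirror the abstract; the data and thresholds are synthetic:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 481  # matches the number of births studied
X = np.column_stack([
    rng.uniform(0, 10, n),    # cervix dilation (cm)
    rng.integers(0, 2, n),    # device use before sleep (0/1)
    rng.integers(0, 2, n),    # change of sleeping habits (0/1)
])
# Synthetic label built to mimic the quoted C5.0 rule.
y = ((X[:, 0] <= 2.95) & (X[:, 1] == 1) & (X[:, 2] == 1)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(tree, X, y, cv=10)  # 10-fold CV, as in the study
print(f"mean accuracy: {scores.mean():.2f}")

tree.fit(X, y)
print(export_text(tree,
                  feature_names=["dilation", "device_use", "habit_change"]))
```

The printed tree recovers an IF-THEN structure directly comparable to the C5.0 rule quoted above, which is the interpretability advantage the authors cite for tree-based models.
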
Procedia PDF Downloads 145

2942 Determining the Electrospinning Parameters of Poly(ε-Caprolactone)
Authors: M. Kagan Keler, Sibel Daglilar, Isil Kerti, Oguzhan Gunduz
Abstract:
Electrospinning is a versatile way to produce fibers at the nanoscale, and polycaprolactone (PCL) is a biomedical material widely used for cartilage defects and tissue regeneration. PCL is a biocompatible and durable material that can be used in bio-implants. Therefore, electrospinning was chosen as the fabrication method for obtaining PCL fibers in an effective way because of its adjustable parameters. In this research study, the electrospinning parameters were evaluated during the production of polymer tissue scaffolds. Polycaprolactone with a molecular weight of 80,000 Da was employed as the tissue material in the electrospinning process. PCL was dissolved in dimethylformamide (DMF) and chloroform (CF) at a weight ratio of 1:1. Different compositions (1%, 3%, 5%, 10%, and 20%) of PCL were prepared under laboratory conditions. All solutions with different percentages of PCL were taken into the syringe and loaded into the electrospinning system. Dozens of electrospinning trials were performed to obtain homogeneously uniform scaffold samples. The Taylor cone, which is a crucial feature of electrospinning, formed and changed at different voltages according to the conductivity of the material compositions. As the PCL percentage increased, the structure started to form with droplets, which is a significant problem for tissue scaffolds. Vertical and horizontal layouts were applied to produce non-woven structures.
Keywords: tissue engineering, artificial scaffold, electrospinning, biocomposites
Procedia PDF Downloads 346

2941 Improving Cost and Time Control of Construction Projects Management Practices in Nigeria
Authors: Mustapha Yakubu, Ahmed Usman, Hashim Ambursa
Abstract:
This paper presents the findings of research that sought to investigate techniques used to improve cost and time control in construction project management practice in Nigeria, where there is limited research on issues surrounding the practical usage of these techniques. Data were collected through a questionnaire distributed to construction experts in a survey of 100 construction organisations and 50 construction consultancy firms in Nigeria, aimed at identifying common project cost and time control practices and the factors inhibiting effective project control in practice. The study reveals that, despite the vast application of control techniques, a high proportion of respondents still experienced cost and time overruns on a significant proportion of their projects. Analysis of the survey results concluded that more effort should be directed at managing the top project control inhibiting factors identified. The paper outlines measures for mitigating these inhibiting factors so that the outcome of project time and cost control can be improved in practice.
Keywords: construction project, cost control, Nigeria, time control
Procedia PDF Downloads 311

2940 Experimental Studies on the Corrosion Effects of the Concrete Made with Tannery Effluent
Authors: K. Nirmalkumar
Abstract:
An acute water scarcity prevails in the dry season in and around Perundurai (Erode district, Tamil Nadu, India), where there are a large number of tannery units. Hence, an attempt was made to use the effluent from the tannery industry for construction purposes. Mechanical properties such as compressive strength, tensile strength, and flexural strength, and special properties such as resistance to chloride attack, sulphate attack, and chemical attack, were studied by casting various concrete specimens in the form of cubes, cylinders, beams, etc. It was observed that the concrete showed some reduction in strength when subjected to chloride, sulphate, and chemical attack, so admixtures were selected and optimized in suitable proportions to counteract the adverse effects, with satisfactory results. In this research study, the corrosion results of specimens prepared using treated and untreated tannery effluent were compared with those of concrete specimens prepared using potable water. It was observed that, with the addition of admixtures, the adverse effects of using treated and untreated tannery effluent were counteracted.
Keywords: corrosion, calcium nitrite, concrete, fly ash
Procedia PDF Downloads 267

2939 Diagnosis of Alzheimer Diseases in Early Step Using Support Vector Machine (SVM)
Authors: Amira Ben Rabeh, Faouzi Benzarti, Hamid Amiri, Mouna Bouaziz
Abstract:
Alzheimer's is a disease that affects the brain. It causes degeneration of nerve cells (neurons), in particular cells involved in memory and intellectual functions. Early diagnosis of Alzheimer's Disease (AD) raises ethical questions, since there is, at present, no cure to offer patients, and the medicines from therapeutic trials appear to slow the progression of the disease only moderately, with sometimes severe accompanying side effects. In this context, the analysis of medical images has become an essential tool for clinical applications because it provides effective assistance both at diagnosis and at therapeutic follow-up. Computer Assisted Diagnostic (CAD) systems are one possible solution to manage these images efficiently. In our work, we propose an application to detect Alzheimer's disease. For detecting the disease at an early stage, we use three sections: frontal, to extract the hippocampus (H); sagittal, to analyze the corpus callosum (CC); and axial, to work with the variation features of the cortex (C). Our classification method is based on the Support Vector Machine (SVM). The proposed system yields 90.66% accuracy in the early diagnosis of AD.
Keywords: Alzheimer Diseases (AD), Computer Assisted Diagnostic (CAD), hippocampus, Corpus Callosum (CC), cortex, Support Vector Machine (SVM)
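
A minimal scikit-learn sketch of the classification stage described above: an SVM fed with features from the hippocampus, corpus callosum, and cortex. The feature vectors here are synthetic stand-ins for the real image-derived measurements:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# Columns: hippocampus volume, corpus callosum area, cortex thickness
# (fabricated means and spreads, for illustration only).
healthy = rng.normal([3.5, 6.5, 2.6], 0.3, size=(n, 3))
ad = rng.normal([2.8, 5.8, 2.2], 0.3, size=(n, 3))
X = np.vstack([healthy, ad])
y = np.array([0] * n + [1] * n)  # 0 = healthy, 1 = AD

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(f"accuracy: {clf.score(X_te, y_te):.2%}")
```
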
Procedia PDF Downloads 383

2938 Design and Implementation of a Nano-Power Wireless Sensor Device for Smart Home Security
Authors: Chia-Chi Chang
Abstract:
Most battery-driven wireless sensor devices enter sleep mode as soon as possible to extend the overall lifetime of the sensor network. It is necessary to turn off unnecessary radio and peripheral functions, since the radio unit in particular consumes more energy than the other components during wireless communication. The microcontroller is the most important part of the wireless sensor device; it is responsible for manipulating sensing data and running communication protocols. Microcontrollers offer different sleep modes, each with a different level of energy usage: the deeper the sleep, the lower the consumption. Most wireless sensor devices can only enter an ordinary sleep mode, in which an external low-frequency oscillator keeps running to wake the sleeping microcontroller when the sleep timer expires. In this paper, our sensor device can enter an extended sleep mode in which none of the oscillators is running; the device then has nanoampere-level consumption and a self-waking ability. Finally, these wireless sensor devices were deployed in a smart home security network.
Keywords: wireless sensor network, battery-driven, sleep mode, home security
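
A minimal sketch of the energy budget behind such a design: the time-weighted average current under duty cycling, and the battery life it buys. All currents, times, and the battery capacity are illustrative assumptions, not the paper's measurements:

```python
battery_mah = 220.0          # e.g., a CR2032 coin cell

sleep_ua = 0.05              # extended sleep: tens of nanoamperes
active_ma = 15.0             # radio transmission burst
active_s = 0.01              # 10 ms burst
period_s = 60.0              # one report per minute

# Time-weighted average current in microamperes.
avg_ua = (active_ma * 1000 * active_s
          + sleep_ua * (period_s - active_s)) / period_s
life_hours = battery_mah * 1000 / avg_ua
print(f"average current: {avg_ua:.2f} uA")
print(f"battery life: {life_hours / 24 / 365:.1f} years")
```

The arithmetic shows why nanoampere sleep matters: at these numbers the radio bursts, not the sleep current, dominate the average, so once sleep drops to the nanoampere range further savings must come from shortening or spacing out transmissions.
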
Procedia PDF Downloads 305