Search results for: real time kernel preemption
18375 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm
Authors: Majid Pourahmadi
Abstract:
The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other). This is due to error in determining the number of scatterers. The present paper provides a new approach to alleviate such a problem using an information theoretic criterion referred to as minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations despite considerable noise and multiple scattering in the received signals.
Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)
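The MDL-based estimate of the number of scatterers can be illustrated with the classical eigenvalue form of the criterion (a sketch of the standard Wax–Kailath MDL estimator applied to covariance eigenvalues, not necessarily the exact variant used here; the eigenvalues and snapshot count below are invented):

```python
import math

def mdl_order(eigenvalues, n_snapshots):
    """Estimate the signal-subspace dimension (number of scatterers)
    from covariance-matrix eigenvalues via the MDL criterion."""
    lam = sorted(eigenvalues, reverse=True)
    p = len(lam)
    best_k, best_cost = 0, float("inf")
    for k in range(p):                      # candidate number of scatterers
        tail = lam[k:]
        m = len(tail)
        arith = sum(tail) / m
        geom = math.exp(sum(math.log(x) for x in tail) / m)
        # Likelihood term: noise eigenvalues should be equal, so the
        # geometric/arithmetic mean ratio of the tail approaches 1.
        cost = (-n_snapshots * m * math.log(geom / arith)
                + 0.5 * k * (2 * p - k) * math.log(n_snapshots))
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Two dominant (signal) eigenvalues over a nearly flat noise floor
print(mdl_order([10.0, 8.0, 1.1, 1.0, 0.9, 1.0], n_snapshots=100))  # → 2
```

The penalty term grows with the candidate order, so MDL picks the smallest subspace that explains the eigenvalue spread without overfitting.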
Procedia PDF Downloads 337
18374 Discrete Tracking Control of Nonholonomic Mobile Robots: Backstepping Design Approach
Authors: Alexander S. Andreev, Olga A. Peregudova
Abstract:
In this paper, we propose a discrete tracking control of nonholonomic mobile robots with two degrees of freedom. The electro-mechanical model of a mobile robot moving on a horizontal surface without slipping, with two rear wheels driven by two independent DC electric motors and one front roller wheel, is considered. We present a backstepping design based on the Euler approximate discrete-time model of a continuous-time plant. Theoretical considerations are verified by numerical simulation. The work was supported by RFFI (15-01-08482).
Keywords: actuator dynamics, backstepping, discrete-time controller, Lyapunov function, wheeled mobile robot
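The Euler approximate discrete-time model mentioned above can be sketched for the standard kinematic unicycle (a simplified stand-in for the full electro-mechanical model with actuator dynamics; the sampling period and inputs below are illustrative):

```python
import math

def euler_step(state, v, omega, T):
    """One Euler step of the kinematic unicycle model of a nonholonomic
    mobile robot: x' = v cos(theta), y' = v sin(theta), theta' = omega,
    discretized with sampling period T."""
    x, y, theta = state
    return (x + T * v * math.cos(theta),
            y + T * v * math.sin(theta),
            theta + T * omega)

# Drive straight along the x-axis: after 10 steps of T = 0.1 s at
# v = 1 m/s, the robot has advanced 1 m.
state = (0.0, 0.0, 0.0)
for _ in range(10):
    state = euler_step(state, v=1.0, omega=0.0, T=0.1)
print(state)  # x ≈ 1.0, y = 0.0, theta = 0.0
```

A discrete-time backstepping controller would be designed directly on this approximate model, with closed-loop stability argued via a Lyapunov function as in the paper.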
Procedia PDF Downloads 415
18373 Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions
Authors: Achut Manandhar, Kenneth D. Morton, Peter A. Torrione, Leslie M. Collins
Abstract:
The current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions, using multiple feature modalities; to represent affect in terms of continuous dimensions; to incorporate spatio-temporal correlation among affect dimensions; and to provide fast affect predictions. These research efforts have been propelled by a growing effort to develop affect recognition systems that can enable seamless real-time human-computer interaction in a wide variety of applications. Motivated by these desired attributes of an affect recognition system, in this work a multi-dimensional affect prediction approach is proposed by integrating the multivariate Relevance Vector Machine (MVRVM) with the recently developed Output-associative Relevance Vector Machine (OARVM) approach. The resulting approach can provide fast continuous affect predictions by jointly modeling the multiple affect dimensions and their correlations. Experiments on the RECOLA database show that the proposed approach performs competitively with the OARVM while providing faster predictions during testing.
Keywords: dimensional affect prediction, output-associative RVM, multivariate regression, fast testing
Procedia PDF Downloads 286
18372 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment
Authors: Seun Mayowa Sunday
Abstract:
Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software. However, because static software operates only within strictly regulated rules, it cannot aid customers beyond these limitations. Machine learning (ML) techniques are therefore required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required machine learning (ML) libraries, the models are created and evaluated using the appropriate metrics. From the findings, the random forest performs with the highest accuracy of 80.17%, and it was subsequently implemented into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the top four biggest cloud computing providers. Hence, to the best of our knowledge, this research will serve as the first academic paper which combines model development and the Django framework with deployment to the Alibaba cloud computing platform.
Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud
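Of the four models, k-nearest neighbour is simple enough to sketch from scratch (a toy illustration with invented, pre-normalized applicant features; the paper's actual pipeline uses library implementations and a real loan dataset):

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points (Euclidean distance)."""
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical applicant features: (income_norm, credit_score_norm); 1 = repaid
X = [(0.9, 0.8), (0.8, 0.9), (0.7, 0.85), (0.2, 0.1), (0.1, 0.3), (0.25, 0.2)]
y = [1, 1, 1, 0, 0, 0]
print(knn_predict(X, y, (0.85, 0.8)))  # → 1 (predicted to repay)
print(knn_predict(X, y, (0.15, 0.2)))  # → 0
```

In practice the fitted model would be serialized and served behind a Django view for the deployed web application.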
Procedia PDF Downloads 136
18371 Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization
Authors: Tomoaki Hashimoto
Abstract:
Recently, feedback control systems using random dither quantizers have been proposed for linear discrete-time systems. However, the constraints imposed on state and control variables have not yet been taken into account for the design of feedback control systems with random dither quantization. Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial and terminal time. An important advantage of model predictive control is its ability to handle constraints imposed on state and control variables. Based on the model predictive control approach, the objective of this paper is to present a control method that satisfies probabilistic state constraints for linear discrete-time feedback control systems with random dither quantization. In other words, this paper provides a method for solving the optimal control problems subject to probabilistic state constraints for linear discrete-time feedback control systems with random dither quantization.
Keywords: optimal control, stochastic systems, random dither, quantization
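A random dither quantizer, the building block this work designs around, can be sketched as follows (a subtractive-dither variant chosen for illustration; the step size and input value are arbitrary):

```python
import random

def dither_quantize(x, step, rng):
    """Uniform quantizer with subtractive random dither: a dither
    w ~ U(-step/2, step/2) is added before quantization and subtracted
    afterwards, which makes the quantization error zero-mean and
    independent of the input."""
    w = rng.uniform(-step / 2, step / 2)
    q = step * round((x + w) / step)
    return q - w

rng = random.Random(0)
x, step = 0.3, 0.5
samples = [dither_quantize(x, step, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # ≈ 0.3: the quantized signal is unbiased on average
```

It is this statistical regularity of the dithered quantization error that lets a stochastic MPC formulation impose probabilistic state constraints on the closed loop.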
Procedia PDF Downloads 445
18370 Optimizing of the Micro EDM Parameters in Drilling of Titanium Ti-6Al-4V Alloy for Higher Machining Accuracy-Fuzzy Modelling
Authors: Ahmed A. D. Sarhan, Mum Wai Yip, M. Sayuti, Lim Siew Fen
Abstract:
Ti6Al4V alloy is highly used in the automotive and aerospace industries due to its good machining characteristics. Micro EDM drilling is commonly used to drill micro holes in extremely hard materials with a very high depth-to-diameter ratio. In this study, the parameters of micro-electrical discharge machining (EDM) in drilling of Ti6Al4V alloy are optimized for higher machining accuracy, with less hole dilation and a lower hole taper ratio. The micro-EDM machining parameters include peak current and pulse-on time. A fuzzy analysis was developed to evaluate the machining accuracy. The analysis shows that hole dilation and hole taper ratio increase with increasing peak current and pulse-on time. However, the surface quality deteriorates as the peak current and pulse-on time increase. The combination that gives the optimum result for hole dilation is medium peak current and short pulse-on time. Meanwhile, the optimum result for hole taper ratio is low peak current and short pulse-on time.
Keywords: micro EDM, Ti-6Al-4V alloy, fuzzy logic based analysis, optimization, machining accuracy
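The fuzzy evaluation rests on membership functions over the machining parameters; a minimal sketch with triangular memberships for peak current follows (the set boundaries are invented for illustration and are not taken from the study):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets for peak current (amperes)
low, medium, high = (0, 0, 1), (0.5, 1, 1.5), (1, 2, 2)

def current_memberships(i):
    """Fuzzify a crisp peak-current value into degrees of membership."""
    return {"low": tri(i, *low), "medium": tri(i, *medium), "high": tri(i, *high)}

print(current_memberships(1.0))  # the "medium" set fires fully at 1.0 A
```

A full Mamdani-style analysis would combine such memberships for peak current and pulse-on time through rules and defuzzify into a machining-accuracy score.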
Procedia PDF Downloads 496
18369 Modeling Residential Electricity Consumption Function in Malaysia: Time Series Approach
Authors: L. L. Ivy-Yap, H. A. Bekhet
Abstract:
As Malaysian residential electricity consumption continues to increase rapidly, effective energy policies that address the factors affecting residential electricity consumption are urgently needed. This study investigates the relationship between residential electricity consumption (EC), real disposable income (Y), price of electricity (Pe), and population (Po) in Malaysia over the 1978-2011 period. Unlike previous studies on Malaysia, the current study focuses on the residential sector, a sector that is important for the contemplation of energy policy. The Phillips-Perron (P-P) unit root test is employed to infer the stationarity of each variable, while the bounds test is executed to determine the existence of a co-integration relationship among the variables, modeled in an Autoregressive Distributed Lag (ARDL) framework. The CUSUM and CUSUM of squares tests are applied to ensure the stability of the model. The results suggest the existence of a long-run equilibrium relationship and bidirectional Granger causality between EC and the macroeconomic variables. The empirical findings will help policy makers in Malaysia develop new monitoring standards of energy consumption. As electricity consumption is a major contributing factor in economic growth and CO2 emissions, more proper planning is needed in Malaysia to attain future targets and cut emissions.
Keywords: co-integration, elasticity, granger causality, Malaysia, residential electricity consumption
Procedia PDF Downloads 265
18368 ECG Based Reliable User Identification Using Deep Learning
Authors: R. N. Begum, Ambalika Sharma, G. K. Singh
Abstract:
Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Automatic biometric recognition systems are therefore the need of the hour, and ECG-based systems are unquestionably a strong choice due to their appealing inherent characteristics. CNNs are the recent state-of-the-art techniques for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using CNNs' dense learning framework. The proposed research explicitly explores the calibre of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs, which are trained on a dataset of recordings collected from eight popular ECG databases. With a highest FAR of 0.04% and a highest FRR of 5%, the best performing network achieved an identification accuracy of 99.94%. The best network is also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.
Keywords: biometrics, dense networks, identification rate, train/test split ratio
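The reported FAR and FRR metrics can be computed from match scores as follows (a generic definition sketch; the scores and threshold below are invented, not the study's data):

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Compute the false accept rate (FAR) and false reject rate (FRR)
    for a similarity-score biometric system: accept if score >= threshold."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Hypothetical similarity scores for genuine and impostor comparisons
genuine = [0.92, 0.88, 0.95, 0.70, 0.99]
impostor = [0.10, 0.35, 0.20, 0.81, 0.05]
print(far_frr(genuine, impostor, threshold=0.80))  # → (0.2, 0.2)
```

Sweeping the threshold trades FAR against FRR, which is how operating points such as the reported 0.04% FAR are chosen.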
Procedia PDF Downloads 161
18367 A Discussion on the Design Practice of College Students for Virtual Avatars in Social Media Ecology
Authors: Mei-Chun Chang
Abstract:
Due to digital transformation and social media development in recent years, various real-time interactive digital tools have been developed to meet the design demands for virtual reality avatars, which also promote digital content learners' active participation in the creation process. As a result, new social media design tools have the characteristics of intuitive operation with a simplified interface for fast production, from which works can be simply created. This study carried out observations, records, questionnaire surveys, and interviews on the creation and learning of virtual avatars made by students of the National Taiwan University of Science and Technology (NTUST) with the VRoid Studio 3D modeling tool, so as to explore their learning effectiveness on the design of virtual avatars. According to the results of this study, the VRoid Studio 3D character modeling tool has a positive impact on the learners and helps to improve their learning effectiveness. Students with low academic achievement said that they could complete the conceived model with their own thinking by using the design tool, which increased their sense of accomplishment. Conclusions are drawn according to the results, and relevant future suggestions are put forward.
Keywords: virtual avatar, character design, social media, vroid studio, creation, digital learning
Procedia PDF Downloads 190
18366 Microbial Diversity Assessment in Household Point-of-Use Water Sources Using Spectroscopic Approach
Authors: Syahidah N. Zulkifli, Herlina A. Rahim, Nurul A. M. Subha
Abstract:
Sustaining water quality is critical in order to avoid any harmful health consequences for end-user consumers. The detection of microbial impurities at the household level is the foundation of water security. Currently, water quality is monitored only at water utilities or infrastructure, such as water treatment facilities or reservoirs. This research provides a first-hand scientific understanding of the microbial composition present in Malaysia’s household point-of-use (POU) water supply, influenced by seasonal fluctuations, standstill periods, and flow dynamics, using the NIR-Raman spectroscopic technique. According to the findings, 20% of water samples were contaminated by pathogenic bacteria, namely Legionella and Salmonella cells. A comparison of the spectra reveals significant signature peaks (420 cm⁻¹ to 1800 cm⁻¹), including species-specific bands. This demonstrates the importance of regularly monitoring POU water quality to provide a safe and clean water supply to homeowners. Conventional Raman spectroscopy is, to date, not suited for real-time monitoring. Therefore, this study introduces an alternative micro-spectrometer to give a rapid and sustainable way of monitoring POU water quality. Assessing microbiological threats in the water supply becomes more reliable and efficient by leveraging an IoT protocol.
Keywords: microbial contaminants, water quality, water monitoring, Raman spectroscopy
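Signature-peak picking of the kind described can be sketched with a naive local-maximum search (illustrative only; real Raman processing would include baseline correction and smoothing, and the synthetic spectrum below is invented):

```python
def find_peaks(intensities, min_height=0.0):
    """Return indices of local maxima in a 1-D spectrum that exceed
    min_height: a naive stand-in for Raman signature-peak picking."""
    peaks = []
    for i in range(1, len(intensities) - 1):
        if (intensities[i] > intensities[i - 1]
                and intensities[i] > intensities[i + 1]
                and intensities[i] >= min_height):
            peaks.append(i)
    return peaks

# Synthetic spectrum sampled on a coarse wavenumber grid (cm^-1)
wavenumbers = list(range(400, 1900, 100))  # 400, 500, ..., 1800
spectrum = [1, 1, 9, 2, 1, 1, 1, 7, 1, 1, 1, 1, 1, 8, 1]
peak_positions = [wavenumbers[i] for i in find_peaks(spectrum, min_height=5)]
print(peak_positions)  # → [600, 1100, 1700]
```

Matching such peak positions against reference bands for species like Legionella or Salmonella is what makes the spectra species-discriminative.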
Procedia PDF Downloads 110
18365 Smart Demand Response: A South African Pragmatic, Non-Destructive and Alternative Advanced Metering Infrastructure-Based Maximum Demand Reduction Methodology
Authors: Christo Nicholls
Abstract:
The National Electricity Grid (NEG) in South Africa has been under strain for the last five years. This overburdening of the NEG led Eskom (the state-owned entity responsible for the NEG) to implement a blunt methodology, called load shedding, to assist in reducing the maximum demand (MD) on the NEG when required. The challenge of this methodology is that not only does it lead to immense technical issues with distribution network equipment, e.g., transformers, due to the frequent abrupt off and on switching, it also has a broader negative fiscal impact on distributors, as their key consumers (commercial and industrial) are now grid-defecting due to the lack of Electricity Security Provision (ESP). This paper provides a pragmatic alternative methodology utilizing specific functionalities embedded within direct-connect single- and three-phase Advanced Metering Infrastructure (AMI) solutions deployed within the distribution network, in conjunction with a multi-agent-system-based AI implementation focused on automated-negotiation peer-to-peer trading. The results of this research clearly illustrate that not only does the methodology provide a factual percentage contribution towards the NEG MD at the point of consideration, it also allows the distributor to leverage real-time MD data from key consumers to activate complex, yet impact-measurable, demand response (DR) programs.
Keywords: AI, AMI, demand response, multi-agent
Procedia PDF Downloads 112
18364 Quantum Mechanics as a Limiting Case of Relativistic Mechanics
Authors: Ahmad Almajid
Abstract:
The idea of unifying quantum mechanics with general relativity is still a dream for many researchers, as physics has only two paths, no more: Einstein's path, which is mainly based on particle mechanics, and the path of Paul Dirac and others, which is based on wave mechanics. The incompatibility of the two approaches is due to the radical difference in the initial assumptions and the mathematical nature of each approach. Logical thinking in modern physics leads us to two problems: - In quantum mechanics, despite its success, the measurement problem and the problem of wave-function interpretation remain obscure. - In special relativity, despite the success of the equivalence of rest mass and energy, the fact that the energy becomes infinite at the speed of light is contrary to logic, because the speed of light is not infinite, and the mass of the particle is not infinite either. These contradictions arise from the overlap of relativistic and quantum mechanics in the neighborhood of the speed of light, and in order to solve these problems, one must understand well how to move from relativistic mechanics to quantum mechanics, or rather, to unify them in a way different from Dirac's method, in order to go along with God or Nature, since, as Einstein said, "God doesn't play dice." From De Broglie's hypothesis of wave-particle duality, Léon Brillouin's definition of the new proper time was deduced, and thus the quantum Lorentz factor was obtained. Finally, using the Euler-Lagrange equation, we arrive at new equations in quantum mechanics. In this paper, the two problems in modern physics mentioned above are solved; it can be said that this new approach to quantum mechanics will enable us to unify it with general relativity quite simply. If experiments prove the validity of the results of this research, we will be able in the future to transport matter at speeds close to the speed of light.
Finally, this research yielded three important results: 1. The Lorentz quantum factor. 2. Planck energy is a limiting case of Einstein energy. 3. Real quantum mechanics, in which new equations for quantum mechanics match and exceed Dirac's equations; these equations have been reached in a completely different way from Dirac's method. These equations show that quantum mechanics is a limiting case of relativistic mechanics. At the Solvay Conference in 1927, the debate about quantum mechanics between Bohr, Einstein, and others reached its climax: while Bohr suggested that if particles are not observed, they are in a probabilistic state, Einstein made his famous claim ("God does not play dice"). Thus, Einstein was right, especially when he didn't accept the principle of indeterminacy in quantum theory, although experiments support quantum mechanics. However, the results of our research indicate that God really does not play dice; when the electron disappears, it turns into amicable particles or an elastic medium, according to the above equations. Likewise, Bohr was also right when he indicated that there must be a science like quantum mechanics to monitor and study the motion of subatomic particles, but the picture in front of him was blurry and not clear, so he resorted to the probabilistic interpretation.
Keywords: Lorentz quantum factor, Planck's energy as a limiting case of Einstein's energy, real quantum mechanics, new equations for quantum mechanics
Procedia PDF Downloads 77
18363 Hybrid Subspace Approach for Time Delay Estimation in MIMO Systems
Authors: Mojtaba Saeedinezhad, Sarah Yousefi
Abstract:
In this paper, we present a hybrid subspace approach for time delay estimation (TDE) in multivariable systems. While several methods have been proposed for time delay estimation in SISO systems, delay estimation in MIMO systems has always been a big challenge. In these systems, the existing TDE methods have significant limitations, because most procedures are based only on system response estimation or correlation analysis. We introduce a new hybrid method for TDE in MIMO systems based on subspace identification and the explicit output error method, and compare its performance with previously introduced procedures in the presence of different noise levels and in a statistical manner. The best method is then selected with a multi-objective decision-making technique. It is shown that the performance of the new approach is much better than that of the existing methods, even in low signal-to-noise conditions.
Keywords: system identification, time delay estimation, ARX, OE, merit ratio, multi variable decision making
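For contrast with the hybrid subspace method, the classical correlation-analysis baseline can be sketched as a cross-correlation delay estimator (it assumes y is a delayed copy of x; the signals and lag range below are illustrative):

```python
def estimate_delay(x, y, max_lag):
    """Estimate the delay of y relative to x as the non-negative lag
    that maximizes their cross-correlation."""
    best_lag, best_corr = 0, float("-inf")
    for lag in range(0, max_lag + 1):
        corr = sum(x[i] * y[i + lag] for i in range(len(x) - lag))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

x = [0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0]  # x delayed by three samples
print(estimate_delay(x, y, max_lag=5))  # → 3
```

In a MIMO setting, one such correlation has to be resolved per input-output channel, which is exactly where correlation-only methods become limited and subspace identification helps.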
Procedia PDF Downloads 346
18362 Environmental Effects on Energy Consumption of Smart Grid Consumers
Authors: S. M. Ali, A. Salam Khan, A. U. Khan, M. Tariq, M. S. Hussain, B. A. Abbasi, I. Hussain, U. Farid
Abstract:
Environment and surroundings play a pivotal role in structuring the life-style of consumers. Living standards in turn affect the energy consumption of consumers. In the smart grid paradigm, climate drifts, weather parameters, and the green environment relate directly to the energy profiles of various consumers, such as residential, commercial, and industrial. Considering the above factors helps utilities shape load curves and optimally manage demand and supply. Thus, there is a pressing need to develop correlation models of load and weather parameters and to critically analyse the factors affecting the energy profiles of smart grid consumers. In this paper, we elaborate various environmental and weather parameter factors affecting consumer demand. Moreover, we develop correlation models, such as Pearson, Spearman, and Kendall, of the inter-relation between the dependent (load) parameter and independent (weather) parameters. Furthermore, we validate our discussion with real-time data from the state of Texas. The numerical simulations prove the effective relation of climatic drifts with the energy consumption of smart grid consumers.
Keywords: climatic drifts, correlation analysis, energy consumption, smart grid, weather parameter
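The Pearson correlation between load and a weather parameter, and Spearman's rho as Pearson on ranks, can be sketched in a few lines (the temperature and load figures are invented for illustration, Kendall's tau is omitted for brevity, and ties are assumed absent):

```python
from statistics import mean, stdev

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

def ranks(v):
    """Rank each value 1..n (no ties handled in this sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# Hypothetical daily temperature (°F) vs. system load (MW)
temp = [70, 75, 80, 85, 90, 95]
load = [410, 430, 470, 520, 600, 690]
print(pearson(temp, load))   # strong positive, slightly below 1 (convex load growth)
print(spearman(temp, load))  # → 1.0: the relation is perfectly monotone
```

The gap between the two coefficients is informative: cooling load grows faster than linearly with temperature, so rank correlation saturates at 1 while Pearson stays just below it.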
Procedia PDF Downloads 375
18361 High Speed Motion Tracking with Magnetometer in Nonuniform Magnetic Field
Authors: Jeronimo Cox, Tomonari Furukawa
Abstract:
Magnetometers have become more popular in inertial measurement units (IMU) for their ability to correct estimations using the earth's magnetic field. Accelerometer and gyroscope-based packages fail with dead-reckoning errors accumulated over time. Localization in robotic applications with magnetometer-inclusive IMUs has become popular as a way to track the odometry of slower-speed robots. With high-speed motions, the accumulated error increases over smaller periods of time, making them difficult to track with IMU. Tracking a high-speed motion is especially difficult with limited observability. Visual obstruction of motion leaves motion-tracking cameras unusable. When motions are too dynamic for estimation techniques reliant on the observability of the gravity vector, the use of magnetometers is further justified. As available magnetometer calibration methods are limited with the assumption that background magnetic fields are uniform, estimation in nonuniform magnetic fields is problematic. Hard iron distortion is a distortion of the magnetic field by other objects that produce magnetic fields. This kind of distortion is often observed as the offset from the origin of the center of data points when a magnetometer is rotated. The magnitude of hard iron distortion is dependent on proximity to distortion sources. Soft iron distortion is more related to the scaling of the axes of magnetometer sensors. Hard iron distortion is more of a contributor to the error of attitude estimation with magnetometers. Indoor environments or spaces inside ferrite-based structures, such as building reinforcements or a vehicle, often cause distortions with proximity. As positions correlate to areas of distortion, methods of magnetometer localization include the production of spatial mapping of magnetic field and collection of distortion signatures to better aid location tracking. 
The goal of this paper is to compare magnetometer methods that do not need pre-produced magnetic field maps. Mapping the magnetic field in some spaces can be costly and inefficient. Dynamic measurement fusion is used to track the motion of a multi-link system with IMUs. Conventional calibration by data collection of rotation at a static point, real-time estimation of calibration parameters at each time step, and the use of two magnetometers for determining local hard iron distortion are compared to confirm the robustness and accuracy of each technique. With opposite-facing magnetometers, hard iron distortion can be accounted for regardless of position, rather than assuming that hard iron distortion is constant regardless of positional change. The motion measured is a repeatable planar motion of a two-link system connected by revolute joints. The links are translated on a moving base to impulse rotation of the links. The joints are equipped with absolute encoders, and the motion is recorded with cameras, to enable ground-truth comparison with each of the magnetometer methods. While the two-magnetometer method accounts for local hard iron distortion, the method fails where the magnetic field direction in space is inconsistent.
Keywords: motion tracking, sensor fusion, magnetometer, state estimation
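The conventional static-rotation calibration mentioned first can be sketched with the simple min/max hard-iron estimate (synthetic planar readings with an invented offset; this is the baseline technique, not the paper's dynamic-fusion or two-magnetometer methods):

```python
import math

def hard_iron_offset(samples):
    """Estimate the hard-iron offset as the midpoint of the per-axis
    extremes of magnetometer readings collected while rotating the
    sensor (the simple min/max calibration)."""
    n_axes = len(samples[0])
    return tuple(
        (min(s[a] for s in samples) + max(s[a] for s in samples)) / 2
        for a in range(n_axes)
    )

# A 50 uT field rotated in the plane, shifted by a (20, -10) hard-iron offset
readings = [(20 + 50 * math.cos(t / 10), -10 + 50 * math.sin(t / 10))
            for t in range(63)]  # roughly one full revolution
print(hard_iron_offset(readings))  # ≈ (20.0, -10.0)
```

Because this estimate is tied to the calibration location, it degrades as the sensor moves through a nonuniform field, which motivates the position-independent two-magnetometer approach compared in the paper.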
Procedia PDF Downloads 84
18360 Asset Pricing Model: A Quality Paradigm
Authors: Urmi Khatri
Abstract:
The capital asset pricing model (CAPM) draws a direct relationship between risk and the expected rate of return. There has been criticism of beta and of the assumptions of CAPM, as they are not applicable in the real world. The Fama-French three-factor model and the Fama-French five-factor model have introduced different factors which have an impact on the return of any asset, such as size, value, investment, and profitability. This study proposes to view the capital asset pricing model through the lens of quality. Six factors are studied: the Fama-French five factors plus a quality dimension. Here, Graham's seven quality and quantity criteria are measured to determine the score of the sample firms. Thus, this study tries to check the model fit. The beta coefficient of the quality dimension and the R-squared value are examined to determine the validity of the proposed model. The sample is drawn from firms listed on the Bombay Stock Exchange (BSE); only non-financial firms are selected. The time period of the study is from January 1999 to December 2019. Hence, the primary objective of the study is to check how robust the model becomes after adding the quality dimension to the capital asset pricing model in addition to size, value, profitability, and investment.
Keywords: asset pricing model, CAPM, Graham's score, G-score, multifactor model, quality
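The single-factor slope underlying CAPM-style beta estimation can be sketched via ordinary least squares (the returns below are invented; the study's six-factor model would regress on all factors jointly rather than on the market alone):

```python
from statistics import mean

def ols_beta(excess_asset, excess_market):
    """Slope of the one-factor (CAPM-style) regression
    r_i - r_f = alpha + beta * (r_m - r_f) + e, via least squares."""
    mx, my = mean(excess_market), mean(excess_asset)
    cov = sum((x - mx) * (y - my) for x, y in zip(excess_market, excess_asset))
    var = sum((x - mx) ** 2 for x in excess_market)
    return cov / var

# Hypothetical monthly excess returns (%): the asset moves twice the market
market = [1.0, -2.0, 3.0, 0.5, -1.5]
asset = [2.0, -4.0, 6.0, 1.0, -3.0]
print(ols_beta(asset, market))  # → 2.0
```

Adding a quality factor means appending a quality-portfolio return series as another regressor and reading off its coefficient and the change in R-squared, as the study proposes.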
Procedia PDF Downloads 158
18359 Macroeconomic Effects and Dynamics of Natural Disaster Damages: Evidence from SETX on the Resiliency Hypothesis
Authors: Agim Kukelii, Gevorg Sargsyan
Abstract:
This study, focusing on a base regional area (county level), estimates the effect of natural disaster damages on aggregate personal income, aggregate wages, wages per worker, aggregate employment, and aggregate income transfers. The study further estimates the dynamics of personal income, employment, and wages under natural disaster shocks. Southeast Texas, located at the center of the Gulf Coast, is hit by meteorologically and hydrologically caused natural disasters yearly. On average, there are more than four natural disasters per year, causing estimated damage averaging 2.2% of real personal income. The study uses the panel data method to estimate the average effect of natural disasters on the area's economy (personal income, wages, employment, and income transfers). It also uses a Panel Vector Autoregressive (PVAR) model to study the dynamics of macroeconomic variables under natural disaster shocks. The study finds that the average effect of natural disasters is positive for personal income and income transfers and negative for wages and employment. The PVAR and impulse response function estimates reveal that natural disaster shocks cause a decrease in personal income, employment, and wages. However, the economy's variables bounce back after three years. The novelty of this study rests on several aspects. First, this is the first study to investigate the effects of natural disasters on macroeconomic variables at a regional level. Second, the study uses direct measures of natural disaster damages. Third, the study estimates that the time the local economy takes to absorb natural disaster damage shocks is three years. This is a relatively quick recovery by the local economy, therefore adding support to the "resiliency" hypothesis. The study has several implications for policymakers, businesses, and households.
First, this study serves to increase the awareness of local stakeholders that natural disaster damages worsen macroeconomic variables, such as personal income, employment, and wages, beyond the immediate damage to residential and commercial properties, physical infrastructure, and daily life. Second, the study estimates that these effects linger in the economy for three years on average, a recovery period that policymakers should factor into their planning for the area.
Keywords: natural disaster damages, macroeconomics effects, PVAR, panel data
Procedia PDF Downloads 88
18358 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems
Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell
Abstract:
Today’s facilities are ever more sophisticated, and the need for available and reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate, and complete information to perform their day-to-day activities and to provide their senior management with accurate information for the decision-making process. Currently, various technology platforms, data repositories, or database systems, such as Computer-Aided Facility Management (CAFM) systems, are used for these purposes in different facilities. In most current practice, the data is extracted from paper construction documents and re-entered manually into one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for owners' and facility managers' use. Recently, software vendors have developed add-in applications to generate the COBie spreadsheet automatically. However, most of these add-in applications are capable of generating only a limited amount of COBie data, so considerable time is still required to enter the remaining data manually to complete the COBie spreadsheet. Some of the data which cannot be generated by these COBie add-ins is essential for the facilities manager's day-to-day activities, such as the job sheet, which includes preventive maintenance schedules. To facilitate a seamless data transfer between BIM models and facilities management systems, we developed a framework that enables automated generation of data extracted directly from BIM models to an external web database, and then enables different stakeholders to access the external web database to enter the required asset data directly, generating a rich COBie spreadsheet that contains most of the asset data required for efficient facilities management operations.
The proposed framework is a part of ongoing research and will be demonstrated and validated on a typical university building. Moreover, the proposed framework supplements the existing body of knowledge in the facilities management domain by providing a novel framework that facilitates seamless data transfer between BIM models and facilities management systems.
Keywords: building information modeling, BIM, facilities management systems, interoperability, information management
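The kind of COBie-style sheet generation described can be sketched as follows (the column subset and asset record are illustrative, not the full COBie Component worksheet schema, and the field names are assumptions for the sketch):

```python
import csv
import io

# Minimal, illustrative subset of COBie "Component" sheet columns
FIELDS = ["Name", "CreatedBy", "TypeName", "Space", "SerialNumber",
          "InstallationDate", "WarrantyStartDate"]

def write_component_sheet(assets):
    """Write asset records extracted from a BIM model into a
    COBie-style CSV sheet, returned as a string; fields the model
    cannot supply are marked "n/a" for later manual entry."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for asset in assets:
        writer.writerow({f: asset.get(f, "n/a") for f in FIELDS})
    return buf.getvalue()

sheet = write_component_sheet([
    {"Name": "AHU-01", "CreatedBy": "designer@example.com",
     "TypeName": "AirHandlingUnit", "Space": "Plant Room 1",
     "SerialNumber": "SN-4711", "InstallationDate": "2020-03-15"},
])
print(sheet)
```

The "n/a" placeholders mark exactly the gap the proposed framework addresses: fields the BIM model cannot populate are completed by stakeholders through the external web database instead of by hand in the spreadsheet.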
Procedia PDF Downloads 115
18357 Preparation of Papers - Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments
Authors: Skyler Kim
Abstract:
An early diagnosis of leukemia has always been a challenge for doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia remains time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnostic tools has increased and a large volume of high-quality data has been produced, there is an urgent need for more advanced data analysis methods. One of these is the AI approach, which has become a major trend in recent years, with several research groups working on such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that causes mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia; the results from this development work can be applied to other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 images in total: 8491 images of abnormal cells and 5398 normal images. In this paper, we design and implement a leukemia diagnostic system for a real clinical environment based on deep learning approaches. The proposed diagnostic system detects and classifies leukemia. In contrast to other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50.
Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture that employs transfer learning to extract features from each input image. In our approach, the features fused from specific abstraction layers serve as auxiliary features and lead to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimensional feature maps, which helps improve the discriminative capability of intermediate features and also mitigates the problem of vanishing or exploding gradients. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model has a significant advantage in accuracy. The detailed results of each model’s performance, with their pros and cons, will be presented at the conference.
Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning
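To make the fusion step concrete, the following minimal sketch uses pure NumPy, with random arrays standing in for the final convolutional outputs of VGG19 and ResNet50; the tensor shapes and the two-class linear head are assumptions for illustration, not the authors' exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def global_average_pool(feature_map):
    # feature_map: (H, W, C) -> (C,) descriptor, as used after a CNN backbone.
    return feature_map.mean(axis=(0, 1))

# Stand-ins for the last convolutional outputs of the two backbones
# (in the paper these would come from pretrained VGG19 and ResNet50).
vgg_features = rng.normal(size=(7, 7, 512))      # VGG19 final conv block
resnet_features = rng.normal(size=(7, 7, 2048))  # ResNet50 final conv block

# Fuse: pool each map, then concatenate into one joint descriptor.
fused = np.concatenate([global_average_pool(vgg_features),
                        global_average_pool(resnet_features)])

# A linear head on the fused descriptor (weights would be learned in practice).
W = rng.normal(size=(2, fused.size)) * 0.01
logits = W @ fused
probs = np.exp(logits) / np.exp(logits).sum()  # softmax over {normal, abnormal}
```

In a trained system, the pooled descriptors would come from frozen pretrained backbones (transfer learning), and the head would be fit on the labeled cell images.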
Procedia PDF Downloads 187
18356 Design and Implementation of Partial Denoising Boundary Image Matching Using Indexing Techniques
Authors: Bum-Soo Kim, Jin-Uk Kim
Abstract:
In this paper, we design and implement a partial denoising boundary image matching system using indexing techniques. Converting boundary images to time-series makes it feasible to perform fast searches using indexes, even on a very large image database. Using this conversion method, we develop a client-server system, based on previous partial denoising research, in a GUI (graphical user interface) environment. The client first converts a query image given by a user to a time-series and sends this time-series, together with the denoising parameters and the tolerance, to the server. The server identifies similar images from the index by evaluating a range query constructed from the client's inputs and returns the resulting images to the client. Experimental results show that our system provides highly intuitive and accurate matching results.
Keywords: boundary image matching, indexing, partial denoising, time-series matching
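One common way to convert a boundary image to a time-series is the centroid contour distance. The sketch below implements that conversion and a naive linear-scan range query; this is a hedged illustration — the paper's exact conversion and its index structure (which replaces the scan) may differ:

```python
import numpy as np

def boundary_to_series(points, n_samples=64):
    """Map a closed boundary (list of (x, y) points) to a fixed-length
    centroid-distance time-series."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    dist = np.linalg.norm(pts - centroid, axis=1)
    # Resample so every image yields a series of the same length.
    idx = np.linspace(0, len(dist) - 1, n_samples)
    return np.interp(idx, np.arange(len(dist)), dist)

def range_query(query, database, tolerance):
    """Return indices of stored series within Euclidean distance `tolerance`.
    (A real system would answer this from an index, not a linear scan.)"""
    return [i for i, s in enumerate(database)
            if np.linalg.norm(s - query) <= tolerance]

# Example: a unit circle yields a constant series; a scaled copy does not match.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
series = boundary_to_series(circle)
matches = range_query(series, [series, 2.0 * series], tolerance=0.5)
```

The fixed series length is what lets a multidimensional index answer the client's range query efficiently.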
Procedia PDF Downloads 137
18355 An Immersive Serious Game for Firefighting and Evacuation Training in Healthcare Facilities
Authors: Anass Rahouti, Guillaume Salze, Ruggiero Lovreglio, Sélim Datoussaïd
Abstract:
In healthcare facilities, training the staff for firefighting and evacuation in real buildings is very challenging due to the presence of a vulnerable population in such an environment. In a standard environment, traditional approaches, such as fire drills, are often used to train the occupants and provide them with information about fire safety procedures. However, those traditional approaches may be inappropriate for a vulnerable population and can be educationally inefficient, as it is impossible to expose the occupants to scenarios similar to a real emergency. Immersive serious games could be used as an alternative to traditional approaches to overcome these limitations. Serious games are already used in several safety domains, such as fires, earthquakes, and terror attacks, for several building types (e.g., office buildings, train stations, tunnels, etc.). In this study, we developed an immersive serious game to improve the fire safety skills of staff in healthcare facilities. An accurate representation of the healthcare environment was built in Unity3D, including visual and audio stimuli inspired by those employed in commercial action games. The serious game is organised in three levels. In each of them, the trainee is presented with a specific fire emergency; s/he can perform protective actions (e.g., firefighting, helping non-ambulant occupants, etc.) or ignore the opportunity for action and continue the evacuation. In this paper, we describe all the steps required to develop such a prototype, as well as the key questions that need to be answered to develop a serious game for firefighting and evacuation in healthcare facilities.
Keywords: fire safety, healthcare, serious game, training
Procedia PDF Downloads 452
18354 Open Circuit MPPT Control Implemented for PV Water Pumping System
Authors: Rabiaa Gammoudi, Najet Rebei, Othman Hasnaoui
Abstract:
Photovoltaic systems use different techniques for Maximum Power Point Tracking (MPPT) to deliver the highest possible power to the load regardless of variations in climatic conditions. In this paper, the proposed method is the Open Circuit (OC) method under sudden and random variations of insolation. The simulation results of the water pumping system controlled by the OC method are validated experimentally in real time on a test bench composed of a centrifugal pump powered by a photovoltaic generator (PVG) via a boost chopper that adapts the source to the load. The output of the DC/DC converter supplies a LOWARA motor pump through a DC/AC inverter. The control part is provided by a computer incorporating a DS1104 card running in the Matlab/Simulink environment for visualization and data acquisition. The results clearly show the effectiveness of our control, with very good performance, and demonstrate the usefulness of the developed algorithm in countering the degradation of PVG performance under varying climatic factors, with a very good yield.
Keywords: PV water pumping system (PVWPS), maximum power point tracking (MPPT), open circuit method (OC), boost converter, DC/AC inverter
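The open-circuit method is commonly implemented as the fractional open-circuit-voltage rule, V_mpp ≈ k·V_oc, with k typically between 0.71 and 0.78. The sketch below checks that rule against a toy I-V curve; the curve parameters and the value of k are illustrative assumptions, not the authors' measured values:

```python
import math

def pv_current(v, i_sc=8.0, v_oc=21.0):
    """Toy PV I-V curve (illustrative parameters, not a fitted module model)."""
    return max(0.0, i_sc * (1.0 - math.exp((v - v_oc) / 2.5)))

def oc_mppt_voltage(v_oc, k=0.76):
    """Fractional open-circuit-voltage rule: V_mpp ~ k * V_oc."""
    return k * v_oc

# Compare the OC estimate with the true power maximum of the toy curve.
v_grid = [i / 100 for i in range(0, 2101)]          # 0.00 .. 21.00 V
v_true = max(v_grid, key=lambda v: v * pv_current(v))
v_est = oc_mppt_voltage(21.0)
```

In the real system, V_oc is sampled by periodically disconnecting the PVG, and the boost chopper's duty cycle is then regulated so the operating voltage tracks the estimated V_mpp.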
Procedia PDF Downloads 454
18353 Reliability-Based Condition Assessment of Offshore Wind Turbines Using SHM Data
Authors: Caglayan Hizal, Hasan Emre Demirci, Engin Aktas, Alper Sezer
Abstract:
Offshore wind turbines consist of a long, slender tower with a heavy fixed mass (the nacelle) on top of the tower, together with a heavy rotating mass (the blades and hub). Throughout their service life, they are subjected to environmental loads, including wind and wave loads. This study presents a three-stage methodology for reliability-based condition assessment of offshore wind turbines against seismic, wave, and wind induced effects, considering soil-structure interaction. In this context, the failure criteria are taken as serviceability limits of a monopile supporting an offshore wind turbine: (a) the allowable horizontal displacement at the pile head should not exceed 0.2 m, and (b) the rotation at the pile head should not exceed 0.5°. A Bayesian system identification framework is adapted to the classical reliability analysis procedure. Using this framework, a reliability assessment can be applied directly to the updated finite element model without resorting to time-consuming methods. For numerical verification, simulation data from the finite element model of a real offshore wind turbine structure is investigated using the three-stage methodology.
Keywords: offshore wind turbines, SHM, reliability assessment, soil-structure interaction
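As a hedged illustration of how the two serviceability limits translate into a failure probability, the Monte Carlo sketch below samples hypothetical pile-head responses and checks them against the 0.2 m and 0.5° limits; the lognormal distribution parameters are illustrative assumptions, not the study's identified model:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
n = 200_000

# Hypothetical lognormal pile-head responses under combined loading
# (the distribution parameters are assumptions for illustration only).
disp = rng.lognormal(mean=np.log(0.08), sigma=0.4, size=n)  # horizontal disp., m
rot = rng.lognormal(mean=np.log(0.15), sigma=0.5, size=n)   # rotation, degrees

# Serviceability limits from the abstract: 0.2 m displacement, 0.5 deg rotation.
exceed = (disp > 0.2) | (rot > 0.5)
pf = exceed.mean()                  # probability of serviceability failure
beta = -NormalDist().inv_cdf(pf)    # generalized reliability index
```

In the proposed methodology, the response samples would instead come from the Bayesian-updated finite element model, so pf reflects the monitored condition of the structure.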
Procedia PDF Downloads 532
18352 Determination of Surface Deformations with Global Navigation Satellite System Time Series
Authors: Ibrahim Tiryakioglu, Mehmet Ali Ugur, Caglar Ozkaymak
Abstract:
The development of GNSS technology has led to increasingly widespread and successful applications of GNSS surveys for monitoring crustal movements. However, multi-period GPS survey solutions have not been applied to monitoring vertical surface deformation. This study uses the long-term GNSS time series that are required to determine vertical deformations. In recent years, surface deformations parallel and semi-parallel to the Bolvadin fault have occurred in Western Anatolia. These surface deformations have continued in the Bolvadin settlement area, which is located mostly on alluvial ground. Due to these surface deformations, a number of cracks in buildings in the residential areas and breaks in underground water and sewage systems have been observed. In order to determine the amount of vertical surface deformation, two continuous GNSS stations were established in the region; they have been operating since 2015 and 2017, respectively. In this study, the GNSS observations from these two stations were processed with the GAMIT/GLOBK (GNSS Analysis at Massachusetts Institute of Technology/GLOBal Kalman filter) program package to create coordinate time series. Through time-series analysis, the stations' behavior models (linear, periodic, etc.), the causes of these behaviors, and the corresponding mathematical models were determined. The results of the time-series analysis of these two GNSS stations show approximately 50-80 mm/yr of vertical movement.
Keywords: Bolvadin fault, GAMIT, GNSS time series, surface deformations
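A standard behavior model for such a coordinate time series is an offset plus a linear velocity plus annual and semi-annual harmonics, estimated by least squares. The sketch below fits that model to a synthetic height series; the -65 mm/yr trend, seasonal amplitude, and noise level are illustrative assumptions chosen to resemble the reported 50-80 mm/yr range, not the Bolvadin data:

```python
import numpy as np

def fit_gnss_series(t_years, heights_mm):
    """Least-squares fit of offset + velocity + annual/semi-annual harmonics."""
    w = 2.0 * np.pi  # one cycle per year
    A = np.column_stack([
        np.ones_like(t_years), t_years,
        np.sin(w * t_years), np.cos(w * t_years),
        np.sin(2 * w * t_years), np.cos(2 * w * t_years),
    ])
    coeffs, *_ = np.linalg.lstsq(A, heights_mm, rcond=None)
    return coeffs  # coeffs[1] is the vertical velocity (mm/yr)

# Synthetic daily series: -65 mm/yr subsidence plus a seasonal signal and noise.
rng = np.random.default_rng(1)
t = np.arange(0, 4, 1 / 365)
h = -65.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 2.0, t.size)
velocity = fit_gnss_series(t, h)[1]
```

The coefficient on the linear term is the quantity of interest here: the vertical velocity of the station, recovered even in the presence of the periodic terms.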
Procedia PDF Downloads 165
18351 Integrating Building Information Modeling into Facilities Management Operations
Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi
Abstract:
Facilities such as residential buildings, office buildings, and hospitals house a high density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building’s operational phase. For this purpose, a seven-storey office building was modeled in Autodesk Revit. The authors integrated the model with a cloud-based environment using a visual programming tool, Dynamo, to provide real-time cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building’s operation and maintenance costs by managing the building life cycle properly.
Keywords: building information modeling, facility management, operational phase, building life cycle
Procedia PDF Downloads 155
18350 Gender Specific Nature of the Fiction Conflict in Modern Feminine Prose
Authors: Baglan Bazylova
Abstract:
The purpose of our article is to consider the social and psychological conflicts in Lyudmila Petrushevskaya’s stories as an artistic presentation of the gender structure of modern society, and to reveal the originality of the characters’ inner world and the models of their behavior that express the gender-specific nature of modern feminine prose. Gender conflicts have taken the leading place in modern prose. L. Petrushevskaya represents different types of conflicts, including those shown in the images of real contradictions in the stories "Narratrix", "Thanks to Life", "Virgin's Case", and "Father and Mother". In Petrushevskaya’s prose, gender conflicts appear in two dimensions. The first is the love relations between a man and a woman: because of financial indigence, a woman cannot afford even to fall in love and arrange her family happiness. The second is the family conflict caused by male adultery. Petrushevskaya dwells in detail on the unmanifested conflict. In real life, such gender conflicts can take different forms, but for the writer it is important to show the conflict as a basis of life, hidden behind the externally safe facade of “family happiness”. In the stories of L. Petrushevskaya, the conflicts reflect the common character of the social and historical situations in which her heroines find themselves, situations in which a woman feels her opposition to the customary mode of life. The types of gender conflicts in these stories differ in the character of their verbal images; they are presented through verbal and event sequences that create the conflicts in action.
Keywords: gender behavior of heroes, gender conflict, gender picture of the world, gender structure
Procedia PDF Downloads 510
18349 Minimizing Total Completion Time in No-Wait Flowshops with Setup Times
Authors: Ali Allahverdi
Abstract:
The m-machine no-wait flowshop scheduling problem is addressed in this paper. The objective is to minimize total completion time, subject to the constraint that the makespan does not exceed a given value. Setup times are treated as separate from processing times. Several recent algorithms are adapted and proposed for the problem, and an extensive computational analysis has been conducted to evaluate them. The computational analysis indicates that the best of the proposed algorithms performs significantly better than the best existing algorithm.
Keywords: scheduling, no-wait flowshop, algorithm, setup times, total completion time, makespan
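Any heuristic for this problem must evaluate a sequence's total completion time; in a no-wait flowshop, the start offset between consecutive jobs is fixed by the requirement that each job flows through the machines without waiting. The sketch below computes completion times for a given sequence (for brevity it folds setup into processing times, so it does not model the separate setup times studied in the paper):

```python
def no_wait_completion_times(sequence, p):
    """Completion times of jobs in `sequence` for a no-wait flowshop.

    p[j] is the list of processing times of job j on machines 1..m.
    """
    m = len(p[sequence[0]])
    start, prev, completions = 0.0, None, []
    for j in sequence:
        if prev is not None:
            # Smallest start offset that lets job j proceed with no waiting
            # behind job prev on every machine.
            start += max(sum(p[prev][:k + 1]) - sum(p[j][:k]) for k in range(m))
        completions.append(start + sum(p[j]))
        prev = j
    return completions

# Two jobs on two machines: job B must start 2 time units after job A.
p = {"A": [2, 3], "B": [4, 1]}
times = no_wait_completion_times(["A", "B"], p)
total_completion_time = sum(times)
```

The makespan constraint of the paper would be enforced on `times[-1]`, while the objective sums all entries of `times`.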
Procedia PDF Downloads 340
18348 In vivo Determination of Anticoagulant Property of the Tentacle Extract of Aurelia aurita (Moon Jellyfish) Using Sprague-Dawley Rats
Authors: Bea Carmel H. Casiding, Charmaine A. Guy, Funny Jovis P. Malasan, Katrina Chelsea B. Manlutac, Danielle Ann N. Novilla, Marianne R. Oliveros, Magnolia C. Sibulo
Abstract:
Moon jellyfish, Aurelia aurita, has become a popular research organism for diverse studies. Recent studies have verified the anticoagulant properties of moon jellyfish tentacle extract through in vitro methods. The purpose of this study was to validate the anticoagulant activity of A. aurita tentacle extract using an in vivo method. The tentacles of A. aurita were excised, filtered, and centrifuged at 3000×g for 10 minutes. The crude nematocyst extract was suspended in a 1:6 ratio in phosphate buffer solution and sonicated for three periods of 20 seconds each at 50 Hz. The protein concentration of the extract was determined using the Bradford assay, with bovine serum albumin as the standard solution at the following concentrations: 35.0, 70.0, 105.0, 140.0, 175.0, 210.0, 245.0, and 280.0 µg/mL; the absorbance was read at 595 nm. Toxicity testing was adapted from OECD guidelines. The extract, suspended in phosphate-buffered saline solution, was arbitrarily set at three doses (0.1 mg/kg, 0.3 mg/kg, 0.5 mg/kg) and administered daily for five days to experimental groups of five male Sprague-Dawley rats (one dose per group). Before and after the administration period, bleeding time and clotting time tests were performed. One-way analysis of variance (ANOVA) was used to analyze the differences in bleeding and clotting times, before and after administration, among the three treatment groups and the positive and negative control groups. The average protein concentration of the sonicated crude tentacle extract was 206.5 µg/mL. The highest dose administered (0.5 mg/kg) produced a significant increase in time for both the bleeding and clotting tests, whereas the next lower dose (0.3 mg/kg) was effective only in the clotting time test. The protein contained in the A. aurita tentacle extract, at a concentration of 206.5 µg/mL and doses of 0.3 mg/kg and 0.5 mg/kg, elicited anticoagulant activity.
Keywords: anticoagulant, bleeding time test, clotting time test, moon jellyfish
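The protein concentration in a Bradford assay is read off a BSA standard curve. As a hedged sketch, the code below fits a straight line to the standard concentrations listed in the abstract (the A595 readings here are synthetic stand-ins, not the study's measurements) and inverts it for an unknown sample:

```python
import numpy as np

# BSA standard concentrations from the abstract (ug/mL).
bsa_conc = np.array([35.0, 70.0, 105.0, 140.0, 175.0, 210.0, 245.0, 280.0])

# Stand-in A595 readings: an assumed linear response, for illustration only.
absorbance = 0.002 * bsa_conc + 0.05

# Fit the standard curve A595 = slope * conc + intercept.
slope, intercept = np.polyfit(bsa_conc, absorbance, 1)

def concentration_from_a595(a595):
    """Invert the standard curve to get a sample concentration in ug/mL."""
    return (a595 - intercept) / slope
```

With real plate-reader data, the fit would carry measurement noise, and replicate readings of the unknown would be averaged before inverting the curve.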
Procedia PDF Downloads 397
18347 Multisource (RF and Solar) Energy Harvesting for Internet of Things (IoT)
Authors: Emmanuel Ekwueme, Anwar Ali
Abstract:
As the Internet of Things (IoT) continues to expand, the demand for battery-free devices is increasing; such devices are crucial for the efficiency of 5G networks and eco-friendly industrial systems. The goal is a device that operates indefinitely, requires no maintenance, and has no negative impact on the ambient environment. One promising approach is energy harvesting, which captures energy from the ambient environment and uses it to power devices. This method can revolutionize industries such as manufacturing, agriculture, and healthcare by enabling real-time data collection and analysis, reducing maintenance costs, improving efficiency, and contributing to a future with lower carbon emissions. This research explores various energy harvesting techniques, focusing on radio frequency (RF) and multiple energy sources. It examines RF-based and solar methods for powering battery-free sensors, low-power circuits, and IoT devices, and investigates a hybrid RF-solar harvesting circuit designed for remote sensing devices. The proposed system includes distinct RF and solar energy harvester circuits, with the RF harvester operating at 2.45 GHz and the solar harvester utilizing a maximum power point tracking (MPPT) algorithm to maximize efficiency.
Keywords: radio frequency, energy harvesting, Internet of Things (IoT), multisource, solar energy
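The power an RF harvester can capture at 2.45 GHz is bounded by the free-space link budget (the Friis equation). The sketch below computes the received power for an assumed transmitter power, antenna gains, and distance; the numbers in the example are illustrative, not measurements from the study:

```python
import math

def friis_received_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, dist_m):
    """Received power (dBm) under the free-space Friis link budget."""
    wavelength = 3.0e8 / freq_hz
    path_loss_db = 20.0 * math.log10(4.0 * math.pi * dist_m / wavelength)
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - path_loss_db

# Example: a 1 W (30 dBm) source, unity-gain antennas, 1 m away at 2.45 GHz.
p_rx_dbm = friis_received_power_dbm(30.0, 0.0, 0.0, 2.45e9, 1.0)
p_rx_mw = 10 ** (p_rx_dbm / 10)  # power available before rectifier losses
```

The sub-milliwatt result is why RF harvesting is usually paired with a second source, such as the solar harvester in the proposed hybrid circuit.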
Procedia PDF Downloads 10
18346 Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method
Authors: Amira Mabrouk, Chokri Abdennadher
Abstract:
The reduction of the road congestion inherent in vehicle use is an obvious priority for public authorities. Assessing an individual's willingness to pay to save trip time is therefore akin to estimating the price change that would result from a new transport policy designed to increase network fluidity and improve social welfare. This study takes an innovative perspective: it undertakes an economic calculation whose objective is to estimate the monetized value of time for trips made in Sfax. The research pursues three aims: i) to estimate the monetized value of an hour devoted to trips, ii) to determine whether or not consumers consider the environmental variables significant, and iii) to analyze the impact of public congestion management through city-toll taxation on urban dwellers. The article is built upon a rich field survey conducted in the city of Sfax: using the contingent valuation method, we analyze the stated time preferences of 450 drivers during rush hours. Giving due consideration to the biases attributed to the applied method, we highlight the delicacy of this approach with regard to the revelation mode and the questioning techniques, following the NOAA panel recommendations, with the exception of the valuation point, and drawing on similar studies on the estimation of transportation externalities.
Keywords: willingness to pay, contingent valuation, time value, city toll
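In an open-ended contingent valuation survey, the value of time is typically summarized as the mean stated willingness to pay per hour saved, with a confidence interval. The sketch below computes these from a small hypothetical sample; the responses are illustrative, not the Sfax survey data:

```python
import statistics

# Hypothetical stated WTP responses (currency units per hour of trip time saved);
# these values are illustrative stand-ins, not the 450-driver survey data.
wtp = [0.0, 0.5, 1.0, 1.0, 1.5, 2.0, 2.0, 2.5, 3.0, 4.0]

mean_wtp = statistics.mean(wtp)                    # monetized value of an hour
se = statistics.stdev(wtp) / len(wtp) ** 0.5       # standard error of the mean
ci95 = (mean_wtp - 1.96 * se, mean_wtp + 1.96 * se)
```

Multiplying the mean WTP by the aggregate hours lost to congestion then yields the kind of time-loss cost estimate the study's title refers to.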
Procedia PDF Downloads 434