Search results for: real estate prediction
3077 Metaverse in Future Personal Healthcare Industry: From Telemedicine to Telepresence
Authors: Mohammed Saeed Jawad
Abstract:
The metaverse involves the convergence of three major technology trends: AI, VR, and AR. Together, these three technologies can provide an entirely new channel for delivering healthcare, with great potential to lower costs and improve patient outcomes on a larger scale. Telepresence is the technology that allows people to be together even when they are physically apart. Medical doctors can be represented as interactive avatars developed to hold smart conversations and give medical recommendations to patients at different stages of treatment. Medical digital assets, such as medical IoT devices for real-time remote healthcare monitoring, the symbolic doctors' avatars, and the hospital and clinical physical constructions and layouts, can be immersed in extended-reality 3D metaverse environments. There, doctors, nurses, and patients can interact and socialize with the related digital assets, which facilitate the analytics of the sensed and collected personal medical data through visualized interaction with the digital twin of the patient's body, the medical doctors' smart conversation and consultation, or even a guided remote-surgery operation.
Keywords: personal healthcare, metaverse, telemedicine, telepresence, avatar, medical consultation, remote-surgery
Procedia PDF Downloads 135
3076 Development and Metrological Validation of a Control Strategy in Embedded Island Grids Using Battery-Hybrid-Systems
Authors: L. Wilkening, G. Ackermann, T. T. Do
Abstract:
This article presents an approach for the stand-alone and grid-connected operation of a German low-voltage grid with a high share of photovoltaics. For this purpose, suitable dynamic system models have been developed. They allow the simulation of dynamic events over very small time ranges as well as of operation management over longer periods of time. Using these simulations, suitable control parameters could be identified, and their effects on the grid can be analyzed. In order to validate the simulation results, an LV-grid test bench has been implemented at the University of Technology Hamburg. The developed control strategies are to be validated using real inverters, generators, and different realistic loads. It is shown that a battery-hybrid system installed next to a voltage transformer makes it possible to operate the LV-grid in stand-alone mode without using additional information and communication technology and without intervening in the existing grid units. By simulating critical days of the year, suitable control parameters for stable stand-alone operation are determined, and set-point specifications for different control strategies are defined.
Keywords: battery, e-mobility, photovoltaic, smart grid
Procedia PDF Downloads 143
3075 A Near-Optimal Domain Independent Approach for Detecting Approximate Duplicates
Authors: Abdelaziz Fellah, Allaoua Maamir
Abstract:
We propose a domain-independent merging-cluster filter approach, complemented with a set of algorithms, for identifying approximate duplicate entities efficiently and accurately within a single data source and across multiple data sources. The near-optimal merging-cluster filter (MCF) approach is based on the well-tuned Monge-Elkan algorithm and extended with an affine variant of the Smith-Waterman similarity measure. We then present constant, variable, and function threshold algorithms that work conceptually in a divide-merge filtering fashion for detecting near duplicates as hierarchical clusters along with their corresponding representatives. The algorithms take recursive refinement approaches in the spirit of filtering, merging, and updating cluster representatives to detect approximate duplicates at each level of the cluster tree. Experiments show the high effectiveness and accuracy of the MCF approach in detecting approximate duplicates, outperforming the seminal Monge-Elkan algorithm on several real-world benchmarks and generated datasets.
Keywords: data mining, data cleaning, approximate duplicates, near-duplicates detection, data mining applications and discovery
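The token-level Monge-Elkan similarity at the core of such matchers can be sketched as follows. This is a minimal illustration only: the inner measure here is Python's `difflib.SequenceMatcher` ratio standing in for the affine Smith-Waterman variant used in the paper, and the tokenized names are invented examples.

```python
from difflib import SequenceMatcher

def monge_elkan(tokens_a, tokens_b, inner=None):
    """Monge-Elkan similarity: average, over tokens of A, of the best
    match score against the tokens of B. The inner measure defaults to
    SequenceMatcher's ratio (a stand-in for affine Smith-Waterman)."""
    if inner is None:
        inner = lambda a, b: SequenceMatcher(None, a, b).ratio()
    if not tokens_a:
        return 0.0
    return sum(max(inner(a, b) for b in tokens_b) for a in tokens_a) / len(tokens_a)
```

Note that the measure is asymmetric (it averages over the first argument's tokens), which is why duplicate-detection pipelines often take the maximum or mean of both directions.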
Procedia PDF Downloads 387
3074 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks
Authors: Ather Saeed, Arif Khan, Jeffrey Gosper
Abstract:
Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real-time, hazardous, large-scale harsh environments and in medical emergencies. Therefore, the loss of data can be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green’s and Maxwell’s equations used in electrostatics and electromagnetism. We introduce a node organization and clustering scheme for self-stabilizing sensor networks. Green’s theorem is applied to regions where the curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green’s Theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs where urgent intervention is required for dynamically self-stabilizing the network.
Keywords: Green’s Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering
Procedia PDF Downloads 75
3073 Laminar Separation Bubble Prediction over an Airfoil Using Transition SST Turbulence Model on Moderate Reynolds Number
Authors: Younes El Khchine, Mohammed Sriti
Abstract:
A parametric study has been conducted to analyse the flow around the S809 wind-turbine airfoil in order to better understand the characteristics and effects of the laminar separation bubble (LSB) on aerodynamic design for maximizing wind turbine efficiency. Numerical simulations were performed at low Reynolds numbers by solving the Unsteady Reynolds-Averaged Navier-Stokes (URANS) equations on a C-type structured mesh using the γ-Reθt transition turbulence model. A two-dimensional study was conducted for a chord Reynolds number of 1×10⁵ and angles of attack (AoA) between 0 and 20.15 degrees. The simulation results obtained for the aerodynamic coefficients at various AoA were compared with XFoil results. A sensitivity study was performed to examine the effects of Reynolds number and free-stream turbulence intensity on the location and length of the laminar separation bubble and on the aerodynamic performance of wind turbines. The results show that increasing the Reynolds number delays laminar separation on the upper surface of the airfoil. It also accelerates the transition process, and the turbulent reattachment point moves closer to the leading edge owing to an earlier reattachment of the turbulent shear layer. This leads to a considerable reduction in the length of the separation bubble as the Reynolds number is increased. Increasing the level of free-stream turbulence intensity decreases the separation bubble length and increases the lift coefficient while having a negligible effect on the stall angle. As the AoA increased, the bubble on the suction surface was found to move upstream toward the leading edge of the airfoil, which causes earlier laminar separation.
Keywords: laminar separation bubble, turbulence intensity, S809 airfoil, transition model, Reynolds number
Procedia PDF Downloads 83
3072 Enhancing Early Detection of Coronary Heart Disease Through Cloud-Based AI and Novel Simulation Techniques
Authors: Md. Abu Sufian, Robiqul Islam, Imam Hossain Shajid, Mahesh Hanumanthu, Jarasree Varadarajan, Md. Sipon Miah, Mingbo Niu
Abstract:
Coronary Heart Disease (CHD) remains a principal cause of global morbidity and mortality, characterized by atherosclerosis, the build-up of fatty deposits inside the arteries. This study introduces an innovative methodology that leverages cloud-based platforms such as AWS live streaming and Artificial Intelligence (AI) for the early detection and prevention of CHD symptoms in web applications. By employing novel simulation processes and AI algorithms, this research aims to significantly mitigate the health and societal impacts of CHD. Methodology: This study introduces a novel simulation process alongside a multi-phased model development strategy. Initially, health-related data, including heart rate variability, blood pressure, lipid profiles, and ECG readings, were collected through user interactions with web-based applications as well as API integration. The novel simulation process involved creating synthetic datasets that mimic early-stage CHD symptoms, allowing for the refinement and training of AI algorithms under controlled conditions without compromising patient privacy. AWS live streaming was utilized to capture real-time health data, which were then processed and analysed using advanced AI techniques. The novel aspect of our methodology lies in the simulation of CHD symptom progression, which provides a dynamic training environment for our AI models, enhancing their predictive accuracy and robustness. Model Development: We developed a machine learning model trained on both real and simulated datasets, incorporating a variety of algorithms, including neural networks and ensemble learning models, to identify early signs of CHD. The model's continuous learning mechanism allows it to evolve, adapting to new data inputs and improving its predictive performance over time. Results and Findings: The deployment of our model yielded promising results. In the validation phase, it achieved an accuracy of 92% in predicting early CHD symptoms, surpassing existing models.
The precision and recall metrics stood at 89% and 91%, respectively, indicating a high level of reliability in identifying at-risk individuals. These results underscore the effectiveness of combining live data streaming with AI in the early detection of CHD. Societal Implications: The implementation of cloud-based AI for CHD symptom detection represents a significant step forward in preventive healthcare. By facilitating early intervention, this approach has the potential to reduce the incidence of CHD-related complications, decrease healthcare costs, and improve patient outcomes. Moreover, the accessibility and scalability of cloud-based solutions democratize advanced health monitoring, making it available to a broader population. This study illustrates the transformative potential of integrating technology and healthcare, setting a new standard for the early detection and management of chronic diseases.
Keywords: coronary heart disease, cloud-based ai, machine learning, novel simulation techniques, early detection, preventive healthcare
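The three reported figures (accuracy, precision, recall) are standard confusion-matrix ratios; the sketch below shows how they are computed. The counts are illustrative stand-ins, not the paper's validation data.

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, precision and recall from confusion-matrix counts:
    tp = true positives, fp = false positives,
    fn = false negatives, tn = true negatives."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total      # fraction of all cases classified correctly
    precision = tp / (tp + fp)        # of flagged patients, how many were truly at risk
    recall = tp / (tp + fn)           # of truly at-risk patients, how many were flagged
    return accuracy, precision, recall
```

For example, counts of (tp=91, fp=11, fn=9, tn=89) over 200 cases yield roughly the accuracy/precision/recall profile the abstract reports.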
Procedia PDF Downloads 65
3071 Mineralized Nanoparticles as a Contrast Agent for Ultrasound and Magnetic Resonance Imaging
Authors: Jae Won Lee, Kyung Hyun Min, Hong Jae Lee, Sang Cheon Lee
Abstract:
To date, imaging techniques have attracted much attention in medicine because the detection of diseases at an early stage provides greater opportunities for successful treatment. Consequently, over the past few decades, diverse imaging modalities, including magnetic resonance (MR), positron emission tomography, computed tomography, and ultrasound (US), have been developed and applied widely in the field of clinical diagnosis. However, each of the above-mentioned imaging modalities possesses unique strengths and intrinsic weaknesses, which limit their abilities to provide accurate information. Therefore, multimodal imaging systems may be a solution that can provide improved diagnostic performance. Among the current medical imaging modalities, US is a widely available real-time imaging modality. It has many advantages, including safety, low cost and easy access for patients. However, its low spatial resolution precludes accurate discrimination of diseased regions such as cancer sites. In contrast, MR has no tissue-penetration limit and can provide images possessing exquisite soft-tissue contrast and high spatial resolution. However, it cannot offer real-time images and needs a comparatively long imaging time. The characteristics of these imaging modalities may be considered complementary, and the modalities have been frequently combined for the clinical diagnostic process. Biominerals such as calcium carbonate (CaCO3) and calcium phosphate (CaP) exhibit pH-dependent dissolution behavior. They demonstrate pH-controlled drug release due to the dissolution of minerals in acidic pH conditions. In particular, the application of this mineralization technique to a US contrast agent has been reported recently. The CaCO3 mineral reacts with acids and decomposes to generate carbon dioxide (CO2) gas in an acidic environment.
These gas-generating mineralized nanoparticles generated CO2 bubbles in the acidic environment of the tumor, thereby allowing for strong echogenic US imaging of tumor tissues. On the basis of this previous work, it was hypothesized that loading MR contrast agents into the CaCO3 mineralized nanoparticles may be a novel strategy for designing a contrast agent for dual imaging. Herein, CaCO3 mineralized nanoparticles capable of generating CO2 bubbles to trigger the release of entrapped MR contrast agents in response to tumoral acidic pH were developed for the purposes of US and MR dual-modality imaging of tumors. Gd2O3 nanoparticles were selected as an MR contrast agent. A key strategy employed in this study was to prepare Gd2O3 nanoparticle-loaded mineralized nanoparticles (Gd2O3-MNPs) using block copolymer-templated CaCO3 mineralization in the presence of calcium cations (Ca2+), carbonate anions (CO32-) and positively charged Gd2O3 nanoparticles. The CaCO3 core was considered suitable because it may effectively shield Gd2O3 nanoparticles from water molecules in the blood (pH 7.4) before decomposing to generate CO2 gas, triggering the release of Gd2O3 nanoparticles in tumor tissues (pH 6.4~7.4). The kinetics of CaCO3 dissolution and CO2 generation from the Gd2O3-MNPs were examined as a function of pH, along with the pH-dependent in vitro magnetic relaxation; additionally, the echogenic properties were estimated to demonstrate the potential of the particles for tumor-specific US and MR imaging.
Keywords: calcium carbonate, mineralization, ultrasound imaging, magnetic resonance imaging
Procedia PDF Downloads 236
3070 Wind Interference Effects on Various Plan Shape Buildings Under Wind Load
Authors: Ritu Raj, Hrishikesh Dubey
Abstract:
This paper presents the results of experimental investigations carried out on two intricately plan-shaped buildings to evaluate their aerodynamic performance. The purpose is to study the wind environment arising due to wind forces in isolated and interference conditions on models of scale 1:300, with a prototype height of 180 m. Experimental tests were carried out in a boundary layer wind tunnel considering isolated conditions with wind directions from 0° to 180° and four interference conditions of twin buildings (separately for both models). The research has been undertaken in Terrain Category-II, which is the most widely available terrain in India. A comparative assessment of the two models is carried out in an attempt to comprehend the various consequences of diverse conditions that may emerge in real-life situations, as well as the discrepancies among them. The experimental wind pressure coefficients of Model-1 and Model-2 show good agreement across the various wind incidence conditions, with minute differences in the magnitudes of mean Cp. On the basis of the wind tunnel studies, it is concluded that the performance of Model-2 is better than that of Model-1 in both isolated and interference conditions for all wind incidences and orientations.
Keywords: interference factor, tall buildings, wind direction, mean pressure-coefficients
Procedia PDF Downloads 128
3069 Application of WebGIS-Based Water Environment Capacity Inquiry and Planning System in Water Resources Management
Authors: Tao Ding, Danjia Yan, Jinye Li, Chao Ren, Xinhua Hu
Abstract:
This paper is set against the background of the current water shortage in China and the need for intelligent water resources management in the information era. It adopts WebGIS technology, combined with mathematical models of water resources management, to develop a WebGIS-based water environment capacity inquiry and polluted water emission planning system. The significance of this research is that the system can query the water environment capacity of Jinhua City in real time and plan how to discharge polluted water into the river, so as to realize the effective management of water resources. The system makes sewage planning more convenient and faster. For a discharging enterprise, a decision on the optimal location of the sewage outlet can be reached through calculation with the in-river sewage discharge planning model, without the need for site visits. The system achieves effective management of water resources and has great application value.
Keywords: sewerage planning, water environment capacity, water resources management, WebGIS
Procedia PDF Downloads 183
3068 Dynamic Log Parsing and Intelligent Anomaly Detection Method Combining Retrieval Augmented Generation and Prompt Engineering
Authors: Liu Linxin
Abstract:
As system complexity increases, log parsing and anomaly detection become more and more important for ensuring system stability. However, traditional methods often face problems of insufficient adaptability and decreasing accuracy when dealing with rapidly changing log contents and unknown domains. To this end, this paper proposes LogRAG, an approach that combines Retrieval-Augmented Generation (RAG) with prompt engineering for large language models, applied to log analysis tasks to achieve dynamic parsing of logs and intelligent anomaly detection. By combining real-time information retrieval and prompt optimisation, this study significantly improves the adaptive capability of log analysis and the interpretability of results. Experimental results show that the method performs well on several public datasets, especially in the absence of training data, and significantly outperforms traditional methods. This paper provides a technical path for log parsing and anomaly detection, demonstrating significant theoretical value and application potential.
Keywords: log parsing, anomaly detection, retrieval-augmented generation, prompt engineering, LLMs
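A first step in any log parsing pipeline is reducing raw lines to templates by masking their variable fields. LogRAG delegates this to an LLM with retrieved context, but the basic idea can be illustrated with a purely regex-based sketch; the masking patterns and placeholder names below are illustrative assumptions, not part of the paper's method.

```python
import re

def to_template(line):
    """Reduce a raw log line to a template by masking common variable
    fields (IPv4 addresses, hex identifiers, plain numbers). A crude
    stand-in for the dynamic parsing an LLM-based pipeline performs."""
    line = re.sub(r'\b\d{1,3}(\.\d{1,3}){3}\b', '<IP>', line)   # IPv4 first
    line = re.sub(r'\b0x[0-9a-fA-F]+\b', '<HEX>', line)          # hex ids
    line = re.sub(r'\b\d+\b', '<NUM>', line)                     # remaining numbers
    return line
```

Grouping lines by their template then turns anomaly detection into counting and comparing template frequencies, which is what more elaborate parsers refine.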
Procedia PDF Downloads 29
3067 Anomaly Detection with ANN and SVM for Telemedicine Networks
Authors: Edward Guillén, Jeisson Sánchez, Carlos Omar Ramos
Abstract:
In recent years, a wide variety of applications have been developed with Support Vector Machine (SVM) methods and Artificial Neural Networks (ANN). In general, these methods depend on intrusion knowledge databases such as KDD99, ISCX, and CAIDA, among others. New classes of detectors are generated by machine learning techniques, trained and tested on network databases. Thereafter, the detectors are employed to detect anomalies in network communication scenarios according to users' connection behavior. The first detector, based on the training dataset, is deployed in different real-world networks with mobile and non-mobile devices to analyze the performance and accuracy of static detection. The vulnerabilities are based on previous work on telemedicine apps that were developed by the research group. This paper presents the differences in detection results between several network scenarios obtained by applying traditional detectors deployed with artificial neural networks and support vector machines.
Keywords: anomaly detection, back-propagation neural networks, network intrusion detection systems, support vector machines
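As a toy illustration of the ANN side of such detectors, a single-neuron perceptron can be trained to separate normal from anomalous connections. The two features and all data points below are synthetic stand-ins, not drawn from KDD99 or the paper's telemedicine traffic.

```python
def train_perceptron(data, labels, epochs=50, lr=0.1):
    """Train a single-neuron classifier (the simplest ANN) with the
    perceptron rule. data: list of feature vectors; labels: 0 = normal
    traffic, 1 = anomaly. Returns learned weights and bias."""
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = y - (1 if z > 0 else 0)
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Flag x as anomalous (1) or normal (0)."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

Real intrusion detectors replace this single neuron with multi-layer back-propagation networks or SVMs, but the train-then-deploy split the abstract describes is the same.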
Procedia PDF Downloads 358
3066 Wireless Sensor Networks for Water Quality Monitoring: Prototype Design
Authors: Cesar Eduardo Hernández Curiel, Victor Hugo Benítez Baltazar, Jesús Horacio Pacheco Ramírez
Abstract:
This paper presents the advances in the design of a prototype that is able to supervise the complex behavior of water quality parameters, such as pH and temperature, via a real-time monitoring system. The current water quality tests performed by government water quality institutions in Mexico are carried out in problematic locations, and they require taking manual samples. The water samples are then taken to the institution's laboratory for examination. In order to automate this process, a water quality monitoring system based on wireless sensor networks is proposed. The system consists of a sensor node, which contains one pH sensor, one temperature sensor, a microcontroller, and a ZigBee radio, and a base station composed of a ZigBee radio and a PC. The progress in this investigation shows the development of a water quality monitoring system. Due to recent events that affected water quality in Mexico, the main motivation of this study is to address water quality monitoring systems so that, in the near future, a more robust, affordable, and reliable system can be deployed.
Keywords: pH measurement, water quality monitoring, wireless sensor networks, ZigBee
Procedia PDF Downloads 404
3065 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis
Authors: Amir Hajian, Sepehr Damavandinejadmonfared
Abstract:
In this paper, the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel Principal Component Analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions that can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector, which is of importance especially when it comes to real-world applications and usage of such algorithms. A fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and extract the features from them. A classifier is then applied to classify the data and make the final decision. We analyze KPCA (polynomial, Gaussian, and Laplacian kernels) in detail in this paper and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
Keywords: biometrics, finger vein recognition, principal component analysis (PCA), kernel principal component analysis (KPCA)
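A compact sketch of KPCA with a Gaussian (RBF) kernel shows where the fixed feature dimension enters, namely the `n_components` argument. The kernel width `gamma` and the toy data in the test are illustrative assumptions, not settings tuned for finger vein images.

```python
import numpy as np

def kpca(X, n_components, gamma=1.0):
    """Kernel PCA with a Gaussian kernel: build the kernel matrix,
    double-centre it in feature space, eigendecompose, and project the
    training samples onto the top n_components eigenvectors."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one       # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)                  # eigenvalues ascending
    idx = np.argsort(vals)[::-1][:n_components]      # keep the largest ones
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                               # projected samples
```

Swapping the Gaussian kernel line for a polynomial or Laplacian kernel gives the other two variants the paper compares, with everything after the kernel matrix unchanged.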
Procedia PDF Downloads 365
3064 Shear Strength and Consolidation Behavior of Clayey Soil with Vertical and Radial Drainage
Authors: R. Pillai Aparna, S. R. Gandhi
Abstract:
Soft clay deposits with low strength and high compressibility are found all over the world. Preloading with vertical drains is a widely used method for improving such soils. The coefficient of consolidation, irrespective of the drainage type, plays an important role in the design of vertical drains, and it controls accurate prediction of the rate of consolidation of soil. The increase in the shear strength of soil with consolidation is another important factor considered in preloading or staged construction. To the best of our knowledge, no clear guidelines are available for estimating the increase in shear strength for a particular degree of consolidation (U) at the various stages of construction. Various methods are available for finding the consolidation coefficient. This study mainly focuses on the variation of the consolidation coefficient, which was found using different methods, and of the shear strength with pressure intensity. The variation of shear strength with the degree of consolidation was also studied. Consolidation tests were done using two types of highly compressible clays with vertical, radial and, in a few cases, combined drainage. The tests were carried out at different pressure intensities, and for each pressure intensity, once the target degree of consolidation was achieved, vane shear tests were done at different locations in the sample in order to determine the shear strength. The shear strength of clayey soils under the application of vertical stress with vertical and radial drainage, with target U values of 70% and 90%, was studied. It was found that there is not much variation in the cv or cr value beyond a pressure intensity of 80 kPa. Correlations were developed between the shear strength ratio and the consolidation pressure based on laboratory testing under controlled conditions. It was observed that the shear strength of a sample with a target U value of 90% is about 1.4 to 2 times that of a 70% consolidated sample.
Settlement analysis was done using Asaoka's and the hyperbolic methods. The variation of strength with respect to the depth of the sample was also studied using a large-scale consolidation test. Based on the present study, it was found that the gain in strength is greater in the top half of the clay layer, and that the shear strength of the sample with radial drainage is slightly higher than that with vertical drainage.
Keywords: consolidation coefficient, degree of consolidation, PVDs, shear strength
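The target degrees of consolidation used in these tests (U = 70% and 90%) map to a dimensionless time factor through Terzaghi's one-dimensional theory, which is how cv enters settlement-rate prediction. The sketch below uses the standard curve-fitting approximations for vertical drainage only; radial drainage around vertical drains follows Barron's solution, which is not shown.

```python
import math

def time_factor(U):
    """Terzaghi time factor Tv for an average degree of consolidation
    U (0 < U < 1), vertical drainage, via the standard approximations:
    Tv = (pi/4) U^2 for U <= 60%, log fit above that."""
    if U <= 0.6:
        return math.pi / 4 * U**2
    return 1.781 - 0.933 * math.log10(100 * (1 - U))

def time_to_consolidate(U, cv, H):
    """Time to reach U: t = Tv * H^2 / cv, where H is the drainage
    path length and cv the coefficient of consolidation."""
    return time_factor(U) * H**2 / cv
```

For example, reaching U = 90% (Tv about 0.848) takes roughly 4.3 times as long as reaching U = 70% (Tv about 0.403) for the same cv and drainage path.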
Procedia PDF Downloads 239
3063 Evidence-Based Investigation of the Phonology of Nigerian Instant Messaging
Authors: Emmanuel Uba, Lily Chimuanya, Maryam Tar
Abstract:
Orthographic engineering is no longer the preserve of the Short Messaging Service (SMS), which is characterised by limited space. Such stylistic creativity or deviation is fast creeping into real-time messaging, popularly known as Instant Messaging (IM), despite the large number of characters allowed. This occurs at various linguistic levels: phonology, morphology, syntax, etc. Nigerians are not immune to this linguistic stylisation. This study investigates the phonological and meta-phonological conventions of messages sent and received via WhatsApp by Nigerian graduates. It is an ontological study of 250 instant messages collected from 98 graduates from different ethnic groups in Nigeria. The selection and analysis of the messages are based on the figure and ground principle. The results reveal the use of accent stylisation, phoneme substitution, blending, consonantisation (a specialised form of deletion targeting vowels), numerophony (using a figure/number, usually 1-10, to represent a word or syllable that has the same sound) and phonetic respelling in the IMs sent by Nigerians. The study confirms the existence of linguistic creativity.
Keywords: figure and ground principle, instant messaging, linguistic stylisation, meta-phonology
Procedia PDF Downloads 397
3062 A Nonlinear Dynamical System with Application
Authors: Abdullah Eqal Al Mazrooei
Abstract:
In this paper, a nonlinear dynamical system is presented. This system belongs to the bilinear class. Bilinear systems are a very important kind of nonlinear system because they have many applications in real life. They are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate. They have also recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages. First, they define many problems of great applied importance. Second, they give us approximations to nonlinear systems. Third, they have rich geometric and algebraic structures, which promise to be a fruitful field of research for scientists and applications. The type of nonlinearity that is treated and analyzed consists of a bilinear interaction between the state vector and the system input. By using some properties of the tensor product, such systems can be transformed into linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself. This model will therefore be able to handle evolutions according to the Lotka-Volterra models or the Lorenz weather models, thus enabling a wider and more flexible application of such models. We apply the model by using an estimator to estimate temperatures. The results prove the efficiency of the proposed system.
Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model
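The state-times-state nonlinearity the abstract describes is exactly what appears in the Lotka-Volterra predator-prey equations, where each rate contains a product of two state variables. A forward-Euler sketch is given below; the parameter values in the usage are illustrative, and this is a simulation only, not the paper's estimator.

```python
def lotka_volterra(x0, y0, a, b, c, d, dt=0.001, steps=10000):
    """Forward-Euler integration of the Lotka-Volterra model:
        dx/dt =  a*x - b*x*y   (prey)
        dy/dt = -c*y + d*x*y   (predator)
    The x*y products are the state-times-state nonlinearity."""
    x, y = x0, y0
    for _ in range(steps):
        x, y = (x + dt * (a * x - b * x * y),
                y + dt * (-c * y + d * x * y))
    return x, y
```

Starting at the equilibrium (x = c/d, y = a/b) both derivatives vanish and the state stays put; any other positive start produces the familiar oscillations.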
Procedia PDF Downloads 254
3061 MLOps Scaling Machine Learning Lifecycle in an Industrial Setting
Authors: Yizhen Zhao, Adam S. Z. Belloum, Goncalo Maia Da Costa, Zhiming Zhao
Abstract:
Machine learning has evolved from an area of academic research into a real-world applied field. This change comes with challenges: gaps and differences exist between common practices in academic environments and those in production environments. Following the continuous integration, development, and delivery practices of software engineering, similar trends have emerged for machine learning (ML) systems, called MLOps. In this paper, we propose a framework that helps to streamline and introduce best practices that facilitate the ML lifecycle in an industrial setting. This framework can be used as a template that can be customized to implement various machine learning experiments. The proposed framework is modular and can be recomposed to adapt to various use cases (e.g., data versioning, remote training on cloud). The framework inherits practices from DevOps and introduces other practices that are unique to machine learning systems (e.g., data versioning). Our MLOps practices automate the entire machine learning lifecycle and bridge the gap between development and operation.
Keywords: cloud computing, continuous development, data versioning, DevOps, industrial setting, MLOps
Procedia PDF Downloads 265
3060 Agile Software Effort Estimation Using Regression Techniques
Authors: Mikiyas Adugna
Abstract:
Effort estimation is among the activities carried out in software development processes, and an accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still conducting studies on it to enhance prediction accuracy. For these reasons, we investigated and propose a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has four major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed on the dataset, with 80% for training and 20% for testing. Following the train-test split, the two regression algorithms (Elastic Net and LASSO) are trained in two different phases. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (a grid is used to search for and select optimum tuning parameters) is applied with 5-fold cross-validation to obtain the final trained model. Finally, the final trained model is evaluated using the testing set. The experimental work is applied to an agile story point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the models they were compared against. Of the two proposed algorithms, LASSO regression achieved better predictive performance, with PRED(8%) and PRED(25%) results of 100.0, an MMRE of 0.0491, an MMER of 0.0551, an MdMRE of 0.0593, an MdMER of 0.063, and an MSE of 0.0007. The results imply that the trained LASSO regression model is the most acceptable and achieves higher estimation performance than models in the literature.
Keywords: agile software development, effort estimation, elastic net regression, LASSO
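The LASSO component of such a pipeline can be sketched with cyclic coordinate descent and soft-thresholding. This is a minimal stand-in on synthetic data, without the grid search, cross-validation, or story-point dataset of the paper; adding a ridge term to the denominator of the same update turns it into Elastic Net.

```python
import numpy as np

def lasso_cd(X, y, alpha, iters=500):
    """LASSO by cyclic coordinate descent on the objective
    (1/2n)||y - Xw||^2 + alpha*||w||_1, assuming X is already
    normalized (the paper's preprocessing step)."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X**2).sum(axis=0)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]        # residual with feature j removed
            rho = X[:, j] @ r
            # soft-thresholding: small coefficients are driven exactly to zero
            w[j] = np.sign(rho) * max(abs(rho) - n * alpha, 0.0) / col_sq[j]
    return w
```

The soft-thresholding step is what gives LASSO its feature-selection behavior: effort drivers with weak correlation to the story points end up with exactly zero weight.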
Procedia PDF Downloads 71
3059 Study and Fine Characterization of the SS 316L Microstructures Obtained by Laser Beam Melting Process
Authors: Sebastien Relave, Christophe Desrayaud, Aurelien Vilani, Alexey Sova
Abstract:
Laser beam melting (LBM) is an additive manufacturing process that enables complex 3D parts to be designed. This process is now commonly employed for various applications, such as chemistry or energy, requiring the use of stainless steel grades. LBM can offer mechanical properties comparable and sometimes superior to those of wrought materials. However, we observed an anisotropic microstructure resulting from the process, caused by the very high thermal gradients along the building axis. This microstructure can be harmful depending on the application. For this reason, controlling and predicting the microstructure are important to ensure the improvement and reproducibility of the mechanical properties. This study focuses on the 316L SS grade and aims at understanding the solidification and transformation mechanisms during the process. Experiments were performed to analyse the nucleation and growth of the microstructure obtained by the LBM process under several conditions. The samples were designed on two types of support, bulk and lattice, and produced on a ProX DMP 200 LBM device. For both conditions, the analysis of the microstructures by SEM and EBSD revealed a single-phase austenite with preferential crystallite growth along the (100) plane. The microstructure presented a hierarchical structure consisting of columnar grains with sizes in the range of 20-100 µm and a sub-grain structure of size 0.5 µm. These sub-grains were found in different shapes (columnar and cellular). This difference can be explained by a variation of the thermal gradient and cooling rate or by element segregation, although no sign of element segregation was found at the sub-grain boundaries. A high dislocation concentration was observed at the sub-grain boundaries. These sub-grains are separated by very low misorientation walls (< 2°), which causes lattice curvature inside the large grains.
A discussion is proposed on the formation of these microstructures with regard to the LBM process conditions.
Keywords: selective laser melting, stainless steel, microstructure
Procedia PDF Downloads 157
3058 Impacts of Building Design Factors on Auckland School Energy Consumptions
Authors: Bin Su
Abstract:
This study focuses on the impact of school building design factors on winter extra energy consumption, which mainly comprises space heating, water heating and other appliances related to winter indoor thermal conditions. A number of Auckland schools were randomly selected for the study, which introduces a method of using a year of real monthly energy consumption data to calculate the winter extra energy data of school buildings. The study seeks to identify the relationships between the winter extra energy data and school building design data related to the main architectural features, building envelope and elements of the sample schools. These relationships can be used to estimate the approximate saving in winter extra energy consumption that would result from a changed design datum in future school development, and to identify any major energy-efficiency design problems. The relationships are also valuable for developing passive design guides for school energy efficiency.
Keywords: building energy efficiency, building thermal design, building thermal performance, school building design
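The "winter extra energy" quantity can be derived from twelve monthly readings by subtracting a non-winter baseline. The sketch below assumes a mean-of-non-winter-months baseline and 0-based June-August winter indices; both are illustrative choices, not the paper's exact method:

```python
def winter_extra_energy(monthly_kwh, winter_months=(5, 6, 7)):
    """Winter extra energy: consumption in the winter months above the
    mean of the remaining months. Indices are 0-based (Jun-Aug for an
    Auckland winter); the baseline definition is an assumption for
    illustration, not necessarily the paper's formula."""
    non_winter = [kwh for i, kwh in enumerate(monthly_kwh) if i not in winter_months]
    baseline = sum(non_winter) / len(non_winter)
    return sum(monthly_kwh[i] - baseline for i in winter_months)
```

A flat consumption profile yields zero winter extra energy; any winter peak above the baseline is attributed to heating and related winter loads.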
Procedia PDF Downloads 444
3057 Non-Population Search Algorithms for Capacitated Material Requirement Planning in Multi-Stage Assembly Flow Shop with Alternative Machines
Authors: Watcharapan Sukkerd, Teeradej Wuttipornpun
Abstract:
This paper presents non-population search algorithms, namely tabu search (TS), simulated annealing (SA) and variable neighborhood search (VNS), to minimize the total cost of the capacitated MRP problem in a multi-stage assembly flow shop with two alternative machines. The algorithm has three main steps. Firstly, an initial sequence of orders is constructed by a simple due-date-based dispatching rule. Secondly, the sequence of orders is repeatedly improved to reduce the total cost by applying TS, SA and VNS separately. Finally, the total cost is further reduced by optimizing the start time of each operation using a linear programming (LP) model. The parameters of the algorithm are tuned using real data from automotive companies. The results show that VNS significantly outperforms TS, SA and the existing algorithm.
Keywords: capacitated MRP, tabu search, simulated annealing, variable neighborhood search, linear programming, assembly flow shop, application in industry
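The improvement step can be illustrated with a minimal tabu search over pairwise swaps. For brevity, the sketch uses a single-machine total-tardiness objective instead of the paper's capacitated MRP cost, and seeds the search with an earliest-due-date (EDD) dispatching sequence; both are simplifying assumptions:

```python
import itertools

def total_tardiness(sequence, proc, due):
    """Total tardiness of a job sequence on a single machine."""
    t, cost = 0, 0
    for job in sequence:
        t += proc[job]
        cost += max(0, t - due[job])
    return cost

def tabu_search(proc, due, iters=100, tabu_len=5):
    """Minimal tabu search over pairwise position swaps, seeded
    with an EDD dispatching sequence (illustrative sketch)."""
    seq = sorted(range(len(proc)), key=lambda j: due[j])  # EDD start
    best, best_cost = seq[:], total_tardiness(seq, proc, due)
    tabu = []  # recently used swap moves, forbidden for a while
    for _ in range(iters):
        candidates = []
        for i, j in itertools.combinations(range(len(seq)), 2):
            if (i, j) in tabu:
                continue
            neigh = seq[:]
            neigh[i], neigh[j] = neigh[j], neigh[i]
            candidates.append((total_tardiness(neigh, proc, due), (i, j), neigh))
        if not candidates:
            break  # every move is tabu
        cost, move, seq = min(candidates)  # best non-tabu neighbor
        tabu.append(move)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        if cost < best_cost:
            best, best_cost = seq[:], cost
    return best, best_cost
```

SA and VNS would replace only the neighbor-selection rule; the dispatching seed and the evaluation function stay the same.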
Procedia PDF Downloads 234
3056 A Solution for Production Facility Assignment: An Automotive Subcontract Case
Authors: Cihan Çetinkaya, Eren Özceylan, Kerem Elibal
Abstract:
This paper presents a solution method for the selection of a production facility. The motivation comes from a real-life case: an automotive subcontractor with two production facilities, in different cities and producing different parts. The problem is to decide which part(s) should be produced at which facility. To the best of our knowledge, until this study there was no scientific approach to this problem at the firm, and decisions were made intuitively. In this study, several logistic cost parameters have been defined, and a mathematical model has been constructed with these parameters. The defined and collected cost parameters are the handling cost of parts, the shipment cost of parts and the shipment cost of welding fixtures. The constructed multi-objective mathematical model aims to minimize these costs while balancing the workload between the two locations. Results showed that the model can produce optimum solutions in reasonable computing times. This result is also encouraging for extending the model with additional logistic cost parameters.
Keywords: automotive subcontract, facility assignment, logistic costs, multi-objective models
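For a two-facility instance of this size, the part-to-facility decision can be sketched as a brute-force enumeration that trades off total logistic cost against a workload-imbalance penalty. The cost structure, penalty weight and all names below are illustrative assumptions, not the firm's actual parameters:

```python
from itertools import product

def assign_parts(parts, cost_a, cost_b, load, balance_weight=1.0):
    """Enumerate all assignments of parts to facilities A/B, minimizing
    total logistic cost plus a workload-imbalance penalty (a scalarized
    stand-in for the paper's multi-objective model)."""
    best = None
    for choice in product("AB", repeat=len(parts)):
        cost = sum(cost_a[p] if c == "A" else cost_b[p]
                   for p, c in zip(parts, choice))
        load_a = sum(load[p] for p, c in zip(parts, choice) if c == "A")
        load_b = sum(load[p] for p, c in zip(parts, choice) if c == "B")
        score = cost + balance_weight * abs(load_a - load_b)
        if best is None or score < best[0]:
            best = (score, dict(zip(parts, choice)))
    return best
```

Enumeration is exact but exponential in the number of parts; the MILP formulation described above scales to realistic instances.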
Procedia PDF Downloads 366
3055 An Extended X-Ray Absorption Fine Structure Study of CoTi Thin Films
Authors: Jose Alberto Duarte Moller, Cynthia Deisy Gomez Esparza
Abstract:
The cobalt-titanium system was grown as thin films in an INTERCOVAMEX V3 sputtering system equipped with four magnetrons assisted by pulsed DC and direct DC. A polished, highly oriented (400) silicon wafer was used as the substrate, and the growth temperature was 500 °C. X-ray absorption spectroscopy experiments were carried out at the SSRL on beamline 4-3. The Extended X-Ray Absorption Fine Structure spectra were numerically processed with the WINXAS software, from background subtraction through normalization and FFT adjustment. Analyzing the absorption spectra of cobalt in the CoTi2 phase, we observe that they agree in energy with the reference spectrum corresponding to CoO, which indicates that the working valence is Co2+. The experimental RDF results were then compared with RDFs generated theoretically using the FEFF software, from a model compound of the CoTi2 phase obtained by XRD. The fitting procedure is a highly iterative process. Fits were also checked in R-space using both the real and imaginary parts of the Fourier transform. Finally, the presence of overlapping coordination shells and the correctness of the assumption about the nature of the coordinating atom were checked.
Keywords: XAS, EXAFS, FEFF, CoTi
Procedia PDF Downloads 296
3054 The Application of FSI Techniques in Modeling of Realistic Pulmonary Systems
Authors: Abdurrahim Bolukbasi, Hassan Athari, Dogan Ciloglu
Abstract:
Modeling the lung respiratory system, which has a complex anatomy and biophysics, presents several challenges, including tissue-driven flow patterns and wall motion. In addition, because the lungs stretch and recoil with each breath, their walls and structures are not static. The direct relationship between air flow and tissue motion in the lung structures naturally calls for a fluid-structure interaction (FSI) simulation technique. Therefore, developing a coupled FSI computational model is an important step toward the realistic simulation of pulmonary breathing mechanics. A simple but physiologically relevant three-dimensional deep-lung geometry is designed, and an FSI coupling technique is utilized to simulate the deformation of the lung parenchyma tissue that produces the airflow fields. The behaviour of the respiratory tissue system as a complex phenomenon is investigated with respect to respiratory patterns, fluid dynamics, tissue visco-elasticity and the tidal breathing period.
Procedia PDF Downloads 323
3053 Evaluating Forecasts Through Stochastic Loss Order
Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio
Abstract:
We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is the way it is customarily done in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. Despite the fact that loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests, and they are robust to the correlation, autocorrelation and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution instead of just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
Keywords: forecast evaluation, stochastic order, multiple comparison, non-parametric test
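The idea of comparing whole loss distributions rather than expected losses can be sketched with a finite-sample first-order stochastic dominance check on two samples of forecast losses. This is an illustration of the concept only, not the authors' formal test:

```python
def empirical_cdf(sample, x):
    """Fraction of sample values less than or equal to x."""
    return sum(1 for v in sample if v <= x) / len(sample)

def loss_stochastically_smaller(losses_a, losses_b):
    """True if the losses of procedure A are (weakly) smaller than those
    of procedure B in the first-order stochastic sense: F_A(x) >= F_B(x)
    at every observed loss value. Note the samples need not have the
    same size, one advantage of distribution-based comparison."""
    grid = sorted(set(losses_a) | set(losses_b))
    return all(empirical_cdf(losses_a, x) >= empirical_cdf(losses_b, x)
               for x in grid)
```

In this ordering, a dominated loss distribution is preferable at every loss level, a stronger statement than having a smaller mean loss.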
Procedia PDF Downloads 89
3052 Labview-Based System for Fiber Links Events Detection
Authors: Bo Liu, Qingshan Kong, Weiqing Huang
Abstract:
With the rapid development of modern communication, real-time diagnosis of fiber-optic quality and faults is receiving wide attention. In this paper, a Labview-based system is proposed for fiber-optic fault detection. The wavelet threshold denoising method combined with Empirical Mode Decomposition (EMD) is applied to denoise the optical time domain reflectometer (OTDR) signal. Then a method based on the Gabor representation is used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average, compared with using the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy. The maximum detectable fiber length of the proposed Labview-based system is 65 km.
Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising
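The wavelet threshold denoising step can be sketched as soft thresholding of detail coefficients with Donoho's universal threshold. The EMD combination and the Gabor-based event detection are beyond this fragment, and the abstract does not specify the exact thresholding rule the authors use, so this is a generic illustration:

```python
import math

def soft_threshold(coeffs, threshold):
    """Soft thresholding: zero coefficients below the threshold and
    shrink the rest toward zero by the threshold amount."""
    return [0.0 if abs(c) <= threshold
            else (c - threshold if c > 0 else c + threshold)
            for c in coeffs]

def universal_threshold(coeffs):
    """Donoho's universal threshold sigma * sqrt(2 ln N), with the noise
    level sigma estimated from the median absolute coefficient."""
    n = len(coeffs)
    median_abs = sorted(abs(c) for c in coeffs)[n // 2]
    sigma = median_abs / 0.6745  # MAD-based noise estimate
    return sigma * math.sqrt(2 * math.log(n))
```

In a full pipeline the threshold would be applied to the detail coefficients of each wavelet level of the OTDR trace before reconstruction.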
Procedia PDF Downloads 123
3051 A Motion Dictionary to Real-Time Recognition of Sign Language Alphabet Using Dynamic Time Warping and Artificial Neural Network
Authors: Marcio Leal, Marta Villamil
Abstract:
Computational recognition of sign languages aims to allow greater social and digital inclusion of deaf people through the interpretation of their language by computer. This article presents a model for recognizing two of the global parameters of sign languages: hand configurations and hand movements. Hand motion is captured through infrared technology, and its joints are reconstructed in a virtual three-dimensional space. A Multilayer Perceptron neural network (MLP) was used to classify hand configurations, and Dynamic Time Warping (DTW) recognizes hand motion. Beyond the sign recognition method, we provide a dataset of hand configurations and motion captures built with the help of professionals fluent in sign languages. Although this technology can be used to translate signs from any sign dictionary, Brazilian Sign Language (Libras) was used as the case study. Finally, the model presented in this paper achieved a recognition rate of 80.4%.
Keywords: artificial neural network, computer vision, dynamic time warping, infrared, sign language recognition
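The DTW step used to recognize hand motion can be sketched with the classic dynamic-programming recurrence; real joint trajectories would be multi-dimensional, so the 1-D sequences and absolute-difference local cost below are simplifications for illustration:

```python
def dtw_distance(seq_a, seq_b):
    """Dynamic Time Warping distance between two 1-D sequences,
    using absolute difference as the local cost. A lower distance
    means the motions align better under elastic time warping."""
    inf = float("inf")
    n, m = len(seq_a), len(seq_b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            # extend the cheapest of: insertion, deletion, match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Classification then amounts to comparing a captured trajectory against each motion in the dictionary and picking the nearest one.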
Procedia PDF Downloads 217
3050 Benders Decomposition Approach to Solve the Hybrid Flow Shop Scheduling Problem
Authors: Ebrahim Asadi-Gangraj
Abstract:
The hybrid flow shop scheduling problem (HFS) involves sequencing in a flow shop where, at any stage, there exist one or more related or unrelated parallel machines. This production system is a common manufacturing environment in many real industries, such as the steel, ceramic tile and car assembly industries. In this research, a mixed integer linear programming (MILP) model is presented for the hybrid flow shop scheduling problem, in which the objective is to minimize the maximum completion time (makespan). For this purpose, a Benders Decomposition (BD) method is developed to solve the research problem. The proposed approach is tested on small- to moderate-scale test problems. The experimental results show that the Benders decomposition approach can solve the hybrid flow shop scheduling problem in a reasonable time, especially for small and moderate-size test problems.
Keywords: hybrid flow shop, mixed integer linear programming, Benders decomposition, makespan
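The makespan objective can be illustrated with the completion-time recursion for a plain permutation flow shop; the HFS additionally allows parallel machines per stage, which this simplified sketch (an assumption made for brevity) omits:

```python
def permutation_flowshop_makespan(sequence, proc_times):
    """Makespan (C_max) of a job sequence in a permutation flow shop
    with one machine per stage. proc_times[job][machine] is the
    processing time of `job` on `machine`."""
    n_machines = len(proc_times[0])
    completion = [0.0] * n_machines  # last completion time per machine
    for job in sequence:
        for m in range(n_machines):
            # a job starts on machine m once the machine is free and
            # the job has finished on the previous machine
            ready = completion[m - 1] if m > 0 else 0.0
            completion[m] = max(completion[m], ready) + proc_times[job][m]
    return completion[-1]
```

A MILP model expresses the same max/plus recursion with linear constraints and binary sequencing variables, which is what the Benders master and subproblems decompose.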
Procedia PDF Downloads 190
3049 Violations of Press Freedom
Authors: Khalid Achaat
Abstract:
It is difficult to speak about freedom of the press in Algeria without first speaking of the fifty-seven journalists killed in the country between 1993 and 1997 and the five missing journalists. No serious investigation was conducted to find the culprits. When a state is not able to guarantee the law, there is no justice, and violations of the law become "systematic". How can one claim freedom of the press in Algeria when death becomes "banal"? In these circumstances, can we speak of the rights of the Algerian press? It is impossible to understand the problems of the press in Algeria by focusing solely on legal issues; technical, financial and political factors must also be taken into account. Their respective roles vary depending on whether one focuses on the collection of information, the regime of the newspaper company, or publication and dissemination. Can we say that the Algerian press is "the freest in the Arab world" when it only partially reflects the real problems facing the country? Can any newspaper company be free when it is subject, de facto, to an authorization scheme, permanently exposed to the threat of withdrawal of that authorization, suspension, prohibition or closure, without any right to a remedy? Can it be free when the majority of "media owners", the heads of the largest daily newspapers, come from the single party in power since independence? Does some of this freedom not serve the interests of the Algerian authorities?
Keywords: freedom, press, power, closure, suspension
Procedia PDF Downloads 350
3048 Non-Standard Monetary Policy Measures and Their Consequences
Authors: Aleksandra Nocoń (Szunke)
Abstract:
The study is a review of the literature concerning the consequences of the non-standard monetary policy tools that central banks use during unconventional periods threatening the stability of the banking sector. Particular attention is paid to the effects of non-standard monetary policy tools on financial markets; however, the empirical evidence about their effects and real consequences for financial markets is still not conclusive. The main aim of the study is to survey the consequences of standard and non-standard monetary policy instruments implemented during the global financial crisis in the United States, the United Kingdom and Euroland, with particular attention to the results for the stabilization of global financial markets. The study analyses the consequences for short- and long-term market interest rates, interbank interest rates and the LIBOR-OIS spread. It consists mainly of an empirical review indicating the impact of the implementation of these tools on financial markets. The following research methods were used in the study: literature studies, including domestic and foreign literature, cause-and-effect analysis and statistical analysis.
Keywords: asset purchase facility, consequences of monetary policy instruments, non-standard monetary policy, quantitative easing
Procedia PDF Downloads 331