Search results for: input faults
1051 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The question that motivates this work is how many devote themselves to discovering something in the world of science, where much has been discerned and revealed, but much also remains unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given a random value between a and b = a+3. The obtained value is stored in a variable and kept constant during the run of the algorithm. According to the given key, the string is divided into several substrings, each of length k characters. The next step encodes each substring in the list. Encoding is based on the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm follows the same procedure until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The character x is used as padding when the number of characters in a substring does not match the expected length. The algorithm is simple to implement, but it is questionable whether it performs better than other methods in terms of execution time and storage space.
Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison
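As an illustration of the scheme just described, here is a minimal Python sketch of the encryption step: substring splitting, a per-substring Caesar shift whose key increments and wraps back once it exceeds b+1, and x-padding. The lowercase-only alphabet, the function name, and the exact wrap rule are assumptions where the abstract leaves room for interpretation; deciphering would simply reverse the shifts.

```python
import random
import string

def latin_djokovic_encrypt(text, a):
    """Sketch of the Latin Djokovic cipher as described in the abstract.
    Lowercase-only alphabet and exact wrap rule are assumptions."""
    b = a + 3
    k0 = random.randint(a, b)          # random key k between a and b, constant for the run
    alphabet = string.ascii_lowercase
    # Pad with 'x' so the string divides evenly into k0-character substrings
    if len(text) % k0:
        text += "x" * (k0 - len(text) % k0)
    chunks = [text[i:i + k0] for i in range(0, len(text), k0)]
    k, out = k0, []
    for chunk in chunks:
        # Caesar-shift the whole substring by the current key k
        out.append("".join(alphabet[(alphabet.index(c) + k) % 26] for c in chunk))
        k += 1                          # increment the shift for the next substring
        if k > b + 1:                   # wrap back to the initial key value
            k = k0
    return "".join(out), k0

ciphertext, key = latin_djokovic_encrypt("attackatdawn", a=3)
print(ciphertext, key)
```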
Procedia PDF Downloads 103
1050 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images
Authors: Shahriar Farzam, Maryam Rastgarpour
Abstract:
Image denoising plays an extremely important role in digital image processing. Curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a contrast enhancement method for cone beam CT (CBCT) images based on the fast discrete curvelet transform (FDCT) computed via the Unequally Spaced Fast Fourier Transform (USFFT). The transform returns a table of curvelet coefficients indexed by a scale parameter, an orientation, and a spatial location. Accordingly, the coefficients obtained from FDCT-USFFT can be modified in order to enhance contrast in an image. Our proposed method first applies this two-dimensional transform to the input image and then thresholds the curvelet coefficients to enhance the CBCT images. Applying the unequally spaced fast Fourier transform leads to an accurate, high-resolution reconstruction of the image. The experimental results indicate that the performance of the proposed method is superior to existing ones in terms of Peak Signal to Noise Ratio (PSNR) and Effective Measure of Enhancement (EME).
Keywords: curvelet transform, CBCT, image enhancement, image denoising
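To make the modify-the-coefficients idea concrete without assuming a CurveLab/FDCT binding is available, the following sketch applies the same pipeline with PyWavelets' wavelet transform as a stand-in for the curvelet transform: decompose, amplify the detail coefficients, and reconstruct. The wavelet choice, level count, and gain are illustrative assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def enhance_by_coefficient_modification(img, wavelet="db4", levels=3, gamma=1.5):
    """Contrast enhancement by amplifying detail coefficients: a wavelet
    stand-in for the FDCT-USFFT pipeline described in the abstract."""
    coeffs = pywt.wavedec2(img, wavelet, level=levels)
    approx, details = coeffs[0], coeffs[1:]
    # Amplify (or threshold) the detail coefficients at every scale/orientation
    boosted = [tuple(gamma * band for band in level) for level in details]
    return pywt.waverec2([approx] + boosted, wavelet)

img = np.random.rand(256, 256)   # stand-in for a CBCT slice
enhanced = enhance_by_coefficient_modification(img)
```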
Procedia PDF Downloads 300
1049 Leuco Dye-Based Thermochromic Systems for Application in Temperature Sensing
Authors: Magdalena Wilk-Kozubek, Magdalena Rowińska, Krzysztof Rola, Joanna Cybińska
Abstract:
Leuco dye-based thermochromic systems are classified as intelligent materials because they exhibit thermally induced color changes. Thanks to this feature, they are mainly used as temperature sensors in many industrial sectors. For example, placing a thermochromic material on a chemical reactor may warn that the maximum permitted temperature for a chemical process has been exceeded. Usually, two components, a color former and a developer, are needed to produce a system with an irreversible color change. The color former is an electron-donating (proton-accepting) compound such as a fluoran leuco dye. The developer is an electron-accepting (proton-donating) compound such as an organic carboxylic acid. When the developer melts, the color former-developer complex is created and the thermochromic system becomes colored. Typically, the melting point of the applied developer determines the temperature at which the color change occurs. When the lactone ring of the color former is closed, the dye is in its colorless state. The ring opening, induced by the addition of a proton, causes the dye to turn into its colored state. Since the color former and the developer are often solids, they can be incorporated into polymer films to facilitate their practical use in industry. The objective of this research was to fabricate a leuco dye-based thermochromic system that irreversibly changes color after reaching a temperature of 100°C. For this purpose, a benzofluoran leuco dye (as color former) and phenoxyacetic acid (as developer, with a melting point of 100°C) were introduced into polymer films during a drop-casting process. The film preparation process was optimized in order to obtain thin films with appropriate properties such as transparency, flexibility, and homogeneity. Among the optimized factors were the concentrations of the benzofluoran leuco dye and phenoxyacetic acid; the type, average molecular weight, and concentration of the polymer; and the type and concentration of the surfactant. The selected films, containing the benzofluoran leuco dye and phenoxyacetic acid, were combined by mild heat treatment. Structural characterization of the single and combined films was carried out by FTIR spectroscopy, morphological analysis was performed by optical microscopy and SEM, phase transitions were examined by DSC, color changes were investigated by digital photography and UV-Vis spectroscopy, and emission changes were studied by photoluminescence spectroscopy. The resulting thermochromic system is colorless at room temperature, but after reaching 100°C the developer melts and the system turns irreversibly pink. It could therefore be used as an additional sensor to warn against the boiling of water in power plants that use water cooling. Currently used electronic temperature indicators are prone to faults and unwanted third-party actions. The sensor constructed in this work is transparent, so it can go unnoticed by an outsider and constitute a reliable reference for the person responsible for the apparatus.
Keywords: color developer, leuco dye, thin film, thermochromism
Procedia PDF Downloads 99
1048 Implant Operation Guiding Device for Dental Surgeons
Authors: Daniel Hyun
Abstract:
Dental implants are among the top three reasons dentists are sued for malpractice, usually because of implant complications arising from the angle of the implant during surgery. At present, surgeons typically use a 3D-printed navigator customized for the patient's teeth; these cannot be reused for other patients, and producing them takes time. Therefore, I made a guiding device to assist the surgeon during implant operations. The surgeon inputs the objective of the operation, and the device constantly checks whether the surgery is heading towards the objective within the set range, alerting the surgeon via an LED. We tested the prototypes' consistency and accuracy by examining the output graphs, the average standard deviation, and the average change of the calculated angles. The accuracy of performance was also measured by running the device and checking the outputs. My first prototype used the accelerometer and gyroscope from the Arduino MPU6050 sensor; it produced a fluctuating graph, a standard deviation of 0.0295, an average change of 0.25, and 66.6% accuracy of performance. The second prototype used only the gyroscope; it produced a constant graph, a standard deviation of 0.0062, an average change of 0.075, and 100% accuracy of performance, indicating that the accelerometer degraded the functionality of the device. Using only the gyroscope allowed the device to measure the orientations of separate axes without their affecting each other and also increased the stability and accuracy of the measurements.
Keywords: implant, guide, accelerometer, gyroscope, handpiece
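The core decision logic the abstract describes, integrating gyroscope rates into an angle estimate and driving the LED according to whether the estimate stays within the set range of the objective, can be sketched as follows. The tolerance, sampling rate, and function names are assumptions; the actual device runs on an Arduino with an MPU6050.

```python
TOLERANCE_DEG = 2.0   # permitted deviation from the planned implant angle (assumed)

def integrate_gyro(angle_deg, rate_dps, dt_s):
    """Update the estimated handpiece angle from a gyroscope rate reading."""
    return angle_deg + rate_dps * dt_s

def led_state(current_deg, target_deg):
    """Green LED while drilling stays within tolerance of the objective angle."""
    return "GREEN" if abs(current_deg - target_deg) <= TOLERANCE_DEG else "RED"

angle = 0.0
for rate in [1.5, 2.0, 1.8]:          # simulated gyro readings, deg/s at 10 Hz
    angle = integrate_gyro(angle, rate, dt_s=0.1)
    print(round(angle, 3), led_state(angle, target_deg=0.5))
```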
Procedia PDF Downloads 43
1047 Human Action Recognition Using Variational Bayesian HMM with Dirichlet Process Mixture of Gaussian Wishart Emission Model
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we present a human action recognition method using a variational Bayesian HMM with a Dirichlet process mixture (DPM) of Gaussian-Wishart emission models (GWEM). First, we define the Bayesian HMM based on the Dirichlet process, which allows an infinite number of Gaussian-Wishart components to support continuous emission observations. Second, we consider an efficient variational Bayesian inference method that can be applied to derive the posterior distribution of the hidden variables and model parameters for the proposed model from training data. We then derive the predictive distribution that may be used to classify new actions. Third, the paper proposes a process for extracting appropriate spatio-temporal feature vectors that can be used to recognize a wide range of human behaviors from input video images. Finally, we conduct experiments to evaluate the performance of the proposed method. The experimental results show that the presented method is more effective for human action recognition than existing methods.
Keywords: human action recognition, Bayesian HMM, Dirichlet process mixture model, Gaussian-Wishart emission model, variational Bayesian inference, prior distribution and approximate posterior distribution, KTH dataset
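A runnable, simplified analogue of the emission model is scikit-learn's truncated Dirichlet-process mixture of Gaussians, which is fit by variational inference and places Gaussian-Wishart priors on the component parameters; note that it omits the HMM temporal structure the paper adds on top. The feature dimensions and truncation level below are assumptions.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Stand-in for spatio-temporal feature vectors extracted from video frames
features = np.random.randn(500, 10)

# Truncated Dirichlet-process mixture of Gaussians, fit by variational inference;
# sklearn places Gaussian-Wishart priors on the component means and precisions.
dpm = BayesianGaussianMixture(
    n_components=20,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
).fit(features)

# Components with non-negligible weight are the ones the data actually supports
active = (dpm.weights_ > 1e-2).sum()
print(f"{active} active components out of 20")
```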
Procedia PDF Downloads 353
1046 Slope Stability Assessment in Metasedimentary Deposit of an Opencast Mine: The Case of the Dikuluwe-Mashamba (DIMA) Mine in the DR Congo
Authors: Dina Kon Mushid, Sage Ngoie, Tshimbalanga Madiba, Kabutakapua Kakanda
Abstract:
Slope stability assessment is still the biggest challenge in mining activities and civil engineering structures. The slope in an opencast mine frequently intersects multiple weak layers that lead to the instability of the pit. Faults and soft layers throughout the rock increase weathering and erosion rates. Therefore, it is essential to investigate the stability of the complex strata to determine how stable they are. In the Dikuluwe-Mashamba (DIMA) area, the lithology of the stratum is a set of metamorphic rocks whose parent rocks are sedimentary rocks with a low degree of metamorphism. Thus, owing to the composition and metamorphism of the parent rock, the rock formation varies in hardness and softness: when the dolomitic and siliceous content is high, the rock is hard; it is softer when the argillaceous and sandy content is high. Therefore, in the vertical direction the formation appears as alternating weak and hard layers, while in the horizontal direction the same rock layer appears as a smooth, hard layer. From the structural point of view, the main structures in the mining area are the Dikuluwe dipping syncline and the Mashamba dipping anticline, and the occurrence of rock formations varies greatly. During the folding of the rock formation, stress concentrates in the soft layers, causing the weak layers to break; at the same time, interlayer dislocation occurs. This article aimed to evaluate the stability of the metasedimentary rocks of the Dikuluwe-Mashamba (DIMA) open-pit mine using limit equilibrium and stereographic methods. Based on the statistics of the structural planes present, stereographic projection was used to study the slope's stability and to examine the discontinuity orientation data to identify failure zones along the mine. The results revealed that the slope angle is too steep and can easily induce landslides. The numerical sensitivity analysis showed that the slope angle and groundwater significantly impact the slope safety factor: an increase in the groundwater level substantially reduces the stability of the slope. Among the factors affecting the variation in the safety factor, the bulk density of soil is greater than that of the rock mass, the cohesion of the soil mass is smaller than that of the rock mass, and the friction angle in the rock mass is much larger than that in the soil mass. The analysis showed that the rock mass structure types are mostly scattered and fragmented, the strata change considerably, and the variation in rock and soil mechanical parameters is significant.
Keywords: slope stability, weak layer, safety factor, limit equilibrium method, stereographic method
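For a flavor of the limit equilibrium method, the sketch below computes the factor of safety for the textbook infinite-slope case with pore pressure, showing how a rising groundwater level lowers the safety factor, as the sensitivity analysis reports. The formula is the standard infinite-slope expression, not the DIMA site model, and all parameter values are illustrative.

```python
import numpy as np

def infinite_slope_fos(c_kpa, phi_deg, gamma_kn_m3, depth_m, beta_deg, u_kpa=0.0):
    """Limit-equilibrium factor of safety for an infinite slope with a planar
    slip surface and pore pressure u (a simplified sketch, not the DIMA model)."""
    beta, phi = np.radians(beta_deg), np.radians(phi_deg)
    normal = gamma_kn_m3 * depth_m * np.cos(beta) ** 2              # normal stress, kPa
    driving = gamma_kn_m3 * depth_m * np.sin(beta) * np.cos(beta)   # shear stress, kPa
    return (c_kpa + (normal - u_kpa) * np.tan(phi)) / driving

# Raising the water table (u > 0) visibly lowers the factor of safety
for u in (0.0, 20.0, 40.0):
    print(u, round(infinite_slope_fos(15, 30, 24, 10, 45, u), 2))
```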
Procedia PDF Downloads 262
1045 Interpretable Deep Learning Models for Medical Condition Identification
Authors: Dongping Fang, Lian Duan, Xiaojing Yuan, Mike Xu, Allyn Klunder, Kevin Tan, Suiting Cao, Yeqing Ji
Abstract:
Accurate prediction of a medical condition with direct clinical evidence is a long-sought goal in the medical management and health insurance fields. Although great progress has been made with machine learning algorithms, the medical community remains, to a certain degree, suspicious of models' accuracy and interpretability. This paper presents an innovative hierarchical attention deep learning model that achieves good prediction and clear interpretability that can be easily understood by medical professionals. This deep learning model uses a hierarchical attention structure that matches the medical history data structure naturally and reflects the member's encounter (date of service) sequence. The model attention structure consists of three levels: (1) attention on the medical code types (diagnosis codes, procedure codes, lab test results, and prescription drugs), (2) attention on the sequential medical encounters within a type, and (3) attention on the medical codes within an encounter and type. The model is applied to predict the occurrence of stage 3 chronic kidney disease (CKD3), using three years of medical history of Medicare Advantage (MA) members from a top health insurance company. The model takes members' medical events, both claims and electronic medical record (EMR) data, as input, predicts CKD3, and calculates the contribution of individual events to the predicted outcome. The model outcome can be easily explained with the clinical evidence identified by the model algorithm. For example: Member A had 36 medical encounters in the past three years: multiple office visits, lab tests, and medications. The model predicts that member A has a high risk of CKD3, with the following well-contributing clinical events: multiple high 'Creatinine in Serum or Plasma' tests and multiple low kidney-function 'Glomerular filtration rate' tests. Among the abnormal lab tests, more recent results contributed more to the prediction. The model also indicates that regular office visits, no abnormal findings in medical examinations, and taking proper medications decreased the CKD3 risk. Member B had 104 medical encounters in the past three years and was predicted to have a low risk of CKD3 because the model did not identify diagnoses, procedures, or medications related to kidney disease, and many lab test results, including 'Glomerular filtration rate', were within the normal range. The model accurately predicts members A and B and provides interpretable clinical evidence that is validated by clinicians. Without extra effort, the interpretation is generated directly from the model and presented together with the occurrence date. Our model uses the medical data in its most raw format without any further data aggregation, transformation, or mapping. This greatly simplifies the data preparation process, mitigates the chance for error, and eliminates the post-modeling work needed for traditional model explanation. To our knowledge, this is the first paper on an interpretable deep-learning model using a 3-level attention structure, sourcing both EMR and claims data, including all four types of medical data, on the entire Medicare population of a big insurance company, and, more importantly, directly generating model interpretation to support user decisions. In the future, we plan to enrich the model input by adding patients' demographics and information from free-text physician notes.
Keywords: deep learning, interpretability, attention, big data, medical conditions
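A minimal sketch of one attention level of such a model is shown below (PyTorch); stacking three of these modules (codes within an encounter, encounters within a type, types within a member) yields the hierarchy described, with the softmax weights serving as the per-event contributions used for interpretation. Dimensions and layer choices are assumptions.

```python
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """One attention level: pools a sequence of embeddings into a single
    vector and exposes the weights as per-item contributions."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):                        # x: (batch, items, dim)
        w = torch.softmax(self.score(x), dim=1)  # contribution of each item
        return (w * x).sum(dim=1), w

# Level 1: codes within an encounter -> encounter vector (weights per code)
# Level 2: encounters within a code type -> type vector (weights per encounter)
# Level 3: code types -> member vector (weights per type), then a classifier
dim = 64
codes = torch.randn(1, 12, dim)                  # 12 medical codes in one encounter
enc_vec, code_weights = AttentionPool(dim)(codes)
risk = torch.sigmoid(nn.Linear(dim, 1)(enc_vec))  # CKD3 risk from one level only
```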
Procedia PDF Downloads 91
1044 Fire Characteristics of Commercial Flame-Retardant Polycarbonate under Different Oxygen Concentrations: Ignition Time and Heat Blockage
Authors: Xuelin Zhang, Shouxiang Lu, Changhai Li
Abstract:
Commercial flame-retardant polycarbonate samples, the main interior carriage material of high-speed trains, with different thicknesses were investigated in a Fire Propagation Apparatus under different external heat fluxes and oxygen concentrations from 12% to 40% to study their fire characteristics and to quantitatively analyze ignition time, mass loss rate, and heat blockage. The additives in the commercial flame-retardant polycarbonate were intumescent, and the samples maintained a steady height before ignition when heated. The results showed that the transformed ignition time (1/t_ig)ⁿ increased linearly with external heat flux under each oxygen concentration after deducting the heat blockage due to pyrolysis products; the mass loss rate varied linearly with external heat flux, with the slope of the fitted line decreasing as oxygen concentration increased; and the heat blockage, independent of external heat flux, rose with increasing oxygen concentration. The acquired data, used as input to fire simulation models, are essential for evaluating the fire risk of commercial flame-retardant polycarbonate.
Keywords: ignition time, mass loss rate, heat blockage, fire characteristic
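The reported linear relation can be checked with a short fit; the sketch below regresses the transformed ignition time (1/t_ig)^n against external heat flux. The exponent n = 0.5 (the classical thermally thick value) and the data points are assumptions for illustration, not the measured values.

```python
import numpy as np
from scipy.stats import linregress

# Illustrative (not measured) values: external heat flux in kW/m^2 and
# ignition times in seconds at one oxygen concentration.
heat_flux = np.array([25.0, 35.0, 45.0, 55.0])
t_ig = np.array([210.0, 95.0, 52.0, 33.0])

n = 0.5                                # transformation exponent, assumed
y = (1.0 / t_ig) ** n                  # transformed ignition time (1/t_ig)^n
fit = linregress(heat_flux, y)
print(f"slope={fit.slope:.5f}, r^2={fit.rvalue**2:.3f}")
# A near-linear fit (r^2 close to 1) reproduces the trend the abstract reports
# once the heat blockage from pyrolysis products is deducted from the flux.
```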
Procedia PDF Downloads 282
1043 Educational Experiences in Engineering in the COVID Era and Their Comparative Analysis, Spain, March to June 2020
Authors: Borja Bordel, Ramón Alcarria, Marina Pérez
Abstract:
In March 2020, an unexpected health crisis caused by COVID-19 was declared in Spain. All of a sudden, all degrees, classes, evaluation tests, and projects had to be transformed into online activities. However, the chaotic situation generated by such a complex operation, executed without any well-established procedure, led to very different experiences and, ultimately, results. In this paper, we describe three experiences at two different universities in Madrid: on the one hand, the Technical University of Madrid, a public university with little experience in online education; on the other hand, Alfonso X el Sabio University, a private university with more than five years of experience in online teaching. All analyzed subjects were related to computer engineering. Professors and students answered a survey, and personal interviews were also carried out. In addition, the professors' workload and the students' academic results were compared. From the comparative analysis of all these experiences, we extract the most successful strategies, methodologies, and activities. The recommendations in this paper will be useful for courses during the coming months, while the health situation still affects educational organizations, and will at the same time serve as input for the upcoming digitalization of higher education.
Keywords: educational experience, online education, higher education digitalization, COVID, Spain
Procedia PDF Downloads 140
1042 Urban Resilience and Its Prioritised Components: Analysis of Industrial Township Greater Noida
Authors: N. Mehrotra, V. Ahuja, N. Sridharan
Abstract:
Resilience is an all-hazard, proactive approach that requires multidisciplinary input into the interrelated variables of the city system. This research aims to identify and operationalize indicators for assessment in the domains of institutions, infrastructure, and knowledge, all three operating in task-oriented community networks. This paper gives a brief account of the methodology developed for the assessment of urban resilience and its prioritised components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People's perception of urban resilience has been examined by conducting a questionnaire survey among the target population of Greater Noida. As defined by experts, the urban resilience of a place is considered to be both a product and a process of operation to regain normalcy after an event of disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both in the process of evolution and as an outcome, and the relative significance of the six R's has also been identified. The dependency factor of the various resilience indicators has been explored in this paper, which helps in generating a new perspective for future research in disaster management. Based on the stated factors, this methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself but calls for new beginnings.
Keywords: disaster, resilience, system, urban
Procedia PDF Downloads 458
1041 Theoretical Analysis of the Solid State and Optical Characteristics of Calcium Sulphide Thin Film
Authors: Emmanuel Ifeanyi Ugwu
Abstract:
Calcium sulphide, a member of the chalcogenide group of thin films, is analyzed in this work using a theoretical approach in which a scalar wave is propagated through the material thin-film medium deposited on a glass substrate, under the assumption that the dielectric medium has a homogeneous reference dielectric constant term and a perturbed dielectric function representing the deposited thin-film medium on the surface of the glass substrate. These were substituted into a defined scalar wave equation, which was solved first by transforming it into a Volterra equation of the second kind using separation of variables; subsequently, a Green's function technique was introduced to obtain a model equation for the wave propagating through the thin film. This equation was then used to compute the propagated field for input wavelengths in the UV, visible, and near-infrared regions, considering the influence of the dielectric constants of the thin film on the propagating field. The results obtained were in turn used to compute the band gap and the solid-state and optical properties of the thin film.
Keywords: scalar wave, dielectric constant, calcium sulphide, solid state, optical properties
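For readers who want the argument in symbols, the following is a hedged reconstruction of the standard form of the equations the abstract outlines; it may differ in detail from the author's exact formulation.

```latex
% Scalar wave in a film with reference dielectric constant \varepsilon_0
% plus a perturbation \Delta\varepsilon(\mathbf{r}) for the deposited film:
\nabla^2 \psi(\mathbf{r}) + k_0^2\,\varepsilon_0\,\psi(\mathbf{r})
  = -\,k_0^2\,\Delta\varepsilon(\mathbf{r})\,\psi(\mathbf{r})
% Recasting with the Green's function G of the homogeneous problem gives the
% Volterra-type integral equation solved iteratively for the propagated field:
\psi(\mathbf{r}) = \psi_0(\mathbf{r})
  + k_0^2 \int G(\mathbf{r},\mathbf{r}')\,
    \Delta\varepsilon(\mathbf{r}')\,\psi(\mathbf{r}')\,d^3r'
```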
Procedia PDF Downloads 118
1040 Stock Market Prediction Using Convolutional Neural Network That Learns from a Graph
Authors: Mo-Se Lee, Cheol-Hwi Ahn, Kee-Young Kwahk, Hyunchul Ahn
Abstract:
Over the past decade, deep learning has been in the spotlight among machine learning algorithms. In particular, CNNs (convolutional neural networks), known as an effective solution for recognizing and classifying images, have been applied to classification and prediction problems in various fields. In this study, we apply a CNN to stock market prediction, one of the most challenging tasks in machine learning research. Specifically, we propose to apply a CNN as a binary classifier that predicts stock market direction (up or down) using a graph as its input; that is, our proposal is to build a machine learning algorithm that mimics a person who looks at a chart and predicts whether the trend will go up or down. Our proposed model consists of four steps. In the first step, it divides the dataset into windows of 5, 10, 15, and 20 days. It then creates graphs for each interval in step 2. In the next step, CNN classifiers are trained using the graphs generated in the previous step. In step 4, the hyperparameters of the trained model are optimized using the validation dataset. To validate our model, we apply it to the prediction of the KOSPI200 over 1,986 days in eight years (from 2009 to 2016). The experimental dataset includes 14 technical indicators, such as CCI, Momentum, and ROC, as well as the daily closing price of the KOSPI200 of the Korean stock market.
Keywords: convolutional neural network, deep learning, Korean stock market, stock market prediction
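A minimal PyTorch sketch of the step-3 classifier is given below: a small CNN that maps a rendered chart image to an up/down logit. The image size, channel counts, and layer arrangement are assumptions, not the architecture actually tuned in step 4.

```python
import torch
import torch.nn as nn

class ChartCNN(nn.Module):
    """Binary up/down classifier over rendered price-chart images: a sketch
    of the proposed step-3 classifier; sizes and layers are assumptions."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, 1)   # for 64x64 input charts

    def forward(self, x):
        return self.head(self.features(x).flatten(1))  # logit: up (>0) or down

model = ChartCNN()
charts = torch.randn(8, 1, 64, 64)     # 8 grayscale chart images, one per window
logits = model(charts)
loss = nn.BCEWithLogitsLoss()(logits.squeeze(1), torch.randint(0, 2, (8,)).float())
```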
Procedia PDF Downloads 425
1039 The Environmental Concerns in Coal Mining and Utilization in Pakistan
Authors: S. R. H. Baqri, T. Shahina, M. T. Hasan
Abstract:
Pakistan is facing an acute shortage of energy and is looking to indigenous resources for the energy mix to meet the shortfall. After the discovery of huge coal resources in the Thar Desert of Sindh province, focus has shifted to coal power generation. The government of Pakistan has planned coal-based power generation of 20,000 MW by the year 2025. This target will be achieved by mining and power generation in the Thar coal field and by using imported coal in different parts of Pakistan. The total indigenous coal production of around 3.0 million tons is being utilized in brick kilns and in the cement and sugar industries. Coal-based power generation is limited to three units of 50 MW near Hyderabad, supplied from the nearby Lakhra coal field. The purpose of this presentation is to identify and redress the issues of coal mining and utilization with reference to environmental hazards. The Thar coal resource is estimated at 175 billion tons out of a total resource estimate of 184 billion tons in Pakistan. The coal of Pakistan is of Tertiary age (Palaeocene/Eocene) and is classified from lignite to sub-bituminous. Coal characterization has established three main pollutants, sulphur, carbon dioxide, and methane, besides some others associated with particular coal and rock types. The element sulphur occurs in organic as well as inorganic forms, as free sulphur and as pyrite and gypsum, respectively. Carbon dioxide, methane, and minerals are mostly associated with fractures, joints, local faults, seatearth, and roof rocks. Abandoned and working coal mines give off a kerosene odour due to the escape of methane into the atmosphere, while frozen methane ices in organic-rich sediments have also been reported from the Makran coastal and offshore areas. Sulphur escapes into the atmosphere during the mining and industrial utilization of coal. Natural erosional processes due to rivers, streams, lakes, and coastal waves erode the overlying sediments, allowing pollutants to escape into air and water. Power plant emissions should be controlled through the application of appropriate clean coal technology and need to be regularly monitored. Therefore, systematic and scientific studies will be required to estimate the quantities of methane, carbon dioxide, and sulphur at various sites, such as abandoned and working coal mines and exploratory wells for coal, oil, and gas. Pressure gauges on gas pipes connecting the coal-bearing horizons will be installed at the surface to measure the quantity of gas. The quality and quantity of the gases will be examined at defined time intervals. This will help to design and recommend methods and procedures to stop the escape of gases into the atmosphere. The element sulphur can be partially removed by gravity and chemical methods after grinding and before the industrial utilization of the coal.
Keywords: atmosphere, coal production, energy, pollutants
Procedia PDF Downloads 435
1038 Law and Its Implementation and Consequences in Pakistan
Authors: Amir Shafiq, Asif Shahzad, Shabbar Mehmood, Muhammad Saeed, Hamid Mustafa
Abstract:
Legislation comprises the laws or statutes enacted by a sovereign authority, which can generally be implemented by the courts of law from time to time to accomplish their objectives. Historically speaking, upon the emergence of Pakistan in 1947, the existing laws of the British Raj remained in effect after being reconciled with Islamic ideology. Thus, there was an intention to begin the statute book afresh in Pakistan's legal history. In consequence, the process of developing detailed plans, procedures, and mechanisms to ensure that legislative and regulatory requirements are achieved began, keeping in view cultural values and local customs. This article is an input to the enduring discussion about implementing the rule of law in Pakistan, whereas the rule of law requires a harmony of laws, mostly in the arrangement of codified state laws. Pakistan is a legally plural society where completely different and independent systems of law, such as Mohammadan law, state law, and traditional law, coexist. The law actually practiced in Pakistan is largely traditional law, even though that law is not acknowledged by the state. This has caused the main problem of the rule of law: the difference between state laws and cultural values. These values, customs, and so-called traditional laws are the main obstacles to enforcing state law in true letter and spirit, which has caused dissatisfaction among the masses and distrust of the country's judicial system.
Keywords: consequences, implement, law, Pakistan
Procedia PDF Downloads 433
1037 Estimations of Spectral Dependence of Tropospheric Aerosol Single Scattering Albedo in Sukhothai, Thailand
Authors: Siriluk Ruangrungrote
Abstract:
Analyses of available data from MFR-7 measurements were performed and discussed in a study of tropospheric aerosol and its consequences in Thailand. The aerosol single scattering albedo (ω) is one of the most important parameters for determining the aerosol effect on radiative forcing. Here, ω was determined directly as the ratio of the aerosol scattering optical depth to the aerosol extinction optical depth (τ_scat/τ_ext), without using any aerosol computer-code models. This has the benefit of eliminating the uncertainty caused by modeling assumptions and the estimation of aerosol input data. Diurnal ω for five cloudless days in winter and early summer at five distinct wavelengths (415, 500, 615, 673, and 870 nm), with corrections for Rayleigh scattering and atmospheric column NO2 and ozone contents, was investigated. In addition, the tendency of the spectral dependence of ω in the two seasons was observed. The spectral results reveal that during wintertime the atmosphere of this inland rural vicinity was probably dominated by a smaller loading of soil dust aerosols than in early summer. Hence, the major aerosol loading, particularly in summer, was a mixture of soil dust and biomass burning aerosols.
Keywords: aerosol scattering optical depth, aerosol extinction optical depth, biomass burning aerosol, soil dust aerosol
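Because ω is obtained as a direct ratio of optical depths, the computation is a one-liner once the two depths are available; the sketch below shows it at the five MFR-7 wavelengths with placeholder values rather than the Sukhothai measurements.

```python
import numpy as np

# Illustrative optical depths at the five MFR-7 wavelengths (nm); the values
# are placeholders, not the measured data.
wavelengths = np.array([415, 500, 615, 673, 870])
tau_ext = np.array([0.42, 0.36, 0.30, 0.27, 0.20])   # aerosol extinction optical depth
tau_scat = np.array([0.38, 0.32, 0.26, 0.23, 0.16])  # aerosol scattering optical depth

# Single scattering albedo as the direct ratio used in the paper, with no
# aerosol-model assumptions: omega = tau_scat / tau_ext
omega = tau_scat / tau_ext
for wl, w in zip(wavelengths, omega):
    print(f"{wl} nm: omega = {w:.3f}")
# A decrease of omega with wavelength is typical of absorbing (e.g., biomass
# burning) aerosol; a flatter spectrum points towards dust-dominated loading.
```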
Procedia PDF Downloads 405
1036 Facility Data Model as Integration and Interoperability Platform
Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes
Abstract:
Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, starting from input data acquisition, through ontology concept definition, to ontology concept population. First, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analyzed as well.
Keywords: airport ontology, energy management, facility data model, ontology modeling
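A toy version of the core-extend-populate workflow can be written with rdflib; the namespace, class names, and relations below are illustrative assumptions, not the paper's actual airport ontology.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

FAC = Namespace("http://example.org/facility#")   # assumed namespace, not the paper's
g = Graph()
g.bind("fac", FAC)

# Core facility ontology: generic concepts shared by any infrastructure
for cls in ("Facility", "Zone", "Equipment", "Meter"):
    g.add((FAC[cls], RDF.type, OWL.Class))
g.add((FAC.partOf, RDF.type, OWL.ObjectProperty))  # illustrative relation
g.add((FAC.Zone, FAC.partOf, FAC.Facility))

# Extension + population for one specific infrastructure (an airport terminal)
g.add((FAC.Terminal, RDFS.subClassOf, FAC.Zone))
g.add((FAC.T1, RDF.type, FAC.Terminal))
g.add((FAC.T1, RDFS.label, Literal("Malpensa Terminal 1")))

print(g.serialize(format="turtle"))
```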
Procedia PDF Downloads 448
1035 Verification of Geophysical Investigation during Subsea Tunnelling in Qatar
Authors: Gary Peach, Furqan Hameed
Abstract:
Musaimeer outfall tunnel is one of the longest storm water tunnels in the world, with a total length of 10.15 km. The tunnel will accommodate surface and rain water received from the drainage networks of 270 km of urban areas in southern Doha, with a pumping capacity of 19.7 m³/sec. The tunnel is excavated by a Tunnel Boring Machine (TBM) through the Rus Formation, Midra Shales, and Simsima Limestone. Water inflows at high pressure, complex mixed ground, and weaker ground strata prone to karstification, with vertical and lateral fractures connected to the sea bed, were also encountered during mining. In addition to the pre-tender geotechnical investigations, the Contractor carried out a supplementary offshore geophysical investigation in order to fine-tune the existing results of the geophysical and geotechnical investigations. An electrical resistivity tomography (ERT) and a seismic reflection survey were carried out. The offshore geophysical survey was performed, and interpretations of rock mass conditions were made to provide an overall picture of underground conditions along the tunnel alignment. This allowed the critical tunnelling areas and cutter head interventions to be planned accordingly. Karstification was monitored with a non-intrusive radar system installed on the TBM. The Boring Electric Ahead Monitoring (BEAM) system was installed at the cutter head and was able to predict the rock mass up to three tunnel diameters ahead of the cutter head. The BEAM system was provided with an online facility for real-time monitoring of rock mass conditions, which were then correlated with the rock mass conditions predicted during the interpretation phase of the offshore geophysical surveys. Further correlation was carried out with samples of the rock mass taken from tunnel face inspections and with excavated material produced by the TBM. The BEAM data were continuously monitored to check variations in the resistivity and percentage frequency effect (PFE) of the ground. This system provided information about rock mass conditions, potential karst risk, and potential water inflow. The BEAM system was found to be more than 50% accurate in picking up the difficult ground conditions and faults predicted in the geotechnical interpretative report before the start of tunnelling operations. Upon completion of the project, it was concluded that the combined use of different geophysical investigation results allows the execution stage to be carried out with more confidence and less geotechnical risk. The approach used for the prediction of rock mass conditions in the Geotechnical Interpretative Report (GIR) and in the seismic reflection and electrical resistivity tomography (ERT) surveys was concluded to be reliable, as the same rock mass conditions were encountered during tunnelling operations.
Keywords: tunnel boring machine (TBM), subsea, karstification, seismic reflection survey
Procedia PDF Downloads 244
1034 Gradient Length Anomaly Analysis for Landslide Vulnerability Analysis of Upper Alaknanda River Basin, Uttarakhand Himalayas, India
Authors: Hasmithaa Neha, Atul Kumar Patidar, Girish Ch Kothyari
Abstract:
The northward convergence of the Indian plate has a dominating influence on the structural and geomorphic development of the Himalayan region. The highly deformed and complex stratigraphy of the area arises from a confluence of exogenic and endogenic geological processes. This region frequently experiences natural hazards such as debris flows, flash floods, avalanches, landslides, and earthquakes due to its harsh, steep topography and fragile rock formations. Therefore, remote-sensing-based examination and real-time monitoring of tectonically sensitive regions may provide crucial early warnings and invaluable data for effective hazard mitigation strategies. In order to identify unusual changes in river gradients, the current study demonstrates a spatial quantitative geomorphic analysis of the upper Alaknanda River basin, Uttarakhand Himalaya, India, using gradient length anomaly analysis (GLAA). This basin is highly vulnerable to ground creep and landslides due to the presence of active faults/thrusts, toe-cutting of slopes for road widening, the development of heavy engineering projects on highly sheared bedrock, and periodic earthquakes. The intersecting joint sets developed in the bedrock have formed wedges that have facilitated the recurrence of several landslides. The main objective of the current research is to identify abnormal gradient lengths, indicating potential landslide-prone zones. High-resolution digital elevation data and geospatial techniques are used to perform this analysis. The results of the GLAA are corroborated with historical landslide events and are ultimately used to generate landslide susceptibility maps of the study area. The preliminary results indicate that approximately 3.97% of the basin is stable, while about 8.54% is classified as moderately stable and suitable for human habitation. However, roughly 19.89% falls within the zone of moderate vulnerability, 38.06% is classified as vulnerable, and 29% falls within highly vulnerable zones, posing risks of geohazards, including landslides, glacial avalanches, and earthquakes. This research provides valuable insights into the spatial distribution of landslide-prone areas and offers a basis for implementing proactive measures for landslide risk reduction, including land-use planning, early warning systems, and infrastructure development techniques.
Keywords: landslide vulnerability, geohazard, GLAA, upper Alaknanda Basin, Uttarakhand Himalaya
Procedia PDF Downloads 72
1033 Bacteriological Safety of Sachet Drinking Water Sold in Benin City, Nigeria
Authors: Stephen Olusanmi Akintayo
Abstract:
Access to safe drinking water remains a major challenge in Nigeria, and where water is available, its quality is often in doubt. An alternative to inadequate clean drinking water is found in treated drinking water packaged in electrically heat-sealed nylon, commonly referred to as "sachet water". "Sachet water" is common in Nigeria, as its selling price is within the reach of the low socio-economic class and setting up a production unit does not require a huge capital input. The bacteriological quality of selected "sachet water" stored at room temperature over a period of 56 days was determined to evaluate the safety of the sachet drinking water. A test for the detection of coliform bacteria was performed, and the result showed no coliform bacteria, indicating the absence of fecal contamination throughout the 56 days. Heterotrophic plate counts (HPC) were done at 14-day intervals, and the samples showed HPC between 0 cfu/mL and 64 cfu/mL. The highest count was observed on day 1. Counts decreased between day 1 and day 28, while no growth was observed between day 42 and day 56. The decrease in HPC suggests the presence of residual disinfectant in the water. The organisms isolated were identified as Staphylococcus epidermidis and S. aureus. The presence of these microorganisms in sachet water is indicative of contamination during processing and handling.
Keywords: coliform, heterotrophic plate count, sachet water, Staphylococcus aureus, Staphylococcus epidermidis
Procedia PDF Downloads 341
1032 The Use of Building Energy Simulation Software in Case Studies: A Literature Review
Authors: Arman Ameen, Mathias Cehlin
Abstract:
The use of Building Energy Simulation (BES) software has increased in the last two decades, parallel to the development of increased computing power and easy-to-use software applications. This type of software is primarily used to simulate the energy use and indoor environment of a building. The rapid development of these types of software has raised their level of user-friendliness, provided better parameter input options, and increased the possibilities for analysis, whether of a single building component or an entire building. This, in turn, has led many researchers to utilize BES software in their research to various degrees. The aim of this paper is to carry out a literature review concerning the use of the BES software IDA Indoor Climate and Energy (IDA ICE) in the scientific community. The focus of this paper is specifically the use of the software for whole-building energy simulation: the number and types of articles and their publication dates, the area of application, the types of parameters used, the location and type of the studied buildings, the type of analysis, and the solution methodology. Another aspect examined, which is of great interest, is the method of validation of the simulation results. The results show an upward trend in the use of IDA ICE and that researchers use the software in their research to various degrees depending on the case and aim of their research. The level of validation of the simulations carried out in these articles varies depending on the type of article and type of analysis.
Keywords: building simulation, IDA ICE, literature review, validation
Procedia PDF Downloads 135
1031 Reformulation of Theory of Critical Distances to Predict the Strength of Notched Plain Concrete Beams under Quasi Static Loading
Authors: Radhika V., J. M. Chandra Kishen
Abstract:
The theory of critical distances (TCD), owing to its appealing characteristics, has been successfully used in the past to predict the strength of brittle as well as ductile materials weakened by the presence of stress risers under both static and fatigue loading. Utilising most of the TCD's unique features, this paper summarises an attempt to reformulate the point method of the TCD to predict the strength of notched plain concrete beams under mode I quasi-static loading. The zone of microcracks responsible for the non-linearity of concrete is taken into account through the concept of an effective elastic crack. An attempt is also made to correlate the value of the material characteristic length required for the application of the TCD with the maximum aggregate size of the concrete mix, eliminating the need for extensive experimentation prior to the application of the TCD. The devised reformulation and the proposed power-law-based relationship are found to yield satisfactory predictions of the static strength of notched plain concrete beams, with the geometric dimensions of the beam, the tensile strength, and the maximum aggregate size of the concrete mix being the only required input parameters.
Keywords: characteristic length, effective elastic crack, inherent material strength, mode I loading, theory of critical distances
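A small sketch of the point method is given below: compute the characteristic length L from the fracture toughness and inherent strength, then check the stress at L/2 ahead of the notch against the inherent strength. The numbers are illustrative; the paper's contribution is precisely to tie L to the maximum aggregate size instead of deriving it from fracture tests.

```python
import numpy as np

def critical_distance(k_ic_mpa_sqrt_m, sigma0_mpa):
    """TCD material characteristic length L = (1/pi) * (K_Ic / sigma_0)^2."""
    return (k_ic_mpa_sqrt_m / sigma0_mpa) ** 2 / np.pi

def point_method_fails(stress_profile, r, L, sigma0_mpa):
    """Point method: failure is predicted when the stress at r = L/2 ahead of
    the notch tip reaches the inherent material strength sigma_0."""
    sigma_at_half_L = np.interp(L / 2.0, r, stress_profile)
    return sigma_at_half_L >= sigma0_mpa

# Illustrative numbers only; the paper instead ties L to maximum aggregate size.
L = critical_distance(k_ic_mpa_sqrt_m=1.0, sigma0_mpa=4.0)   # in metres
r = np.linspace(1e-4, 0.1, 200)                 # distance ahead of the notch, m
stress = 4.0 * np.sqrt(5e-3 / r)                # a decaying elastic stress field
print(L, point_method_fails(stress, r, L, 4.0))
```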
Procedia PDF Downloads 98
1030 Image Segmentation Techniques: Review
Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo
Abstract:
Image segmentation is the process of dividing an image into several sections, such as an object's foreground and the background. It is a critical technique in both image processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has addressed color images. Image segmentation algorithms and techniques vary based on the input data and the application, and nearly all of them are unsuitable for noisy environments. Much of the work that has been done uses the Markov Random Field (MRF), which involves heavy computation but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as easing the processing of an image, interpreting the contents of an image, and simplifying image analysis. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed over the years. The techniques include neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding, among others. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications of image segmentation and potential future developments. This review concludes that no single technique is perfectly suitable for segmenting all the different types of images, but that the use of hybrid techniques yields more accurate and efficient results.
Keywords: clustering-based, convolution-network, edge-based, region-growing
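As a concrete taste of the thresholding and region-based families reviewed here, the sketch below runs Otsu thresholding and connected-component labelling with scikit-image on a sample image; the choice of image and of Otsu's method is illustrative.

```python
from skimage import data, filters, measure

img = data.coins()                       # sample grayscale image
# Thresholding: Otsu picks the cut that best separates foreground/background
t = filters.threshold_otsu(img)
mask = img > t

# Region-based step in spirit: label the connected foreground regions
labels = measure.label(mask)
print(f"threshold={t}, regions={labels.max()}")
```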
Procedia PDF Downloads 96
1029 Disaggregation of the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region
Authors: Mohammad Bakhshi, Firas Al Janabi
Abstract:
High-resolution rainfall data are very important inputs for hydrological models. Among the models for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly, and 10-minute) from daily data for a record period of around 20 years. The process was carried out with the DiMoN tool, which is based on a random cascade model and the method of fragments. Differences between the observed and simulated rain datasets were evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, summary statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value should be employed to show that their differences are reasonable. The results are encouraging, considering the tendency of generated high-resolution rainfall data to be overestimated.
Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall
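The K-S comparison of observed and generated datasets can be reproduced in outline with SciPy's two-sample test, as sketched below; the gamma-distributed stand-in data are assumptions, not DiMoN output.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
observed_hourly = rng.gamma(shape=0.4, scale=2.0, size=5000)    # stand-in records, mm
generated_hourly = rng.gamma(shape=0.35, scale=2.3, size=5000)  # disaggregated output

# Two-sample Kolmogorov-Smirnov test between observed and generated CDFs;
# a large p-value means the difference between the curves is reasonable.
stat, p = ks_2samp(observed_hourly, generated_hourly)
print(f"K-S statistic={stat:.3f}, p-value={p:.3g}")

# Exceedance probability, the other empirical comparison used in the paper
def exceedance(x):
    s = np.sort(x)[::-1]                          # sorted descending
    return s, np.arange(1, len(s) + 1) / (len(s) + 1)
```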
Procedia PDF Downloads 201
1028 Design of EV Steering Unit Using AI Based on Estimate and Control Model
Authors: Seong Jun Yoon, Jasurbek Doliev, Sang Min Oh, Rodi Hartono, Kyoojae Shin
Abstract:
Electric power steering (EPS), which is commonly used in electric vehicles, is an electrically driven steering device for vehicles. Compared to hydraulic systems, EPS offers advantages such as simple system components, easy maintenance, and improved steering performance. However, because the EPS system is a nonlinear model, difficult problems arise in controller design. To address these, various machine learning and artificial intelligence approaches, notably artificial neural networks (ANNs), have been applied. ANNs can effectively determine relationships between inputs and outputs in a data-driven manner. This research explores two main areas: designing an EPS identifier using an ANN-based backpropagation (BP) algorithm and enhancing the EPS system controller with an ANN-based Levenberg-Marquardt (LM) algorithm. The proposed ANN-based BP algorithm shows superior performance and accuracy compared to linear transfer function estimators, while the LM algorithm offers better input angle reference tracking and faster response times than traditional PID controllers. Overall, the proposed ANN methods demonstrate significant promise for improving EPS system performance.
Keywords: ANN backpropagation modelling, electric power steering, transfer function estimator, electrical vehicle driving system
Procedia PDF Downloads 43
1027 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network
Authors: Nasrin Bakhshizadeh, Ashkan Forootan
Abstract:
A major problem affecting the quality control of polymer in industrial polymerization is the lack of suitable on-line measurement tools to evaluate polymer properties such as the melt and density indices. Conventionally, the polymerization is controlled manually by taking samples, measuring the quality of the polymer in the lab, and recording the results. This method is highly time-consuming and leads to producing a large number of non-conforming products. The online application for estimating the melt index and density proposed in this study is a neural network based on input-output data from a polyethylene production plant. The temperature, the level of the reactors' bed, the mass flow rates of ethylene, hydrogen, and butene-1, and the molar concentrations of ethylene, hydrogen, and butene-1 are used to establish the neural model. The neural network is trained on actual operational data using backpropagation and Levenberg-Marquardt techniques. The simulation results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, successfully predicts these quality properties.
Keywords: polyethylene, polymerization, density, melt index, neural network
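A runnable sketch of such a property estimator is given below using scikit-learn's MLP regressor on synthetic stand-in data; note that scikit-learn trains with Adam or L-BFGS rather than Levenberg-Marquardt, so this is an analogue of the paper's network, and the eight input columns only mirror the kinds of plant variables listed.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Columns mirror the abstract's inputs: temperature, bed level, and the flows /
# molar concentrations of ethylene, hydrogen, and butene-1 (synthetic stand-ins).
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 8))
melt_index = X @ rng.normal(size=8) + 0.1 * rng.normal(size=2000)  # fake target

# One hidden layer, trained by backpropagation of the loss gradient
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X[:1500], melt_index[:1500])
print("R^2 on held-out data:", model.score(X[1500:], melt_index[1500:]))
```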
Procedia PDF Downloads 144
1026 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs
Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar
Abstract:
The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast, or decision where there is significant uncertainty. First, the project performs Monte Carlo simulation on a given dataset using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, an algorithm for the simulation has been developed for MATLAB, and the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, minimum, maximum, most likely, etc.). It also prompts the user for the desired probability for which reserves are to be calculated. The algorithm so developed and tested in MATLAB was further implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better outcomes. With PyQt Designer, code for a simple graphical user interface was also written. The resulting plots were then validated against available results from the U.S. DOE MonteCarlo software.
Keywords: simulation, probability, confidence interval, sensitivity analysis
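The Python/NumPy core of such a tool fits in a few lines: sample the volumetric inputs from user-chosen distributions, compute reserves for each sample, and read off percentiles at the desired probability. The distributions and ranges below are illustrative assumptions, not field data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Volumetric inputs as distributions (illustrative ranges, not field data)
area_acres = rng.triangular(500, 800, 1200, n)       # min, most likely, max
thickness_ft = rng.triangular(20, 35, 60, n)
porosity = rng.normal(0.18, 0.03, n).clip(0.05, 0.30)
sw = rng.normal(0.35, 0.05, n).clip(0.1, 0.6)        # water saturation
bo = 1.2                                             # formation volume factor, rb/stb
recovery = rng.uniform(0.15, 0.35, n)                # recovery factor

# OOIP (stb) = 7758 * A * h * phi * (1 - Sw) / Bo ; reserves = OOIP * RF
ooip = 7758 * area_acres * thickness_ft * porosity * (1 - sw) / bo
reserves = ooip * recovery

# Probabilistic reserves: P90 (conservative), P50, P10 (optimistic)
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e} stb")
```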
Procedia PDF Downloads 382
1025 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department
Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee
Abstract:
In recent years, demand for healthcare services has dramatically increased. As the demand for healthcare services increases, so does the necessity of constructing new healthcare buildings and redesigning and renovating existing ones. Increasing demands necessitate the use of optimization techniques to improve the overall service efficiency of healthcare settings. However, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department. The ProM framework is used to discover clinical pathway patterns and relationships between activities. A sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results served as the input for the next phase, which consists of the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicated that a carefully designed layout can significantly decrease the distances that patients must travel.
Keywords: mixed integer programming, facility layout problem, process mining, healthcare operations management
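The optimization phase can be sketched as a tiny linearized quadratic-assignment MIP in PuLP: assign departments to candidate locations so as to minimize flow-weighted travel distance, where the flow matrix would come from the process mining phase. The sizes and numbers below are toy assumptions.

```python
from itertools import product
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

depts, sites = range(3), range(3)
flow = [[0, 8, 2], [8, 0, 5], [2, 5, 0]]   # patient transitions (from process mining)
dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]   # walking distances between locations

m = LpProblem("ed_layout", LpMinimize)
x = LpVariable.dicts("x", (depts, sites), cat=LpBinary)   # dept d placed at site s
y = LpVariable.dicts("y", (depts, sites, depts, sites), lowBound=0)

for d in depts:
    m += lpSum(x[d][s] for s in sites) == 1               # each dept gets one site
for s in sites:
    m += lpSum(x[d][s] for d in depts) == 1               # each site gets one dept

# Linearize x[d1][s1]*x[d2][s2] with y >= x1 + x2 - 1 (tight under minimization)
for d1, s1, d2, s2 in product(depts, sites, depts, sites):
    m += y[d1][s1][d2][s2] >= x[d1][s1] + x[d2][s2] - 1

m += lpSum(flow[d1][d2] * dist[s1][s2] * y[d1][s1][d2][s2]
           for d1, s1, d2, s2 in product(depts, sites, depts, sites))
m.solve()
print([(d, s) for d in depts for s in sites if x[d][s].value() == 1])
```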
Procedia PDF Downloads 339
1024 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models
Authors: Jay L. Fu
Abstract:
Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbors, support vector machine, random forest, and neural network, were developed. The data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the models were compared to arrive at the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. Across all the models used, the percentage of testing inputs for which at least four of the five models agreed on the diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to early treatment of these patients.
Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction
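A sketch of the best-performing setup (random forest) is shown below with scikit-learn; the synthetic feature columns only mimic the kinds of predictors named (MMSE, nWBV, gender), so the accuracy printed is meaningless except as a demonstration of the workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for the 373-session dataset; MMSE, nWBV, and gender were reported
# as the top predictors (synthetic values here, not the real data).
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.integers(15, 31, 373),        # MMSE score
    rng.normal(0.73, 0.04, 373),      # normalized whole-brain volume (nWBV)
    rng.integers(0, 2, 373),          # gender (encoded)
    rng.integers(60, 96, 373),        # age
])
y = rng.integers(0, 2, 373)           # demented / nondemented label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
print("feature importances:", clf.feature_importances_)  # ranks the risk factors
```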
Procedia PDF Downloads 143
1023 CTHTC: A Convolution-Backed Transformer Architecture for Temporal Knowledge Graph Embedding with Periodicity Recognition
Authors: Xinyuan Chen, Mohd Nizam Husen, Zhongmei Zhou, Gongde Guo, Wei Gao
Abstract:
Temporal Knowledge Graph Completion (TKGC) has attracted increasing attention for its enormous value; however, existing models lack the capability to capture both local interactions and global dependencies simultaneously along with evolutionary dynamics, and the latest achievements in convolutions and Transformers have not yet been employed in this area. Moreover, periodic patterns in TKGs have not been fully explored either. To this end, a multi-stage hybrid architecture with convolution-backed Transformers is introduced to TKGC tasks for the first time, combined with the Hawkes process to model evolving event sequences in a continuous-time domain. In addition, seasonal-trend decomposition is adopted to identify periodic patterns. Experiments on six public datasets are conducted to verify the model's effectiveness against state-of-the-art (SOTA) methods. An extensive ablation study evaluates architecture variants as well as the contributions of individual components, paving the way for further exploitation. Besides complexity analysis, input sensitivity and safety challenges are also thoroughly discussed with novel methods for comprehensiveness.
Keywords: temporal knowledge graph completion, convolution, transformer, Hawkes process, periodicity
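The periodicity-recognition ingredient can be illustrated independently of the full architecture: the sketch below applies seasonal-trend decomposition (STL from statsmodels) to a synthetic weekly-cyclic event series, which is one plausible reading of how periodic patterns in timestamped facts could be identified; the series and the seasonal-strength diagnostic are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Synthetic event-frequency series with a weekly cycle, standing in for the
# periodic patterns a TKG's timestamped facts might exhibit.
idx = pd.date_range("2023-01-01", periods=365, freq="D")
y = pd.Series(5 + 2 * np.sin(2 * np.pi * np.arange(365) / 7)
              + np.random.default_rng(0).normal(0, 0.5, 365), index=idx)

# STL separates the periodic component the model can exploit from the
# long-run trend and the residual noise.
res = STL(y, period=7).fit()
seasonal_strength = 1 - res.resid.var() / (res.resid + res.seasonal).var()
print(f"seasonal strength: {seasonal_strength:.2f}")   # near 1 => strong cycle
```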
Procedia PDF Downloads 78
1022 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods
Authors: Bandar Alahmadi, Lethia Jackson
Abstract:
Machine learning models are used today in many real-life applications, so the safety and security of such models is important if their results are to be as accurate as possible. One challenge to machine learning model security is the adversarial examples attack. Adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run the result through the classifier. The algorithm modifies the detected region using two methods. First, it overlays an abstract image matrix on the back of the detected region's matrix. Then, it performs a rotation attack, rotating the detected region around its axes and embedding the trace of the image in the image background. Finally, the attacked region is placed back in its original position, from where it was removed, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm proved efficient: the classifier's confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm worked successfully.
Keywords: adversarial examples, attack, computer vision, image processing
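A sketch of the rotation-attack step is given below: extract the detected important region, rotate it in place, and paste it back. The bounding box is assumed to come from the detector (e.g., a cascade classifier), and the angle and interpolation details are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def rotation_attack(image, bbox, angle_deg=15):
    """Sketch of the described region-modification attack: extract the detected
    important region, rotate it, and paste it back in its original position.
    The detector itself (e.g., a cascade classifier) is assumed to supply bbox."""
    x, y, w, h = bbox
    region = image[y:y + h, x:x + w].copy()
    # reshape=False keeps the rotated patch the same size so it fits back
    rotated = rotate(region, angle_deg, reshape=False, mode="nearest")
    attacked = image.copy()
    attacked[y:y + h, x:x + w] = rotated
    return attacked

img = np.random.rand(128, 128)           # stand-in for a classified image
adv = rotation_attack(img, bbox=(40, 40, 48, 48))
```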
Procedia PDF Downloads 339