Search results for: single valued neutrosophic hesitant fuzzy rough set
2515 Ubiquitous Life People Informatics Engine (U-Life PIE): Wearable Health Promotion System
Authors: Yi-Ping Lo, Shi-Yao Wei, Chih-Chun Ma
Abstract:
Since Google launched Google Glass in 2012, a number of commercial wearable devices have been released, such as smart belts, smart bands, smart shoes, and smart clothes. However, most of these devices act merely as sensors that display measurement readings, and few of them provide interactive feedback to the user. Furthermore, these devices are single-task devices that are not able to communicate with each other. In this paper, a new health promotion system, the Ubiquitous Life People Informatics Engine (U-Life PIE), is presented. This engine consists of the People Informatics Engine (PIE) and an interactive user interface. The PIE collects all the data from compatible devices, analyzes the data comprehensively, and communicates between devices via various application programming interfaces. All data and information are stored on the PIE unit; therefore, the user is able to view instant and historical data on their mobile devices at any time. It also provides real-time, hands-free feedback and instructions through the user interface visually, acoustically, and tactilely. This feedback and these instructions prompt the user to adjust their posture or habits in order to avoid physical injury and prevent illness.
Keywords: machine learning, wearable devices, user interface, user experience, internet of things
Procedia PDF Downloads 294

2514 Implementation of Data Science in Field of Homologation
Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande
Abstract:
For the use and import of keys and ID transmitters as well as body control modules with radio transmission in many countries, homologation is required. The final deliverables of product homologation are certificates. Across the world of homologation, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data, such as the expiry date or approval date. It is essential to obtain accurate data from each certificate, as inaccuracy may lead to missed re-homologation of certificates and thus to a non-compliance situation. There is therefore scope for automating the reading of certificate data in the field of homologation. We use deep learning as the tool for this automation. We first trained a model by providing each country's basic data, feeding PDF and JPG files through an ETL process; the model is trained only once. The trained model then yields increasingly accurate results. As an outcome, the expiry date and approval date of a certificate are obtained with a single click. This will eventually help to implement automation features on a broader level in the database where certificates are stored and will reduce human error to an almost negligible level.
Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract, transform, load)
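The extraction step itself is not detailed in the abstract; as a minimal sketch of the kind of post-processing described (the labels, date formats and function names below are illustrative assumptions, not the authors' trained model), labelled dates could be pulled from OCR'd certificate text as follows:

```python
import re
from datetime import datetime

# Hypothetical label variants that might precede a date on a certificate.
DATE_LABELS = {
    "approval_date": [r"approval\s+date", r"date\s+of\s+issue"],
    "expiry_date": [r"expiry\s+date", r"valid\s+until"],
}

# Date formats tried when normalising a matched string.
DATE_FORMATS = ("%d.%m.%Y", "%d/%m/%Y", "%Y-%m-%d", "%d %B %Y")

def parse_date(raw: str):
    """Try the known formats and return a datetime.date, or None."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date()
        except ValueError:
            continue
    return None

def extract_certificate_dates(ocr_text: str) -> dict:
    """Pull labelled dates out of OCR'd certificate text."""
    found = {}
    for field, labels in DATE_LABELS.items():
        for label in labels:
            m = re.search(label + r"\s*[:\-]?\s*([0-9./\- A-Za-z]{6,20})",
                          ocr_text, flags=re.IGNORECASE)
            if m and (date := parse_date(m.group(1))):
                found[field] = date
                break
    return found

if __name__ == "__main__":
    sample = "Approval date: 12.03.2019\nValid until 12/03/2029"
    print(extract_certificate_dates(sample))
```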
Procedia PDF Downloads 163

2513 Effects of Physical Activity on the Association of CETP Gene with HDL Cholesterol Levels in Korean Population
Authors: Jae Woong Sull, Sun Ha Jee
Abstract:
High-density lipoprotein (HDL) cholesterol levels are associated with a decreased risk of coronary artery disease. Several genome-wide association studies (GWAS) for HDL cholesterol levels have implicated the cholesterol ester transfer protein (CETP) gene as possibly causal. We tested the association between single nucleotide polymorphisms (SNPs) in the CETP gene and HDL cholesterol levels in a Korean population. Subjects were selected from the Korean Metabolic Syndrome Research Initiative study in the Bundang-Gu area. A total of 2,304 individuals from Bundang-Gu were recruited in 2008; further subjects were selected from the Severance Hospital (N=4,294). SNP rs6499861 in the CETP gene was associated with mean HDL cholesterol levels (effect per allele -2.044 mg/dL, p=7.23×10⁻⁷). Subjects with the CG/GG genotype had a 1.46-fold (range 1.24–1.72-fold) higher risk of having abnormal HDL cholesterol levels (<40 mg/dL) than subjects with the CC genotype. When analyzed by gender, the association with CETP was stronger in women than in men. When analyzed by physical activity, the association with CETP was much stronger in male subjects with low physical activity (OR=1.54, 95% CI: 1.23-1.92, P=0.0001) than in male subjects with high physical activity. This study clearly demonstrates that genetic variants in CETP influence HDL cholesterol levels in Korean adults.
Keywords: CETP, HDL cholesterol, physical activity, polymorphisms
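As an illustration of the kind of per-allele analysis reported above (on synthetic data with an assumed effect size; this is not the study's dataset or code), a linear model for the HDL effect and a logistic model for the odds of abnormal HDL could be fitted as follows:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic example only: 0/1/2 copies of the G allele per subject.
n = 2000
g_allele = rng.binomial(2, 0.3, size=n)
# Assume (for illustration) each G allele lowers HDL by about 2 mg/dL.
hdl = 52 - 2.0 * g_allele + rng.normal(0, 10, size=n)
low_hdl = (hdl < 40).astype(int)

X = sm.add_constant(g_allele.astype(float))

# Per-allele effect on mean HDL (linear model).
ols_fit = sm.OLS(hdl, X).fit()
print("effect per allele (mg/dL):", ols_fit.params[1])

# Odds ratio of abnormal HDL (<40 mg/dL) per allele (logistic model).
logit_fit = sm.Logit(low_hdl, X).fit(disp=0)
print("OR per allele:", np.exp(logit_fit.params[1]))
```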
Procedia PDF Downloads 286

2512 An Efficient Hybrid Approach Based on Multi-Agent System and Emergence Method for the Integration of Systematic Preventive Maintenance Policies
Authors: Abdelhadi Adel, Kadri Ouahab
Abstract:
This paper proposes a hybrid algorithm for integrating systematic preventive maintenance policies into hybrid flow shop scheduling in order to minimize the makespan. We have implemented a metaheuristic-based approach for optimizing the processing time. The proposed approach is inspired by the behavior of the human body: the hybridization combines a multi-agent system with mechanisms inspired by the human body, especially genetics. The effectiveness of the approach is demonstrated repeatedly in this paper. To solve such a complex problem, we propose an approach that uses advanced operators such as uniform crossover and single-point mutation. The proposed approach is applied to three preventive maintenance policies. These policies are intended to maximize availability or to maintain a minimum level of reliability along the production chain. We assume that the machines may be periodically unavailable during production scheduling. The results show that our algorithm outperforms existing algorithms.
Keywords: multi-agent systems, emergence, genetic algorithm, makespan, systematic maintenance, scheduling, hybrid flow shop scheduling
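A minimal sketch of the two genetic operators named above, assuming a simple integer-vector encoding in which each gene assigns a job to a machine (the encoding and parameter values are assumptions, not the authors' implementation):

```python
import random

def uniform_crossover(parent_a, parent_b, p_swap=0.5):
    """Uniform crossover: each gene is taken from either parent independently."""
    child_a, child_b = [], []
    for ga, gb in zip(parent_a, parent_b):
        if random.random() < p_swap:
            ga, gb = gb, ga
        child_a.append(ga)
        child_b.append(gb)
    return child_a, child_b

def single_point_mutation(chrom, n_machines):
    """Single-point mutation: reassign one randomly chosen job to a random machine."""
    child = list(chrom)
    point = random.randrange(len(child))
    child[point] = random.randrange(n_machines)
    return child

if __name__ == "__main__":
    random.seed(42)
    # Each gene = machine index assigned to the corresponding job (5 jobs, 3 machines).
    p1 = [0, 1, 2, 1, 0]
    p2 = [2, 2, 0, 1, 1]
    c1, c2 = uniform_crossover(p1, p2)
    print(c1, c2, single_point_mutation(c1, n_machines=3))
```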
Procedia PDF Downloads 336

2511 Rotor Concepts for the Counter Flow Heat Recovery Fan
Authors: Christoph Speer
Abstract:
Decentralized ventilation systems should combine a small and economical design with high aerodynamic and thermal efficiency. The Counter Flow Heat Recovery Fan (CHRF) meets these requirements by using only one cross-flow fan with a large number of blades, which generates both airflows and simultaneously acts as a regenerative counter-flow heat exchanger. The successful development of the first laboratory prototype has shown the potential of this ventilation system. Condensate that forms on the surfaces of the fan blades during the cold and dry season can be recovered through the characteristic mode of operation; hence the CHRF can avoid the need for frost protection and a condensate drain. Through the implementation of system-specific solutions for flow balancing and a summer bypass, the required functionality is assured. The scalability of the CHRF concept allows its use in renovations as well as in new buildings, from single-room devices through to systems for office buildings. The high aerodynamic and thermal efficiency and the lower number of required mechatronic components should enable a reduction in investment as well as operating costs. The rotor is the key component of the system; its requirements and possible implementation variants are presented.
Keywords: CHRF, counter flow heat recovery fan, decentralized ventilation system, renovation
Procedia PDF Downloads 355

2510 Multi-Spectral Deep Learning Models for Forest Fire Detection
Authors: Smitha Haridasan, Zelalem Demissie, Atri Dutta, Ajita Rattani
Abstract:
Aided by the wind, all it takes is one ember and a few minutes to create a wildfire. Wildfires are growing in frequency and size due to climate change, and wildfires and their consequences are among the major environmental concerns. Every year, millions of hectares of forest are destroyed around the world, causing mass destruction and human casualties. Early detection of wildfires is therefore a critical component of mitigating this threat. Many computer vision-based techniques have been proposed for the early detection of forest fires using video surveillance, operating in various spectra, namely RGB, HSV, and YCbCr. The aim of this paper is to propose a multi-spectral deep learning model that combines information from different spectra at intermediate layers for accurate fire detection. A heterogeneous dataset assembled from publicly available datasets is used for model training and evaluation in this study. The experimental results show that multi-spectral deep learning models can obtain an improvement of about 4.68% over those based on a single spectrum for fire detection.
Keywords: deep learning, forest fire detection, multi-spectral learning, natural hazard detection
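A minimal sketch of intermediate-layer fusion across several colour-space branches (the layer sizes and overall structure are illustrative assumptions, not the authors' model):

```python
import torch
import torch.nn as nn

class SpectrumBranch(nn.Module):
    """Small convolutional branch for one colour representation (e.g. RGB, HSV or YCbCr)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, x):
        return self.features(x)

class MultiSpectralFireNet(nn.Module):
    """Fuses branch feature maps at an intermediate layer, then classifies fire / no fire."""
    def __init__(self, n_branches=3):
        super().__init__()
        self.branches = nn.ModuleList(SpectrumBranch() for _ in range(n_branches))
        self.fusion = nn.Sequential(
            nn.Conv2d(32 * n_branches, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)

    def forward(self, inputs):          # inputs: list of tensors, one per colour space
        feats = [branch(x) for branch, x in zip(self.branches, inputs)]
        fused = torch.cat(feats, dim=1)  # intermediate-layer fusion
        pooled = self.fusion(fused).flatten(1)
        return self.classifier(pooled)

if __name__ == "__main__":
    rgb = hsv = ycbcr = torch.randn(4, 3, 64, 64)
    logits = MultiSpectralFireNet()(inputs=[rgb, hsv, ycbcr])
    print(logits.shape)  # torch.Size([4, 2])
```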
Procedia PDF Downloads 241

2509 Application of Principle Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in war or peacetime, the radar operator sees a scatter of targets on the screen. A target may be a tracked vehicle such as a tank (e.g., T72 or BMP), a wheeled vehicle such as an ALS, TATRA, 2.5-tonne truck or Shaktiman, or moving troops and convoys. The radar operator selects one of the promising targets in Single Target Tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in the headphones and, drawing on experience and training, identifies the target. However, this process is cumbersome and depends solely on the skill of the operator, and may therefore lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), to identify such targets. The classification is based on transforming the audible signature of the target into musical octave notes. The whole methodology is then automated in suitable software. This automation increases the efficiency of identifying the target by reducing the chance of misclassification. The whole study is based on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave notes, DSP
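A simplified sketch of the described pipeline (FFT of the audible signature, octave-band energies, then PCA) on synthetic signals; the band edges, sampling rate and data are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA

def octave_band_energies(signal, fs, f_low=62.5, n_bands=8):
    """Energy of the FFT power spectrum in successive octave bands (each band doubles in frequency)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    energies = []
    for k in range(n_bands):
        lo, hi = f_low * 2 ** k, f_low * 2 ** (k + 1)
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        energies.append(band.sum())
    return np.array(energies)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fs = 8000  # assumed sampling rate of the audible Doppler signature
    # Synthetic "audible signatures": noisy tones at different frequencies per target class.
    signals = [np.sin(2 * np.pi * f0 * np.arange(fs) / fs) + 0.3 * rng.normal(size=fs)
               for f0 in (200, 210, 950, 970, 2100, 2150)]
    features = np.vstack([octave_band_energies(s, fs) for s in signals])

    # PCA projects the octave-band features onto their principal components (eigenvectors).
    pca = PCA(n_components=2)
    projected = pca.fit_transform(features)
    print(projected.round(1))
```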
Procedia PDF Downloads 346

2508 Effects of Copper and Cobalt Co-Doping on Structural, Optical and Electrical Properties of TiO2 Thin Films Prepared by Sol Gel Method
Authors: Rabah Bensaha, Badreeddine Toubal
Abstract:
Un-doped TiO2, Co single-doped TiO2 and (Cu-Co) co-doped TiO2 thin films have been grown on silicon substrates by the sol-gel dip-coating technique. We investigated the effects of both the dopants and the annealing temperature on the structural, optical and electrical properties of the TiO2 films using X-ray diffraction (XRD), Raman and FTIR spectroscopy, atomic force microscopy (AFM), scanning electron microscopy (SEM), and UV-Vis spectroscopy. The chemical compositions of the Co-doped and (Cu-Co) co-doped TiO2 films were confirmed by the XRD, Raman and FTIR studies. The average grain size of the CoTiO3-TiO2 nanocomposites increased with annealing temperature. AFM and SEM reveal the various nanostructures of the CoTiO3-TiO2 nanocomposite thin films. The films exhibit a high optical reflectance with a large band gap. The highest electrical conductivity was obtained for the (Cu-Co) co-doped TiO2 films. The polyhedral surface morphology might improve the surface contact between particles and thus contribute to better electron mobility as well as conductivity. The obtained results suggest that the prepared TiO2 films can be used for optoelectronic applications.
Keywords: sol-gel, TiO2 thin films, CoTiO3-TiO2 nanocomposite films, electrical conductivity
Procedia PDF Downloads 442

2507 LCA/CFD Studies of Artisanal Brick Manufacture in Mexico
Authors: H. A. Lopez-Aguilar, E. A. Huerta-Reynoso, J. A. Gomez, J. A. Duarte-Moller, A. Perez-Hernandez
Abstract:
The environmental performance of artisanal brick manufacture in Mexico was studied by means of Life Cycle Assessment (LCA) methodology and Computational Fluid Dynamics (CFD) analysis. The main objective of this paper is to evaluate the environmental impact of artisanal brick manufacture. A cradle-to-gate LCA approach was complemented with CFD analysis to carry out an Environmental Impact Assessment (EIA). The life cycle includes the stages of extraction, baking and transportation to the gate. The functional unit of this study was the production of a single brick in Chihuahua, Mexico, and the impact categories studied were carcinogens, respiratory organics and inorganics, climate change, radiation, ozone layer depletion, ecotoxicity, acidification/eutrophication, land use, mineral use and fossil fuels. Laboratory techniques for fuel characterization, in situ gas measurements, and AP-42 emission factors were employed in order to calculate gas emissions for the inventory data. The results revealed that the categories with the greatest impacts are ecotoxicity and carcinogens. The CFD analysis is helpful in predicting the thermal diffusion and the spread of contaminants from a defined source. The LCA-CFD synergy complemented the EIA and allowed us to identify the problem of thermal efficiency within the system.
Keywords: LCA, CFD, brick, artisanal
Procedia PDF Downloads 393

2506 Ag-Cu and Bi-Cd Eutectics Ribbons under Superplastic Tensile Test Regime
Authors: Edgar Ochoa, G. Torres-Villasenor
Abstract:
Superplastic deformation is shown by materials with a fine grain size, usually less than 10 μm, when they are deformed within the strain rate range 10⁻⁵–10⁻¹ s⁻¹ at temperatures greater than 0.5 Tm, where Tm is the melting point in Kelvin. According to the constitutive equation for superplastic flow, refinement of the grain size would be expected to increase the optimum strain rate and decrease the temperature required for superplastic flow. Ribbons of eutectic Ag-Cu and Bi-Cd alloys were manufactured using a single-roller melt-spinning technique in order to obtain a fine grain structure for subsequent testing in the superplastic regime. The eutectic ribbons were examined by scanning electron microscopy and X-ray diffraction, and the grain size was determined using the image analysis software ImageJ. The average grain size was less than 1 μm. Tensile tests were carried out from 10⁻⁴ to 10⁻¹ s⁻¹ at room temperature to evaluate the superplastic behavior. The largest deformation was shown by the Bi-Cd eutectic ribbons, ε = 140%, despite these ribbons having a hexagonal unit cell. The Ag-Cu eutectic ribbons, on the other hand, have a smaller grain size and a cubic unit cell, yet they showed lower deformation in tensile tests under the same conditions than the Bi-Cd ribbons. This is because the Ag-Cu grew with a strong cube-cube orientation relationship.
Keywords: eutectic ribbon, fine grain, superplastic deformation, cube-cube orientation
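For reference, the constitutive equation invoked above is not reproduced in the abstract; a commonly quoted general form for superplastic flow (not necessarily the exact form the authors used) is

\dot{\varepsilon} = \frac{A\,D\,G\,b}{kT}\left(\frac{b}{d}\right)^{p}\left(\frac{\sigma}{G}\right)^{n}

where \dot{\varepsilon} is the strain rate, D a diffusion coefficient, G the shear modulus, b the Burgers vector, k the Boltzmann constant, T the absolute temperature, d the grain size, \sigma the applied stress, and A, p and n material constants. The inverse dependence on the grain size d (typically p ≈ 2-3) is what links grain refinement to a higher optimum strain rate and a lower required temperature, as stated above.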
Procedia PDF Downloads 169

2505 Educational Credit in Enhancing Collaboration between Universities and Companies in Smart City
Authors: Eneken Titov, Ly Hobe
Abstract:
Collaboration between universities and companies has been a challenging topic for many years, and although there are many good experiences, these tend to be isolated examples between a single university and a single company. In Ülemiste Smart City in Estonia, a new initiative was started in fall 2020, when five Estonian universities, led by the Ülemiste City development company Mainor, cooperated with the intention of providing charge-free university courses for Ülemiste City companies and their employees to encourage wider university-company collaboration. Every Ülemiste City company receives a certain number of free educational credit hours per year to participate in university courses. A simple, functional web platform was developed to mediate the university courses for the companies. Since January 2021, the educational credit platform has been open for all Ülemiste City companies and their employees to join, and the universities offer more than 9,000 hours of courses (approx. 150 ECTS). Just two months later, more than 20% of Ülemiste City companies (82 out of 400) had joined the project, and their employees had registered for more than 3,000 hours of courses in total. The first results already show that the project supports university marketing and a continuing-education mindset in general, while a quarter of the courses taken are paid courses (e.g., when the company has run out of free credit).
Keywords: education, educational credit, smart city, university-industry collaboration
Procedia PDF Downloads 204

2504 Model-Based Field Extraction from Different Class of Administrative Documents
Authors: Jinen Daghrir, Anis Kricha, Karim Kalti
Abstract:
The amount of incoming administrative documents is massive, and manually processing these documents is a costly task, especially in terms of time. This problem has driven a substantial amount of research and development on automatically extracting fields from administrative documents, in order to reduce costs and increase citizen satisfaction with administrations. In this context, we introduce an administrative document understanding system. Given a document in which a user has selected the fields to be retrieved for a document class, a document model is automatically built. A document model is represented by an attributed relational graph (ARG), where nodes represent the fields to extract and edges represent the relations between them; both vertices and edges carry feature vectors. When another document arrives in the system, its layout objects are extracted and an ARG is generated. Field extraction is then translated into the problem of matching two ARGs, which relies mainly on comparing the spatial relationships between layout objects. Experimental results yield accuracy rates from 75% to 100% on eight document classes. The proposed method performs well considering that the document model is constructed using only one single document.
Keywords: administrative document understanding, logical labelling, logical layout analysis, fields extraction from administrative documents
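A highly simplified illustration of the data structure described above; the names and features are assumed, and whereas the authors' method matches the two ARGs mainly through the spatial relations carried on the edges, this sketch only shows the graph construction and a greedy node-level match:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class ARG:
    """Attributed relational graph: nodes = layout objects / fields, edges = spatial relations."""
    node_attrs: dict = field(default_factory=dict)   # node id -> feature vector (e.g. x, y, w, h)
    edge_attrs: dict = field(default_factory=dict)   # (id_a, id_b) -> relation vector (e.g. dx, dy)

    def add_node(self, nid, feats):
        self.node_attrs[nid] = np.asarray(feats, dtype=float)

    def add_edge(self, a, b):
        self.edge_attrs[(a, b)] = self.node_attrs[b] - self.node_attrs[a]

def match_nodes(model: ARG, doc: ARG):
    """Greedy matching of document layout objects to model fields by attribute similarity."""
    assignment = {}
    for m_id, m_feat in model.node_attrs.items():
        best = min(doc.node_attrs, key=lambda d: np.linalg.norm(doc.node_attrs[d] - m_feat))
        assignment[m_id] = best
    return assignment

if __name__ == "__main__":
    model, doc = ARG(), ARG()
    model.add_node("invoice_no", [0.70, 0.10, 0.20, 0.03])  # normalised x, y, w, h
    model.add_node("total",      [0.70, 0.85, 0.20, 0.03])
    model.add_edge("invoice_no", "total")
    doc.add_node("obj1", [0.69, 0.12, 0.22, 0.03])
    doc.add_node("obj2", [0.71, 0.84, 0.18, 0.03])
    print(match_nodes(model, doc))   # {'invoice_no': 'obj1', 'total': 'obj2'}
```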
Procedia PDF Downloads 213

2503 Crack Width Analysis of Reinforced Concrete Members under Shrinkage Effect by Pseudo-Discrete Crack Model
Authors: F. J. Ma, A. K. H. Kwan
Abstract:
Cracking caused by the shrinkage movement of concrete is a serious problem, especially when restraint is provided, and it may cause severe serviceability and durability problems. The existing methods for predicting the crack width of concrete due to shrinkage movement are mainly numerical methods under simplified circumstances, and they do not agree with each other. To obtain a more unified prediction method applicable to more sophisticated circumstances, finite element crack width analysis for the shrinkage effect should be developed. However, no existing finite element analysis can be carried out to predict the crack width of concrete due to shrinkage movement because of unresolved limitations of conventional finite element analysis. In this paper, a crack width analysis implemented by finite element analysis is presented with a pseudo-discrete crack model, which combines the traditional smeared crack model with a newly proposed crack queuing algorithm. The proposed pseudo-discrete crack model is capable of simulating separate, individual cracks without adopting discrete crack elements. The improved finite element analysis can successfully simulate the stress redistribution when concrete is cracked, which is crucial for predicting crack width, crack spacing and crack number.
Keywords: crack queuing algorithm, crack width analysis, finite element analysis, shrinkage effect
Procedia PDF Downloads 419

2502 Tetra Butyl Ammonium Cyanate Mediated Selective Synthesis of Sulfonyltriuret and Their Investigation towards Trypsin Protease Modulation
Authors: Amarjyoti Das Mahapatra, Umesh Kumar, Bhaskar Datta
Abstract:
A pseudo-peptide can mimic the biological or structural properties of natural peptides. Pseudo-peptides have attracted increasing attention in medicinal chemistry because of advantages such as higher bioavailability and lower biodegradation compared to physiologically active native peptides, which broadens their therapeutic applications. Many biologically active compounds contain urea as a functional group and have improved pharmacokinetic properties because of their bioavailability and metabolic stability. Recently, we reported a single-step synthesis of sulfonylurea and sulfonyltriuret from sulfonyl chloride and sodium cyanate. However, the yield of sulfonyltriuret was limited to around 40-60% because of the formation of other products such as sulfonamides and sulfonylureas. In the present work, we focus on the selective synthesis of sulfonyltriuret using tetrabutylammonium cyanate and sulfonyl chloride. More precisely, we are interested in the controlled synthesis of oligomeric ureas, mainly sulfonyltriuret, as a new class of pseudo-peptides and in their application as protease modulators. The distinctive architecture of these molecules, in the form of their pseudo-peptide backbone, offers promise as a potential pharmacophore. The synthesized molecules have been screened against the trypsin enzyme, and we observed that they are efficient modulators of this enzyme.
Keywords: pseudo peptide, pharmacophore, sulfonyltriuret, trypsin
Procedia PDF Downloads 167

2501 Air-Coupled Ultrasonic Testing for Non-Destructive Evaluation of Various Aerospace Composite Materials by Laser Vibrometry
Authors: J. Vyas, R. Kazys, J. Sestoke
Abstract:
Air-coupled ultrasonics is a contactless ultrasonic measurement approach that has become widespread for material characterization in the aerospace industry, where minimum weight without compromising durability is an essential requirement. To achieve these requirements, composite materials are widely used. This paper presents an analysis of air-coupled ultrasonics for composite materials such as CFRP (Carbon Fibre Reinforced Polymer), GLARE (Glass Fibre Metal Laminate) and honeycombs used in the design of modern aircraft. Laser vibrometry could be a key characterization tool for aerospace components. The fundamentals of air-coupled ultrasonics, including the principles, working modes and transducer arrangements used for this purpose, are also briefly recounted. The emphasis of this paper is on NDT techniques developed on the basis of ultrasonic guided wave applications and on the possibilities of using laser vibrometry for non-contact measurement of guided waves in different materials. A 3D assessment technique is described that employs a single-point laser head with automatic scanning relocation of the material to assess the mechanical displacement, including the pros and cons of composite materials with defects and delaminations for aerospace applications.
Keywords: air-coupled ultrasonics, contactless measurement, laser interferometry, NDT, ultrasonic guided waves
Procedia PDF Downloads 239

2500 A Study on the Annual Doses Received by the Workers of Some Medical Practices
Authors: Eltayeb Hamad Elneel Yousif
Abstract:
This paper describes the occupational radiation doses of workers in non-destructive testing (NDT) and some medical practices during the year 2007; the annual doses received by the workers of a public hospital are presented. The department is equipped with a HARSHAW model 6600 reader and is assigned the role of personal monitoring in order to contribute to controlling and reducing the doses received by radiation workers. TLD cards with two TLD chips of type LiF:Mg,Ti (TLD-100) were calibrated to measure the personal dose equivalent Hp(10). Around 150 medical radiation workers were monitored throughout the year. Each worker received a single TLD card, worn on the chest above the lead apron and returned for laboratory reading every two months. The average annual doses received by the workers in radiotherapy, nuclear medicine and diagnostic radiology were evaluated. The annual doses for individual radiation workers ranged between 0.55-4.42 mSv, 0.48-1.86 mSv, and 0.48-0.91 mSv for the workers in radiotherapy, nuclear medicine and diagnostic radiology, respectively. The mean dose per worker was 1.29±1, 1.03±0.4, and 0.69±0.2 mSv, respectively. The results showed compliance with international dose limits. Our results reconfirm the importance of a personal dosimetry service in assuring the radiation protection of medical staff in developing countries.
Keywords: radiation medicine, non-destructive testing, TLD, public hospital
Procedia PDF Downloads 379

2499 Design and Analysis of Electric Power Production Unit for Low Enthalpy Geothermal Reservoir Applications
Authors: Ildar Akhmadullin, Mayank Tyagi
Abstract:
The subject of this paper is the design analysis of a single-well power production unit for low-enthalpy geothermal resources. The complexity of the project stems from the low-temperature heat source, which usually makes such projects economically disadvantageous under the conventional binary power plant approach. A proposed new compact design is numerically analyzed. This paper describes the thermodynamic analysis, the choice of working fluid, and the downhole heat exchanger (DHE) and turbine calculation results. The unit is able to produce 321 kW of electric power from a low-enthalpy underground heat source utilizing n-pentane as the working fluid. A geo-pressured reservoir located in Vermilion Parish, Louisiana, USA, is selected as a prototype for the field application. With a brine temperature of 126 °C, the optimal length of the DHE is determined as 304.8 m (1000 ft). All units (pipes, turbine, and pumps) are chosen from commercially available parts to bring the project closer to industry requirements. The numerical calculations are based on petroleum industry standards. The project is sponsored by the US Department of Energy.
Keywords: downhole heat exchangers, geothermal power generation, organic Rankine cycle, refrigerants, working fluids
Procedia PDF Downloads 315

2498 3D Numerical Investigation of Asphalt Pavements Behaviour Using Infinite Elements
Authors: K. Sandjak, B. Tiliouine
Abstract:
This article presents the main results of a three-dimensional (3-D) numerical investigation of the behaviour of asphalt pavement structures using a coupled Finite Element-Mapped Infinite Element (FE-MIE) model. The validation and numerical performance of this model are assessed by comparing critical pavement responses with Burmister's solution and with FEM simulation results for multi-layered elastic structures. The coupled model is then used to perform 3-D simulations of a typical asphalt pavement structure in order to investigate the impact of two tire configurations (conventional dual and new-generation wide-base tires) on critical pavement response parameters. The numerical results obtained show the effectiveness and accuracy of the coupled (FE-MIE) model. In addition, the simulation results indicate that, compared with the conventional dual tire assembly, the single wide-base tire causes slightly greater asphalt fatigue cracking and subgrade rutting potentials; it can nevertheless be considered in view of its potential to provide numerous mechanical, economic, and environmental benefits.
Keywords: 3-D numerical investigation, asphalt pavements, dual and wide-base tires, infinite elements
Procedia PDF Downloads 215

2497 Explainable Graph Attention Networks
Authors: David Pham, Yongfeng Zhang
Abstract:
Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs, such as Graph Neural Networks (GNNs), on various data mining and machine learning tasks. However, most deep learning models on graphs cannot easily explain their predictions and are thus often labelled as "black boxes." For example, the Graph Attention Network (GAT) is a frequently used GNN architecture that adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not, and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both the accuracy and the explainability of the problem space and show that, in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.
Keywords: explainable AI, graph attention network, graph neural network, node classification
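For context, a minimal single-head graph attention layer that also returns its attention coefficients so they can be inspected (an illustrative sketch, not the XGAT architecture itself):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention layer that also exposes the attention matrix for inspection."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (N, in_dim) node features, adj: (N, N) 0/1 adjacency (self-loops included)
        z = self.W(h)                                         # (N, out_dim)
        n = z.size(0)
        z_i = z.unsqueeze(1).expand(n, n, -1)                 # source features
        z_j = z.unsqueeze(0).expand(n, n, -1)                 # neighbour features
        e = F.leaky_relu(self.a(torch.cat([z_i, z_j], dim=-1)).squeeze(-1), 0.2)
        e = e.masked_fill(adj == 0, float("-inf"))            # only attend to neighbours
        alpha = torch.softmax(e, dim=-1)                      # attention coefficients
        return alpha @ z, alpha                               # aggregated features + weights

if __name__ == "__main__":
    h = torch.randn(4, 8)                 # 4 nodes, 8 features each
    adj = torch.eye(4)
    adj[0, 1] = adj[1, 0] = adj[2, 3] = adj[3, 2] = 1.0
    out, alpha = GATLayer(8, 16)(h, adj)
    print(out.shape, alpha.shape)         # torch.Size([4, 16]) torch.Size([4, 4])
```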
Procedia PDF Downloads 199

2496 Computational Study of Flow and Heat Transfer Characteristics of an Incompressible Fluid in a Channel Using Lattice Boltzmann Method
Authors: Imdat Taymaz, Erman Aslan, Kemal Cakir
Abstract:
The Lattice Boltzmann Method (LBM) is used to computationally investigate the laminar flow and heat transfer of an incompressible fluid with constant material properties in a 2D channel with a built-in triangular prism. Both momentum and energy transport are modelled by the LBM. A uniform lattice structure with a single-time-relaxation rule is used. Interpolation methods are applied to obtain higher flexibility on the computational grid, with the information transferred from the lattice structure to the computational grid by Lagrange interpolation. The flow is investigated for different Reynolds numbers, while the Prandtl number is kept constant at 0.7. The results show how the presence of a triangular prism affects the flow and heat transfer patterns for the steady-state and unsteady-periodic flow regimes. As an evaluation of the accuracy of the developed LBM code, the results are compared with those obtained by a commercial CFD code. It is observed that the present LBM code produces results of similar accuracy to the well-established CFD code; additionally, the LBM needs much less CPU time for the prediction of the unsteady phenomena.
Keywords: laminar forced convection, LBM, triangular prism
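A minimal sketch of the single-relaxation-time (BGK) collide-and-stream step on a D2Q9 lattice, covering momentum transport only; the energy equation, the Lagrange interpolation to the computational grid and the prism geometry used in the paper are omitted:

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights.
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """BGK equilibrium distribution for each of the 9 directions."""
    eu = np.tensordot(e, u, axes=([1], [0]))          # (9, nx, ny)
    usq = u[0]**2 + u[1]**2
    return rho * w[:, None, None] * (1 + 3*eu + 4.5*eu**2 - 1.5*usq)

def lbm_step(f, tau):
    """One collide-and-stream step with a single relaxation time tau."""
    rho = f.sum(axis=0)
    u = np.tensordot(e.T, f, axes=([1], [0])) / rho   # (2, nx, ny)
    f = f - (f - equilibrium(rho, u)) / tau           # BGK collision
    for k in range(9):                                # streaming (periodic for brevity)
        f[k] = np.roll(f[k], shift=tuple(e[k]), axis=(0, 1))
    return f, rho, u

if __name__ == "__main__":
    nx, ny, tau = 64, 32, 0.8
    f = equilibrium(np.ones((nx, ny)), np.zeros((2, nx, ny)))
    for _ in range(100):
        f, rho, u = lbm_step(f, tau)
    print(rho.mean())   # mass is conserved (close to 1.0)
```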
Procedia PDF Downloads 373

2495 Influence of UV/Ozone Treatment on the Electrical Performance of Polystyrene Buffered Pentacene-Based OFETs
Authors: Lin Gong, Holger Göbel
Abstract:
In the present study, we have investigated the influence of UV/ozone treatment on pentacene-based organic field effect transistors (OFETs) with a bilayer gate dielectric. The OFETs for this study were fabricated on heavily n-doped Si substrates with a thermally deposited SiO2 dielectric layer (300 nm). On the SiO2 dielectric, a very thin (≈ 15 nm) buffer layer of polystyrene (PS) was first spin-coated and then treated by UV/ozone to modify the surface prior to the deposition of pentacene. We found that extending the UV/ozone treatment time monotonically shifted the threshold voltage of the OFETs towards positive values, whereas the field effect mobility first decreased but eventually reached a stable value after a treatment time of approximately thirty seconds. Since the field effect mobility of the UV/ozone-treated bilayer OFETs was found to be higher than that of a comparable transistor with a single-layer dielectric, we propose that the bilayer (SiO2/PS) structure can be used to shift the threshold voltage to a desired value without sacrificing field effect mobility.
Keywords: buffer layer, organic field effect transistors, threshold voltage, UV/ozone treatment
Procedia PDF Downloads 337

2494 Enhancing a Recidivism Prediction Tool with Machine Learning: Effectiveness and Algorithmic Fairness
Authors: Marzieh Karimihaghighi, Carlos Castillo
Abstract:
This work studies how Machine Learning (ML) may be used to increase the effectiveness of a criminal recidivism risk assessment tool, RisCanvi. The two key dimensions of this analysis are predictive accuracy and algorithmic fairness. The ML-based prediction models obtained in this study are more accurate at predicting criminal recidivism than the manually created formula used in RisCanvi, achieving an AUC of 0.76 and 0.73 in predicting violent and general recidivism, respectively. However, the improvements are small, and we note that algorithmic discrimination can easily be introduced between groups such as nationals vs. foreigners, or young vs. old. We describe how effectiveness and algorithmic fairness objectives can be balanced by applying a method in which a single error disparity, expressed in terms of the generalized false positive rate, is minimized while calibration is maintained across groups. The results show that this bias mitigation procedure can substantially reduce generalized false positive rate disparities across multiple groups. Based on these results, we propose that ML-based criminal recidivism risk prediction should not be introduced without applying algorithmic bias mitigation procedures.
Keywords: algorithmic fairness, criminal risk assessment, equalized odds, recidivism
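As an illustration of the disparity measure discussed above, using one common definition of the generalized false positive rate (the mean risk score assigned to true negatives) on synthetic data:

```python
import numpy as np

def generalized_fpr(scores, labels):
    """Mean predicted risk score assigned to true negatives (one common definition of generalized FPR)."""
    scores, labels = np.asarray(scores, dtype=float), np.asarray(labels)
    return scores[labels == 0].mean()

def gfpr_disparity(scores, labels, groups):
    """Largest pairwise gap in generalized FPR across groups (e.g. nationals vs. foreigners)."""
    rates = {g: generalized_fpr(scores[groups == g], labels[groups == g])
             for g in np.unique(groups)}
    return rates, max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n = 1000
    groups = rng.choice(["A", "B"], size=n)
    labels = rng.binomial(1, 0.3, size=n)           # 1 = recidivated
    scores = np.clip(0.3 * labels + rng.normal(0.35, 0.15, size=n), 0, 1)
    scores[groups == "B"] += 0.05                   # inject a small bias for illustration
    scores = np.clip(scores, 0, 1)
    print(gfpr_disparity(scores, labels, groups))
```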
Procedia PDF Downloads 152

2493 Effect of Stitching Pattern on Composite Tubular Structures Subjected to Quasi-Static Crushing
Authors: Ali Rabiee, Hessam Ghasemnejad
Abstract:
An extensive experimental investigation of the effect of stitching pattern on tubular composite structures was conducted. The effect of through-thickness stitching reinforcement using glass flux yarn on the energy absorption of fibre-reinforced polymer (FRP) tubes was investigated under high-speed axial loading conditions. With the mass of the structure kept at 125 grams, applying different stitching patterns at various locations in theory enables better energy absorption and also allows control over the shape of the force-crush distance curve. The study compares a simple non-stitched absorber with single- and multi-location stitching and examines the effect on energy absorption capabilities. The reinforcement locations are 10 mm, 20 mm, 30 mm, 10-20 mm, 10-30 mm, 20-30 mm, 10-20-30 mm and 10-15-20-25-30-35 mm from the top of the specimen. Through-thickness reinforcement was shown to increase the energy absorption capability and the crushing load; notably, the closer together the stitching locations are, the more the crushing load, and consequently the absorbed energy, increases. Implementing this idea would improve the mean force by applying stitching and controlling the behaviour of the force-crush distance curve.
Keywords: through-thickness stitching, 3D reinforcement, energy absorption, tubular composite structures
Procedia PDF Downloads 262

2492 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model 1: Description
Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu
Abstract:
Vegetation has a marked effect on runoff and has become an important component of hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant values for the vegetation and hydraulic parameters throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model which is capable of simulating all crops using unique parameter values for each crop. The simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model was employed for evaluating vegetative resistance through hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow and will further enhance the hydrologic model's capability for accurate hydrologic studies.
Keywords: runoff, roughness coefficient, PAR, WRM model
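As an illustration of an EPIC-style daily growth step driven by intercepted PAR (the coefficients below are typical textbook values assumed for illustration; the full EPIC model additionally applies water, temperature and nutrient stress factors not shown here):

```python
import numpy as np

def daily_biomass_increment(solar_mj_m2, lai, be=30.0, k_ext=0.65):
    """EPIC-style potential biomass increase from intercepted photosynthetically active radiation.

    solar_mj_m2 : daily solar radiation (MJ m-2)
    lai         : leaf area index (-)
    be          : crop biomass-energy ratio (kg ha-1 per MJ m-2), crop-specific
    k_ext       : light extinction coefficient (-)
    """
    par = 0.5 * solar_mj_m2                       # PAR assumed to be half of solar radiation
    ipar = par * (1.0 - np.exp(-k_ext * lai))     # intercepted PAR (Beer's law)
    return be * ipar                              # potential biomass increment (kg ha-1 day-1)

if __name__ == "__main__":
    for lai in (0.5, 2.0, 4.0):
        print(lai, round(daily_biomass_increment(solar_mj_m2=20.0, lai=lai), 1))
```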
Procedia PDF Downloads 378

2491 Modelling Water Usage for Farming
Authors: Ozgu Turgut
Abstract:
Water scarcity is a problem in many regions that requires immediate action, and solutions cannot be postponed for long. It is known that farming consumes a significant portion of usable water, although in recent years efforts to transition from surface irrigation to drip or sprinkler systems have started to pay off. It is also known that this transition does not necessarily translate into an increase in the capacity dedicated to other water consumption channels such as city water or power generation. In order to control and allocate the water resource more purposefully, new watering systems have to be used with monitoring abilities that can limit the usage capacity of each farm. In this study, a decision support model relying on bi-objective stochastic linear optimization is proposed, which takes crop yield and price volatility into account. The model generates annual planting plans as well as water usage limits for each farmer in the region while taking into account the total value (i.e., profit) of the overall harvest. The mathematical model is solved optimally using the L-shaped method. The decision support model can be especially useful for regional administrations in planning next year's planting and the associated water incomes and expenses. For this reason, not only a single optimum but also a set of representative solutions from the Pareto set is generated with the proposed approach.
Keywords: decision support, farming, water, tactical planning, optimization, stochastic, Pareto
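A schematic deterministic-equivalent formulation consistent with this description (the notation below is assumed for illustration and is not taken from the paper) is

\max_{x,\,w}\; \sum_{s}\pi_{s}\sum_{f}\sum_{c} p_{c,s}\,y_{f,c,s}\,x_{f,c}
\quad\text{s.t.}\quad \sum_{c} x_{f,c}\le L_{f},\;\; \sum_{c} a_{c}\,x_{f,c}\le w_{f}\;\;\forall f,\quad \sum_{f} w_{f}\le W,\quad x,w\ge 0

where x_{f,c} is the area of crop c planted on farm f, w_f the farm's water usage limit, \pi_s the scenario probability, p_{c,s} and y_{f,c,s} the scenario-dependent prices and yields, a_c the water requirement per unit area, L_f the land available, and W the regional water budget. A second objective, minimizing \sum_f w_f, is traded off against expected profit to generate the representative Pareto solutions, while the L-shaped method decomposes the scenario-indexed part of the problem.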
Procedia PDF Downloads 74

2490 Optimization of Turbocharged Diesel Engines
Authors: Ebrahim Safarian, Kadir Bilen, Akif Ceviz
Abstract:
The turbocharger and turbocharging have become inherent components of diesel engines, and critical parameters of such engines, such as the BSFC (Brake Specific Fuel Consumption), thermal efficiency, fuel consumption, BMEP (Brake Mean Effective Pressure), power density output and emission level, have thereby been improved extensively. In general, the turbocharger can be considered the most complex component of a diesel engine, because it closely interrelates the turbomachinery concepts of turbines and compressors with the thermodynamic fundamentals of internal combustion engines and the stress analysis of all components. In this paper, a wastegate for a conventional single-stage radial turbine is investigated by considering turbocharger operating constraints and engine operating conditions, without any detailed design of the turbine or the compressor. The amount of wastegate opening, which ranges between a fully open and a fully closed valve, is determined by limiting the compressor boost pressure ratio. The search for an optimum point with regard to the above items is carried out with three linked meanline modelling programs, namely the Turbomatch®, Compal® and Rital® modules of Concepts NREC®.
Keywords: turbocharger, wastegate, diesel engine, Concepts NREC programs
Procedia PDF Downloads 243

2489 Scale Effects on the Wake Airflow of a Heavy Truck
Authors: Aude Pérard Lecomte, Georges Fokoua, Amine Mehel, Anne Tanière
Abstract:
Air quality in urban areas is deteriorating due to pollution, mainly because of the constant increase in traffic of different types of ground vehicles. In particular, particulate matter pollution, which reaches significant concentrations in urban areas, can cause serious health issues. Characterizing and understanding particle dynamics is therefore essential to establish recommendations for improving air quality in urban areas. To analyze the effects of turbulence on the dispersion of particulate pollutants, the first step is to focus on the single-phase flow structure and turbulence characteristics in the wake of a heavy truck model. To achieve this, Computational Fluid Dynamics (CFD) simulations were conducted with the aim of modeling the wake airflow of a full-scale and a reduced-scale heavy truck. The Reynolds-Averaged Navier-Stokes (RANS) approach with the Reynolds Stress Model (RSM) as the turbulence closure was used. The simulations highlight the appearance of a large vortex originating from under the trailer. This vortex belongs to the recirculation region located in the near wake of the heavy truck. These vortical structures are expected to have a strong influence on the dynamics of the particles emitted by the truck.
Keywords: CFD, heavy truck, recirculation region, reduced scale
Procedia PDF Downloads 219

2488 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts
Authors: William Michael Short
Abstract:
'Semantic Web' technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized 'semantic bioinformatics' have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called 'discourse topics'). Second, natural language processing systems tend to operate according to the principle of 'one token, one tag'. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable. But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture 'expert' technical models rather than 'folk' models of knowledge and so may not match users' common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen, the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge bases are designed to help capture variations within technical vocabularies, rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis, they fail to capture this dimension of linguistic usage.
The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients' self-management of complex medical conditions.
Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics
Procedia PDF Downloads 132

2487 Numerical Study on Parallel Rear-Spoiler on Super Cars
Authors: Anshul Ashu
Abstract:
Computers are applied to vehicle aerodynamics in two ways: one is Computational Fluid Dynamics (CFD) and the other is Computer-Aided Flow Visualization (CAFV). Of the two, CFD is chosen here because it presents its results through computer graphics. The simulation of the flow field around a vehicle is one of the important CFD applications; the flow field can be solved numerically using panel methods, the k-ε method, or direct simulation methods. A spoiler is a device used in vehicle aerodynamics to minimize unfavorable aerodynamic effects around the vehicle, and a parallel spoiler is a set of two spoilers designed in such a manner that it effectively reduces drag. In this study, the standard k-ε model is used to simulate the external flow field around simplified versions of the Bugatti Veyron, Audi R8 and Porsche 911. The flow simulation is carried out for variable Reynolds numbers and consists of three different levels: first over the model without a rear spoiler, second over the model with a single rear spoiler, and third over the model with a parallel rear spoiler. The second and third levels involve the following parameters: the shape of the spoiler, the angle of attack and the attachment position. A thorough analysis of the simulation results has been carried out, and a new parallel spoiler has been designed. It shows a small improvement in vehicle aerodynamics, with a decrease in aerodynamic drag and lift, and hence leads to better fuel economy and traction force for the model.
Keywords: drag, lift, flow simulation, spoiler
Procedia PDF Downloads 500

2486 Towards Reliable Mobile Cloud Computing
Authors: Khaled Darwish, Islam El Madahh, Hoda Mohamed, Hadia El Hennawy
Abstract:
Cloud computing has been one of the fastest growing parts of the IT industry, mainly in the context of the future of the web, where computing, communication, and storage services are the main services provided to Internet users. Mobile Cloud Computing (MCC) is gaining momentum; it can be used to extend cloud computing functions, services and results to the world of future mobile applications and enables the delivery of a large variety of cloud applications to billions of smartphones and wearable devices. This paper addresses reliability for MCC, i.e., the ability of a system or component to function correctly under stated conditions for a specified period of time, in order to deal with the estimation and management of high levels of lifetime engineering uncertainty and risk of failure. The assessment procedure consists of determining the Mean Time Between Failures (MTBF), the Mean Time To Failure (MTTF), and availability percentages for the main components in both cloud computing and MCC structures, applied to a single-node OpenStack installation to analyze its performance under different settings governing the behavior of participants. Additionally, we present several factors that have a significant impact on the overall cloud system reliability and that should be taken into account in order to deliver highly available cloud computing services to mobile consumers.
Keywords: cloud computing, mobile cloud computing, reliability, availability, OpenStack
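For reference, the standard relations underlying such an assessment (general definitions, not values measured in this paper) are

\text{MTBF} = \frac{\text{total operating time}}{\text{number of failures}}, \qquad A = \frac{\text{MTBF}}{\text{MTBF} + \text{MTTR}} = \frac{\text{uptime}}{\text{uptime} + \text{downtime}}

so that the availability percentage of each OpenStack component follows directly from its measured failure and repair times.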
Procedia PDF Downloads 397