Search results for: spatial application.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3842


302 An Application of Self-Health Risk Assessment among Populations Living in the Vicinity of a Fiber-Cement Roofing Factory

Authors: Phayong Thepaksorn

Abstract:

The objective of this study was to assess whether living in proximity to a roofing fiber cement factory in southern Thailand was associated with the physical, mental, social, and spiritual health domains measured in a self-reported health risk assessment (HRA) questionnaire. A cross-sectional study was conducted among community members divided into two groups: a near population (living within 0-2 km of the factory) and a far population (living 2-5 km from the factory) (N=198). A greater proportion of those living far from the factory (65.34%) reported physical health problems than the near group (51.04%) (p = 0.032). The near population group had a higher proportion of participants with positive ratings on the mental assessment (30.34%) and social health impacts (28.42%) than the far population group (10.59% and 16.67%, respectively) (p < 0.001). The near population group (29.79%) had a similar proportion of participants with positive ratings on spiritual health impacts compared with the far population group (27.08%). Among females, but not males, a higher proportion of the near population had a positive summative score for the self-HRA, which included all four health domains, compared to the far population (p < 0.001 for females; p = 0.154 for males). In conclusion, this self-HRA of physical, mental, social, and spiritual health domains reflected the risk perceptions of populations living in the vicinity of the roofing fiber cement factory. This type of tool can bring attention to population concerns and complaints in the factory’s surrounding community. Our findings may contribute to the future development of self-HRA tools for the health impact assessment (HIA) procedure in Thailand.

Keywords: Cement dust, health impact assessment, risk assessment, walk-through survey.

301 Two-Dimensional Observation of Oil Displacement by Water in a Petroleum Reservoir through Numerical Simulation and Application to a Petroleum Reservoir

Authors: Ahmad Fahim Nasiry, Shigeo Honma

Abstract:

We examine two-dimensional oil displacement by water in a petroleum reservoir. The pore fluids are immiscible, and the porous medium is homogeneous and isotropic in the horizontal direction. Buckley-Leverett theory, combined with the Laplace equation and Darcy’s law, is used to study the fluid flow through the porous medium, and the Laplacian term that defines the dispersion and diffusion of fluid in the sand is discussed for the heavy-oil case. The reservoir is homogeneous in the horizontal direction, as expressed by the governing partial differential equation. The two main quantities observed are the water saturation and the pressure distribution in the reservoir, and they are evaluated for predicting oil recovery in two dimensions by a physical and mathematical simulation model. We review the numerical simulation that solves the difficult partial differential reservoir equations. In the numerical simulations, the saturation and pressure equations are solved by the iterative alternating direction implicit (IADI) method and the iterative alternating direction explicit (IADE) method, respectively, under a finite difference discretization. To better understand the displacement of oil by water and the extent of water dispersion in the reservoir, an interpolated contour map of the water distribution for the five-spot pattern, which provides an approximate solution that agrees well with the experimental results, is also presented. Finally, a computer program is developed to solve the pressure and water saturation equations and to draw the pressure and water distribution contour lines for the reservoir.
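For illustration, the sketch below shows a generic alternating-direction-implicit (ADI) update for a two-dimensional diffusion-type equation with a five-spot-like source/sink pair. It is a toy stand-in for the IADI/IADE scheme described above, not the authors' code; the grid size, time step and well strengths are arbitrary assumptions.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b, super-diagonal c, rhs d."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(p, q, dx, dy, dt):
    """One ADI iteration (x-implicit then y-implicit) for dp/dt = p_xx + p_yy + q."""
    ny, nx = p.shape
    half = dt / 2.0
    p1 = p.copy()
    for j in range(1, ny - 1):                      # sweep 1: implicit in x, explicit in y
        a = np.full(nx - 2, -half / dx**2)
        b = np.full(nx - 2, 1 + 2 * half / dx**2)
        c = np.full(nx - 2, -half / dx**2)
        d = (p[j, 1:-1]
             + half * (p[j + 1, 1:-1] - 2 * p[j, 1:-1] + p[j - 1, 1:-1]) / dy**2
             + half * q[j, 1:-1])
        d[0] += half / dx**2 * p[j, 0]              # known boundary values to the RHS
        d[-1] += half / dx**2 * p[j, -1]
        p1[j, 1:-1] = thomas(a, b, c, d)
    p2 = p1.copy()
    for i in range(1, nx - 1):                      # sweep 2: implicit in y, explicit in x
        a = np.full(ny - 2, -half / dy**2)
        b = np.full(ny - 2, 1 + 2 * half / dy**2)
        c = np.full(ny - 2, -half / dy**2)
        d = (p1[1:-1, i]
             + half * (p1[1:-1, i + 1] - 2 * p1[1:-1, i] + p1[1:-1, i - 1]) / dx**2
             + half * q[1:-1, i])
        d[0] += half / dy**2 * p1[0, i]
        d[-1] += half / dy**2 * p1[-1, i]
        p2[1:-1, i] = thomas(a, b, c, d)
    return p2

# Five-spot-like toy setup: injector (source) near one corner, producer (sink) near the other.
n = 21
p = np.zeros((n, n))
q = np.zeros((n, n))
q[1, 1], q[-2, -2] = 1.0, -1.0
for _ in range(200):
    p = adi_step(p, q, dx=1.0, dy=1.0, dt=0.5)
print("pressure-like field at injector/producer cells:", p[1, 1], p[-2, -2])
```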

Keywords: Numerical simulation, immiscible, finite difference, IADI, IADE, waterflooding.

300 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values

Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi

Abstract:

A major challenge in medical studies, especially longitudinal ones, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on the delineation of Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective and early treatment methods. To address these challenges, this paper explores the potential of the eXtreme Gradient Boosting (XGBoost) algorithm for handling missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study and the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI, and LMCI. Using a 10-fold cross-validation technique, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62% and a recall of 80.51%, supporting the more natural and promising multiclass classification.
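As a rough sketch of this classification setup (not the authors' pipeline), the snippet below relies on XGBoost's native handling of NaN entries together with a 10-fold cross-validation loop; the feature matrix, class labels and hyperparameters are synthetic placeholders rather than ADNI data.

```python
# XGBoost learns a default split direction for NaN entries, so missing values
# need no explicit imputation before multiclass training.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
X = rng.normal(size=(1631, 20))                  # hypothetical multimodal features
X[rng.random(X.shape) < 0.28] = np.nan           # ~28% missing values, as in the study
y = rng.integers(0, 4, size=1631)                # 4 classes: CN, EMCI, LMCI, AD

clf = XGBClassifier(
    objective="multi:softprob",
    n_estimators=300,
    max_depth=4,
    learning_rate=0.1,
    eval_metric="mlogloss",
)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print("10-fold CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```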

Keywords: eXtreme Gradient Boosting, missing data, Alzheimer disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest.

299 On the Accuracy of Basic Modal Displacement Method Considering Various Earthquakes

Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar

Abstract:

Time history seismic analysis is considered the most accurate method to predict the seismic demand of structures. On the other hand, its main deficiency is the computational time required to obtain results. When applied within an optimization process, in which the structure must be analyzed thousands of times, reducing the computational time of seismic analysis makes the optimization algorithms much more practical. Approximate methods inevitably produce some error in comparison with exact time history analysis, but methods such as the Complete Quadratic Combination (CQC) and the Square Root of the Sum of Squares (SRSS) drastically reduce the computational time by combining the peak responses of each mode. In the present research, the Basic Modal Displacement (BMD) method is introduced and applied to estimate the seismic demand of the main structure. The seismic demand of the sampled structure is estimated from the modal displacements calculated for a basic structure. Sampled steel shear structures are selected as case studies. The error of the introduced method is calculated by comparing the estimated seismic demands with those from exact time history dynamic analysis. The efficiency of the proposed method is demonstrated by applying three types of earthquakes (classified by the time of peak ground acceleration).
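For context, the snippet below sketches the SRSS and CQC combinations of peak modal responses mentioned above; the modal peaks, frequencies and damping ratio are illustrative assumptions, and the BMD method itself (detailed in the paper) is not reproduced here.

```python
import numpy as np

def srss(peaks):
    """Square Root of the Sum of Squares combination of peak modal responses."""
    peaks = np.asarray(peaks, dtype=float)
    return np.sqrt(np.sum(peaks**2))

def cqc(peaks, omegas, zeta=0.05):
    """Complete Quadratic Combination with the standard correlation coefficient."""
    peaks = np.asarray(peaks, dtype=float)
    omegas = np.asarray(omegas, dtype=float)
    r = omegas[None, :] / omegas[:, None]        # frequency ratios between mode pairs
    rho = (8 * zeta**2 * (1 + r) * r**1.5) / ((1 - r**2) ** 2 + 4 * zeta**2 * r * (1 + r) ** 2)
    return np.sqrt(peaks @ rho @ peaks)

peak_disp = [0.12, 0.05, 0.02]       # peak modal displacements (m), hypothetical
freqs = [6.3, 18.8, 31.4]            # natural circular frequencies (rad/s), hypothetical
print("SRSS estimate:", srss(peak_disp))
print("CQC  estimate:", cqc(peak_disp, freqs))
```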

Keywords: Time history dynamic analysis, basic modal displacement, earthquake induced demands, shear steel structures.

298 Toward Understanding and Testing Deep Learning Information Flow in Deep Learning-Based Android Apps

Authors: Jie Zhang, Qianyu Guo, Tieyi Zhang, Zhiyong Feng, Xiaohong Li

Abstract:

The widespread popularity of mobile devices and the development of artificial intelligence (AI) have led to the widespread adoption of deep learning (DL) in Android apps. Compared with traditional Android apps (traditional apps), deep learning based Android apps (DL-based apps) need to use more third-party application programming interfaces (APIs) to complete complex DL inference tasks. However, existing methods (e.g., FlowDroid) for detecting sensitive information leakage in Android apps cannot be directly used on DL-based apps because they have difficulty detecting third-party APIs. To solve this problem, we design DLtrace, a new static information flow analysis tool that can effectively recognize third-party APIs. With our proposed trace and detection algorithms, DLtrace can also efficiently detect privacy leaks caused by sensitive APIs in DL-based apps. Additionally, we propose two formal definitions to deal with the common polymorphism and anonymous inner-class problems in Android static analyzers. Using DLtrace, we summarize the non-sequential characteristics of DL inference tasks in DL-based apps and the specific functionalities provided by DL models for such apps. We conducted an empirical assessment with DLtrace on 208 popular DL-based apps in the wild and found that 26.0% of the apps suffered from sensitive information leakage. Furthermore, DLtrace outperformed FlowDroid in detecting and identifying third-party APIs. The experimental results demonstrate that DLtrace extends FlowDroid in understanding DL-based apps and detecting security issues therein.

Keywords: Mobile computing, deep learning apps, sensitive information, static analysis.

297 Personalized Applications for Advanced Healthcare through AI-ML and Blockchain

Authors: Anuja Vyas, Aikel Indurkhya, Hari Krishna Garg

Abstract:

Nearly 25 years have passed since the landmark publication of the Human Genome Project, yet scientists have only begun to scratch the surface of its potential benefits. To bridge this gap, a personalized genomic application has been envisioned as a transformative tool accessible to people worldwide. This innovative solution proposes an integrated framework combining blockchain technology, genome-specific applications, and data compression techniques, ensuring that operations are swift, secure, transparent, and space-efficient. The software harnesses advanced artificial intelligence and machine learning methodologies, such as neural networks, evaluation matrices, fuzzy logic, and expert systems, to analyze individual genomic data. It generates personalized reports by comparing a user's genome with a reference genome, highlighting significant differences. Blockchain technology, with its inherent security, encryption, and immutability features, is leveraged for robust data transport and storage. In addition, a 'Data Abbreviation' technique ensures that genetic data and reports occupy minimal space. This integrated approach promises to be a significant leap forward, potentially transforming human health and well-being on a global scale.

Keywords: Artificial intelligence in genomics, blockchain technology, data abbreviation, data compression, data security in genomics, data storage, expert systems, fuzzy logic, genome applications, genomic data analysis, human genome project, neural networks, personalized genomics.

296 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis

Authors: Akinola Ikudayisi, Josiah Adeyemo

Abstract:

The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered when estimating and modeling ETₒ. This study therefore presents a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa using the Principal Component Analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from both the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ was the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performance, two principal components explaining 82.7% of the variance were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables such as rainfall and relative humidity are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modeling at VIS, South Africa.
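A minimal sketch of this PCA workflow, assuming synthetic monthly averages in place of the SAWS/ARC records, is given below; the component retention threshold and value ranges are placeholders.

```python
# Standardize the five meteorological inputs, run PCA, and inspect explained
# variance and loadings to judge which variables dominate ETo modeling.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_months = 240                                   # ~20 years of monthly averages
df = pd.DataFrame({
    "t_min": rng.normal(12, 4, n_months),
    "t_max": rng.normal(27, 5, n_months),
    "rain": rng.gamma(2.0, 20.0, n_months),
    "rh": rng.normal(55, 10, n_months),
    "wind": rng.normal(2.5, 0.8, n_months),
})

X = StandardScaler().fit_transform(df)
pca = PCA(n_components=0.80)                     # keep components explaining >= 80% variance
scores = pca.fit_transform(X)
print("components retained:", pca.n_components_)
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
loadings = pd.DataFrame(pca.components_.T, index=df.columns)
print(loadings.round(2))                         # large |loading| -> influential variable
```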

Keywords: Irrigation, principal component analysis, reference evapotranspiration, Vaalharts.

295 Application of AIMSUN Microscopic Simulation Model in Evaluating Side Friction Impacts on Traffic Stream Performance

Authors: H. Naghawi, M. Abu Shattal, W. Idewu

Abstract:

Side friction factors can be defined as all activities taking place at the side of the road and within the traffic stream which negatively affect traffic stream performance. If the effect of these factors is adequately addressed and managed, traffic stream performance and capacity could be improved. The main objective of this paper is to identify and assess the impact of different side friction factors on the traffic stream performance of a hypothesized urban arterial road. Hypothetical data were assumed mainly because no road in developing countries operates under ideal conditions with zero side friction. This is necessary for the creation of the base model, which serves as the reference for comparison. For this purpose, three essential steps were employed. In step one, a hypothetical base model was developed under ideal traffic and geometric conditions. In step two, 18 hypothetical alternative scenarios were developed, including side friction factors such as on-road parking, pedestrian movement, and the presence of trucks in the traffic stream. These scenarios were evaluated for one-, two-, and three-lane configurations and under traffic volumes ranging from low to high. In step three, the impact of side friction in each scenario on the speed-flow curves was evaluated using the AIMSUN microscopic traffic simulation software. Generally, a noticeable negative shift in the speed-flow curves from the base conditions was observed for all scenarios. This indicates a negative impact of the side friction factors on free flow speed and traffic stream average speed, as well as on capacity.

Keywords: AIMSUN, parked vehicles, pedestrians, side friction, traffic performance, trucks.

294 Applying Kinect on the Development of a Customized 3D Mannequin

Authors: Shih-Wen Hsiao, Rong-Qi Chen

Abstract:

In the field of fashion design, the 3D mannequin is an assisting tool that allows design concepts to be realized rapidly. When the concept of the 3D mannequin is applied to computer-aided fashion design, it connects with the development and application of design platforms and systems. It is therefore critical to develop a 3D mannequin module that corresponds to the needs of fashion design. This research proposes a concrete plan for developing and constructing a 3D mannequin system with Kinect. In this system, ergonomic measurements of a subject's body features can be attained in real time through the Kinect depth camera, and mesh morphing can then be implemented by transforming the locations of the control points on the model according to these ergonomic data, yielding a customized 3D mannequin model. In the proposed methodology, after the scanned points from the Kinect are corrected for accuracy and smoothed, a complete body shape is reconstructed by the iterative closest point (ICP) algorithm together with image processing methods. The reconstructed body shape can then be analyzed to obtain real measurements. Furthermore, the ergonomic measurements can be applied to shape morphing of the 3D mannequin subdivided by feature curves. Because a standardized and customer-oriented 3D mannequin can be generated through subdivision, the research can be applied to fashion design or to the presentation and display of 3D virtual clothes. In order to examine the practicality of the proposed structure, a 3D mannequin system was constructed with a Java program in this study. Experiments confirmed the practicability of the proposed approach.
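The reconstruction step relies on ICP alignment; the sketch below is a basic point-to-point ICP with an SVD-based rigid transform, shown on random synthetic points rather than Kinect scans, and it is not the authors' Java implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, n_iter=30):
    """Iteratively match nearest neighbours and re-solve the rigid transform."""
    tree = cKDTree(dst)
    cur = src.copy()
    for _ in range(n_iter):
        _, idx = tree.query(cur)             # nearest-neighbour correspondences
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
    return cur

# Toy test: recover a known rotation + translation of a random "scan".
rng = np.random.default_rng(2)
dst = rng.normal(size=(500, 3))
angle = np.deg2rad(10)
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
src = (dst - [0.1, 0.2, 0.0]) @ Rz.T         # misaligned copy of the scan
aligned = icp(src, dst)
print("mean alignment error:", np.linalg.norm(aligned - dst, axis=1).mean())
```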

Keywords: 3D mannequin, Kinect scanner, iterative closest point, shape morphing, subdivision.

293 In Vivo Evaluation of Stable Cream Containing Flavonoids on Hydration and TEWL of Human Skin

Authors: Haji M Shoaib Khan, Naveed Akhtar, Fatima Rasool, Barkat Ali Khan, Tariq Mahmood, Muhammad Shuaib Khan

Abstract:

Antioxidants contribute to endogenous photoprotection and are important for the maintenance of skin health. The study was carried out to compare the skin hydration and transepidermal water loss (TEWL) effects of a stable cosmetic preparation containing flavonoids, following two applications a day over a period of ten weeks. Skin TEWL and skin hydration were measured at the beginning of and throughout the ten-week study period. Any effect produced was measured with a Corneometer and a Tewameter (non-invasive probes). Two formulations were developed for this study design: the first was the control formulation, in which no apple juice extract (flavonoids) was incorporated, while the second was the active formulation, in which apple juice extract (3%) containing flavonoids was incorporated into a water-in-oil emulsion using Abil EM 90 as the emulsifier. The stable formulations (control and active) were applied to human cheeks (n = 12) for a study period of 10 weeks, and skin hydration and TEWL were recorded for each volunteer. ANOVA and the paired-sample t-test were used to compare the results of the control and active formulations. Statistically significant differences (p ≤ 0.05) in skin hydration and TEWL were observed when the two creams, control and active, were compared. This indicates that the active formulation may have an interesting application as a moisturizing cream for healthy skin.
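The statistical comparison can be sketched as follows, using synthetic corneometer readings for the 12 volunteers (the actual measurements are in the paper); the effect sizes assumed here are arbitrary.

```python
# Paired-sample t-test of hydration values, control cream vs. active cream.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
baseline = rng.normal(40, 5, 12)                  # arbitrary corneometer units, n = 12
control = baseline + rng.normal(1.0, 2.0, 12)     # small drift with the base cream
active = baseline + rng.normal(6.0, 2.0, 12)      # larger gain with the flavonoid cream

t_stat, p_value = stats.ttest_rel(active, control)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value <= 0.05:
    print("difference between creams is statistically significant at the 5% level")
```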

Keywords: Apple juice extract, TEWL, Corneometer, flavonoids.

292 Isolation and Screening of Laccase Producing Basidiomycetes via Submerged Fermentations

Authors: Mun Yee Chan, Sin Ming Goh, Lisa Gaik Ai Ong

Abstract:

Approximately 10,000 different types of dyes and pigments are used in various industrial applications yearly, including the textile and printing industries. However, these dyes are difficult to degrade naturally once they enter the aquatic system. Their high persistence in the natural environment poses a potential health hazard to all forms of life. Hence, there is a need for an alternative dye removal strategy, namely bioremediation. In this study, fungal laccase is investigated via commercial dye agar plates and submerged fermentation to explore the application of fungal laccase in textile dye wastewater treatment. Two locally isolated basidiomycetes were screened for laccase activity using media supplemented with 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS), guaiacol and Remazol Brilliant Blue R (RBBR). Isolates TBB3 (1.70±0.06) and EL2 (1.78±0.08) gave the highest results on ABTS plates, with a greenish halo appearing around the isolates. Submerged fermentation of isolate TBB3 gave a laccase productivity of 3.9067 U/ml/day, whereas that of isolate EL2 was much lower (0.2097 U/ml/day). As isolate TBB3 showed higher laccase production, it was subjected to molecular characterization by DNA isolation, PCR amplification and sequencing of the ITS region of nuclear ribosomal DNA. After comparison with other sequences in the National Center for Biotechnology Information (NCBI) database, isolate TBB3 was identified as probably belonging to the species Trametes hirsuta. Further work can be performed on this isolate by scaling up laccase production in order to meet the requirement for a higher enzyme titer for the bioremediation of textile dyes.

Keywords: Bioremediation, dyes, fermentation, laccase.

291 Possible Number of Dwelling Units Using Waste Plastic Bottle for Construction

Authors: Dibya Jivan Pati, Kazuhisa Iki, Riken Homma

Abstract:

Unlike other metro cities of India, Bhubaneswar, the capital city of Odisha, is expected to have reached the 1-million population mark only now. The demand for dwelling units, mostly among the urban poor belonging to the Economically Weaker Section (EWS) and Low Income Group (LIG), is becoming a challenge due to high housing costs and rents. It is also noted that, as the population increases, solid waste generation increases as well, affecting the environment because of inefficient waste collection by local government bodies. Methods of utilizing solid waste, especially in the form of plastic bottles, glass bottles and metal cans (PGM), are now widely used as alternative materials for the construction of low-cost buildings by non-governmental organizations (NGOs) in developing countries like India to help the urban poor afford a shelter. The application of disposed plastic bottles in the construction of a single dwelling significantly reduces the overall cost of construction, to as much as 14% compared to traditional construction materials. Therefore, considering its cost-benefit result, it is possible to provide housing to the EWS and LIG at an affordable price. In this paper, we estimated the quantity of plastic bottles generated in Bhubaneswar, which further helped to estimate the possible number of single dwelling units that can be constructed on a yearly basis so as to relieve the housing shortage. The estimation results can be practically used for planning and managing low-cost housing by the local government and NGOs.

Keywords: Construction, dwelling unit, plastic bottle, solid waste generation, groups.

290 Machine Learning Facing Behavioral Noise Problem in an Imbalanced Data Using One Side Behavioral Noise Reduction: Application to a Fraud Detection

Authors: Salma El Hajjami, Jamal Malki, Alain Bouju, Mohammed Berrada

Abstract:

With the expansion of machine learning and data mining in the context of Big Data analytics, a common problem that affects data is class imbalance. It refers to an imbalanced distribution of instances belonging to each class. This problem is present in many real-world applications such as fraud detection, network intrusion detection, medical diagnostics, etc. In these cases, data instances labeled negatively are significantly more numerous than the instances labeled positively. When this difference is too large, the learning system may face difficulty when tackling this problem, since it is initially designed to work in relatively balanced class distribution scenarios. Another important problem, which usually accompanies these imbalanced data, is the overlapping of instances between the two classes. It is commonly referred to as noise or overlapping data. In this article, we propose an approach called One Side Behavioral Noise Reduction (OSBNR). This approach presents a way to deal with the problem of class imbalance in the presence of a high noise level. OSBNR is based on two steps. Firstly, a cluster analysis is applied to group similar instances from the minority class into several behavior clusters. Secondly, we select and eliminate the instances of the majority class, considered as behavioral noise, which overlap with the behavior clusters of the minority class. The results of experiments carried out on a representative public dataset confirm that the proposed approach is efficient for the treatment of class imbalance in the presence of noise.
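A rough sketch of the two OSBNR steps, under simplifying assumptions (the paper's exact overlap criterion is not stated in the abstract), is shown below: the minority class is clustered with k-means, and majority instances falling within a cluster's radius are discarded as behavioral noise.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cluster import KMeans

X, y = make_classification(n_samples=5000, weights=[0.95, 0.05],
                           n_features=6, flip_y=0.05, random_state=0)
X_min, X_maj = X[y == 1], X[y == 0]

# Step 1: behavior clusters of the minority class.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X_min)
centers = km.cluster_centers_
radii = np.array([np.linalg.norm(X_min[km.labels_ == k] - centers[k], axis=1).max()
                  for k in range(len(centers))])

# Step 2: remove majority instances that overlap any behavior cluster.
d = np.linalg.norm(X_maj[:, None, :] - centers[None, :, :], axis=2)   # (n_maj, n_clusters)
noisy = (d <= radii).any(axis=1)
X_maj_clean = X_maj[~noisy]
print("majority instances removed as behavioral noise:", int(noisy.sum()))
```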

Keywords: Machine learning, imbalanced data, data mining, big data.

289 Robotics and Embedded Systems Applied to the Buried Pipeline Inspection

Authors: Robson C. Santos, Julio C. P. Ribeiro, Iorran M. de Castro, Luan C. F. Rodrigues, Sandro R. L. Silva, Diego M. Quesada

Abstract:

The work aims to develop a robot in the form of an autonomous vehicle for the detection, inspection and mapping of underground pipelines, based on the ATmega328 Arduino platform. The hardware is programmed in a language very similar to C/C++, which facilitates its use in open-source robotics and resembles the PLCs used in large industrial processes. The robot traverses the surface independently of direct human action in order to automate the process of detecting buried pipes, guided by electromagnetic induction. The induction comes from coils that send a signal to the onboard Arduino microcontroller, which evaluates the differences in signal intensity and processes the information; this in turn determines the actions of electrical components such as relays and motors, allowing the prototype to move over the surface and gather the necessary information. Changes of direction are performed by a stepper motor together with a servo motor. The robot was developed from electrical and electronic assemblies that allowed its application to be tested. The assembly is made up of metal detector coils, circuit boards and a microprocessor; these previously developed, interconnected circuits determine the process control and mechanical actions of the robot (autonomous car) that performs the detection and mapping of buried pipelines. This type of prototype can identify possible landslides and help prevent buried pipelines from suffering external pressure on their walls, with the associated possibility of oil leakage and environmental pollution.

Keywords: Robotics, metal detector, embedded system, pipeline.

288 Enhancing Hand Efficiency of Smart Glass Cleaning Robot through Generative Design Module

Authors: Pankaj Gupta, Amit Kumar Srivastava, Nitesh Pandey

Abstract:

This article explores the domain of generative design in order to enhance the development of robot designs for innovative and efficient maintenance approaches for tall buildings. The study aims to optimize the design of the robotic hand by minimizing mass and volume while ensuring that it can withstand the specified pressure with equal strength. The research procedure is structured and systematic. The purpose of the optimization is to enhance the efficiency of the robot and reduce manufacturing expenses. The project investigates the application of generative design to product optimization. Autodesk Fusion 360 offers the capability to apply the generative design functionality directly to the solid model. The effort involved creating a solid model of the smart glass cleaning robot and optimizing one of its components, the hand, using generative techniques. The article thoroughly examines the designs, outcomes, and procedure. The specified loads serve as a benchmark for creating designs that can endure the necessary level of pressure and preserve their structural integrity. The efficacy of the generative design process is contingent upon the selection of materials, as different materials possess distinct physical attributes. The study utilizes five different materials, namely steel, stainless steel, titanium, aluminum, and CFRP (carbon fiber reinforced polymer), in order to investigate a range of design possibilities.

Keywords: Generative design, mass and volume optimization, material strength analysis, smart glass cleaning robot.

287 Reversible Binary Arithmetic for Integrated Circuit Design

Authors: D. Krishnaveni, M. Geetha Priya

Abstract:

The application of reversible logic in integrated circuits results in improved optimization of power consumption. This technology can be put to use in a variety of low-power applications such as quantum computing, optical computing, nanotechnology, and Complementary Metal Oxide Semiconductor (CMOS) Very Large Scale Integration (VLSI) design. Logic gates are the basic building blocks in the design of any logic network and thus of integrated circuits. In this paper, the reversible Dual Key Gate (DKG) and Dual Key Gate Pair (DKGP) gates, which individually work as a full adder/full subtractor, are used to realize the basic building blocks of logic circuits. A reversible full adder/subtractor and a parallel adder/subtractor are designed using other reversible gates available in the literature and compared with those based on the DKG and DKGP gates. Efficient performance of reversible logic circuits relies on the optimization of the key parameters, viz. the number of constant inputs, the number of garbage outputs and the number of reversible gates. The full adder/subtractor and parallel adder/subtractor designs with the reversible DKGP and DKG gates result in the lowest number of constant inputs, garbage outputs, and reversible gates compared to the other designs. Thus, this paper provides a threshold to build more complex arithmetic systems using these reversible logic gates, leading to enhanced performance of computing systems.
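The DKG/DKGP truth tables are not given in the abstract, so the sketch below uses the well-known two-Peres-gate reversible full adder as a stand-in to illustrate the metrics discussed above: reversibility (bijectivity of the mapping), one constant input, and two garbage outputs.

```python
from itertools import product

def peres(a, b, c):
    """Peres gate: (A, B, C) -> (A, A xor B, (A and B) xor C)."""
    return a, a ^ b, (a & b) ^ c

def full_adder_circuit(a, b, cin, d):
    """Reversible full adder built from two Peres gates; d is the constant (0) line."""
    p, q, r = peres(a, b, d)          # q = a^b, r = (a&b)^d
    q, s, cout = peres(q, cin, r)     # s = a^b^cin, cout = ((a^b)&cin) ^ (a&b) ^ d
    return p, q, s, cout              # outputs: garbage, garbage, Sum, Cout

# Reversibility: the 4-bit mapping must be a bijection.
images = {full_adder_circuit(*bits) for bits in product((0, 1), repeat=4)}
assert len(images) == 16, "not reversible"

# Functional check with the constant input d = 0.
for a, b, cin in product((0, 1), repeat=3):
    _, _, s, cout = full_adder_circuit(a, b, cin, 0)
    assert s == a ^ b ^ cin and cout == (a & b) | (b & cin) | (a & cin)
print("reversible full adder verified: 1 constant input, 2 garbage outputs")
```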

Keywords: Low power CMOS, quantum computing, reversible logic gates, full adder, full subtractor, parallel adder/subtractor, basic gates, universal gates.

286 The Discovery and Application of Perspective Representation in Modern Italy

Authors: Matthias Stange

Abstract:

In the early modern period, a different image of man began to prevail in Europe. The focus was on the self-determined human being and his abilities. At first, these developments could be seen in Italian painting and architecture, which again oriented themselves to the concepts and forms of antiquity, for example through the discovery of perspective representation by Brunelleschi or, later, of orthogonal projection by Alberti, after the ancient knowledge of optics had been forgotten in the Middle Ages. The understanding of reality in the Middle Ages was not focused on the sensually perceptible world but was determined by ecclesiastical dogmas. The empirical part of this study examines the rediscovery and development of perspective. With the paradigm of antiquity, the figure of the architect was also recognised again: the cultured man trained theoretically and practically in numerous subjects, as Vitruvius describes him. In this context, the role of the architect, the influence on the painting of the Quattrocento, and the influence on architectural representation in the Baroque period are examined. The Baroque is commonly associated with the idea of illusionistic appearance as opposed to the tangible reality presented in the Renaissance. The study has shown that the central perspective projection developed by Filippo Brunelleschi enabled a new understanding of seeing and of the dissemination of painted images. Brunelleschi's development made it possible to understand the sight of nature as a reflection of what is presented to the viewer's eye. Alberti later shortened Brunelleschi's central perspective construction for practical use in painting. In early modern Italian architecture and painting, these developments apparently supported each other. The pictorial representation of architecture initially served the development of an art form before it became established in building practice itself.

Keywords: Alberti, Brunelleschi, Central perspective projection, Orthogonal projection, Quattrocento, Baroque.

285 Numerical Simulation of Free Surface Water Wave for the Flow around NACA 0012 Hydrofoil and Wigley Hull Using VOF Method

Authors: Saadia Adjali, Omar Imine, Mohammed Aounallah, Mustapha Belkadi

Abstract:

Steady three-dimensional and two-dimensional free surface waves generated by moving bodies are presented. The flow problem to be simulated is rich in complexity and poses many modeling challenges because of the existence of breaking waves around the ship hull and because of the interaction of the two-phase flow with the turbulent boundary layer. The results of several simulations are reported. The first study was performed for the NACA 0012 hydrofoil with different meshes; this section is analyzed at h/c = 1.0345 in 2D. In the second simulation, a mathematically defined Wigley hull form is used to investigate the application of a commercial CFD code to the prediction of the total resistance and its components from the tangential and normal forces on the hull wetted surface. The computed resistance and wave profiles are used to estimate the total resistance coefficient for the Wigley hull advancing in calm water under steady conditions. The commercial CFD software FLUENT version 12 is used for the computations in the present study. The computational grid is generated using GAMBIT 2.3.26. The k-ω SST shear stress transport model is used for turbulence modeling, and the volume of fluid technique is employed to simulate the free-surface motion. The second-order upwind scheme is used for discretizing the convection terms in the momentum transport equations, and the modified HRIC scheme is used for the VOF discretization. The results obtained compare well with the experimental data.

Keywords: Free surface flows, breaking waves, boundary layer, Wigley hull, volume of fluid.

283 Software Product Quality Evaluation Model with Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

This paper presents a software product quality evaluation model based on the ISO/IEC 25010 quality model. The evaluation characteristics and subcharacteristics were identified from the ISO/IEC 25010 quality model. The multidimensional structure of the quality model is based on characteristics such as functional suitability, performance efficiency, compatibility, usability, reliability, security, maintainability, and portability, and their associated subcharacteristics. Random numbers are generated to establish the decision maker's importance weights for each subcharacteristic. Random numbers are also generated to establish the decision matrix of the decision maker's final scores for each software product against each subcharacteristic. Thus, objective criteria importance weights and index scores for the datasets were obtained from the random numbers. In the proposed model, five different software product quality evaluation datasets under three different weight vectors were applied to the multiple criteria decision analysis method preference analysis for reference ideal solution (PARIS), for comparison and sensitivity analysis. This study contributes to a better understanding of the application of MCDMA methods and the ISO/IEC 25010 quality model guidelines in the software product quality evaluation process.
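The random-weight and random-decision-matrix setup can be sketched as below; a plain weighted-sum score is used here as a stand-in for the PARIS method, and the product names and score ranges are arbitrary assumptions while the characteristic names follow ISO/IEC 25010.

```python
import numpy as np

rng = np.random.default_rng(25010)
characteristics = ["functional suitability", "performance efficiency", "compatibility",
                   "usability", "reliability", "security", "maintainability", "portability"]
products = ["SW-A", "SW-B", "SW-C", "SW-D", "SW-E"]

weights = rng.random(len(characteristics))
weights /= weights.sum()                              # normalized importance weights
scores = rng.uniform(1, 9, size=(len(products), len(characteristics)))  # decision matrix

norm = scores / scores.max(axis=0)                    # benefit-type linear normalization
overall = norm @ weights                              # weighted-sum index per product
for rank, i in enumerate(np.argsort(-overall), start=1):
    print(rank, products[i], round(float(overall[i]), 3))
```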

Keywords: ISO/IEC 25010 quality model, multiple criteria decisions making, multiple criteria decision making analysis, MCDMA, PARIS, Software Product Quality Evaluation Model, Software Product Quality Evaluation, Software Evaluation, Software Selection, Software

282 Environmental Decision Making Model for Assessing On-Site Performances of Building Subcontractors

Authors: Buket Metin

Abstract:

Buildings cause a variety of loads on the environment due to activities performed at each stage of the building life cycle. Construction is the first stage that affects both the natural and built environments at different steps of the process, which can be defined as transportation of materials within the construction site, formation and preparation of materials on-site and the application of materials to realize the building subsystems. All of these steps require the use of technology, which varies based on the facilities that contractors and subcontractors have. Hence, environmental consequences of the construction process should be tackled by focusing on construction technology options used in every step of the process. This paper presents an environmental decision-making model for assessing on-site performances of subcontractors based on the construction technology options which they can supply. First, construction technologies, which constitute information, tools and methods, are classified. Then, environmental performance criteria are set forth related to resource consumption, ecosystem quality, and human health issues. Finally, the model is developed based on the relationships between the construction technology components and the environmental performance criteria. The Fuzzy Analytical Hierarchy Process (FAHP) method is used for weighting the environmental performance criteria according to environmental priorities of decision-maker(s), while the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is used for ranking on-site environmental performances of subcontractors using quantitative data related to the construction technology components. Thus, the model aims to provide an insight to decision-maker(s) about the environmental consequences of the construction process and to provide an opportunity to improve the overall environmental performance of construction sites.
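The TOPSIS ranking step can be sketched as follows; the subcontractor scores and the FAHP-derived weights shown are hypothetical placeholders, not values from the study.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives: rows = alternatives, cols = criteria.
    benefit[j] is True if larger values of criterion j are better."""
    m = np.asarray(matrix, dtype=float)
    r = m / np.sqrt((m**2).sum(axis=0))          # vector normalization
    v = r * weights                              # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)               # closeness coefficient, higher is better

# Criteria: resource consumption, ecosystem quality impact, human health impact (all cost-type).
matrix = [[320, 0.42, 0.15],    # subcontractor A
          [280, 0.55, 0.10],    # subcontractor B
          [350, 0.30, 0.20]]    # subcontractor C
weights = np.array([0.5, 0.3, 0.2])              # hypothetical FAHP priorities
closeness = topsis(matrix, weights, benefit=np.array([False, False, False]))
print("closeness coefficients:", closeness.round(3))
```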

Keywords: Construction process, construction technology, decision making, environmental performance, subcontractors.

281 An Improved Total Variation Regularization Method for Denoising Magnetocardiography

Authors: Yanping Liao, Congcong He, Ruigang Zhao

Abstract:

The application of magnetocardiography signals to detect cardiac electrical function is a new technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUIDs) and has considerable advantages over electrocardiography (ECG). It is difficult to extract the magnetocardiography (MCG) signal, which is buried in noise, and this is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization optimization problem, and the majorization-minimization algorithm is applied to iteratively solve it. However, the traditional TV regularization method tends to cause a staircase effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising the MCG signal is proposed to improve the denoising precision. The improvement consists of three main parts. First, high-order TV is applied to reduce the staircase effect, and the corresponding second-derivative matrix is used in place of the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined based on the peak positions detected by a detection window. Finally, adaptive constraint parameters are defined to eliminate noise while preserving the signal peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.
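For reference, the sketch below solves standard first-order 1D TV denoising with a majorization-minimization iteration on a toy MCG-like signal; it does not include the paper's high-order and adaptive-constraint improvements, and the signal and regularization weight are arbitrary assumptions.

```python
import numpy as np

def tv_denoise_mm(y, lam=0.5, n_iter=50):
    """Minimize 0.5*||y - x||^2 + lam*||D x||_1 by majorization-minimization."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)               # first-order difference matrix (n-1, n)
    DDt = D @ D.T
    x = y.copy()
    for _ in range(n_iter):
        w = np.abs(D @ x) + 1e-10                # majorizer weights |Dx_k|
        z = np.linalg.solve(np.diag(w) / lam + DDt, D @ y)
        x = y - D.T @ z                          # closed-form minimizer of the majorizer
    return x

# Toy MCG-like signal: smooth baseline plus a sharp peak, buried in noise.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 400)
clean = 0.3 * np.sin(2 * np.pi * t) + np.exp(-((t - 0.5) / 0.01) ** 2)
noisy = clean + 0.15 * rng.normal(size=t.size)
denoised = tv_denoise_mm(noisy, lam=0.5)
print("RMSE noisy   :", np.sqrt(np.mean((noisy - clean) ** 2)).round(4))
print("RMSE denoised:", np.sqrt(np.mean((denoised - clean) ** 2)).round(4))
```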

Keywords: Constraint parameters, derivative matrix, magnetocardiography, regular term, total variation.

280 In vitro Studies of Mucoadhesiveness and Release of Nicotinamide Oral Gels Prepared from Bioadhesive Polymers

Authors: Sarunyoo Songkro, Naranut Rajatasereekul, Nipapat Cheewasrirungrueng

Abstract:

The aim of the present study was to evaluate the mucoadhesion and release of nicotinamide gel formulations using in vitro methods. An agar plate technique was used to investigate the adhesiveness of the gels, whereas a diffusion apparatus was employed to determine the release of nicotinamide from the gels. In this respect, 10% w/w nicotinamide gels containing the bioadhesive polymers Carbopol 934P (0.5-2% w/w), hydroxypropylmethyl cellulose (HPMC) (4-10% w/w), sodium carboxymethyl cellulose (SCMC) (4-6% w/w) and methylcellulose 4000 (MC) (3-5% w/w) were prepared. The gel formulations had pH values in the range of 7.14-8.17, which were considered appropriate for oral mucosal application. In general, the rank order of pH values appeared to be SCMC > MC 4000 > HPMC > Carbopol 934P. The types and concentrations of the polymers used somewhat affected the adhesiveness. It was found that the anionic polymers (Carbopol 934P and SCMC) adhered more firmly to the agar plate than the neutral polymers (HPMC and MC 4000). The formulation containing 0.5% Carbopol 934P (F1) showed the highest release rate. With the exception of formulation F1, the neutral polymers tended to give higher release rates than the anionic polymers. For oral tissue treatment, a balance has to be struck between the residence time (adhesiveness) of the formulations and the release rate of the drug. The formulations containing the anionic polymers Carbopol 934P or SCMC possessed suitable physical properties (appearance, pH and viscosity). In addition, for the anionic polymer formulations, justifiable mucoadhesive properties and reasonable release rates of nicotinamide were achieved. Accordingly, these gel formulations may be applied for the treatment of oral mucosal lesions.

Keywords: Nicotinamide, bioadhesive polymer, mucoadhesiveness, release rate, gel.

279 Surfactant Stabilized Nanoemulsion: Characterization and Application in Enhanced Oil Recovery

Authors: Ajay Mandal, Achinta Bera

Abstract:

Nanoemulsions are a class of emulsions with a droplet size in the range of 50-500 nm and have attracted a great deal of attention in recent years because of their unique characteristics. The physicochemical properties of nanoemulsions suggest that they can be successfully used to recover the residual oil that is trapped in the fine pores of reservoir rock by capillary forces after primary and secondary recovery. Oil-in-water nanoemulsions, which can be formed by high-energy emulsification techniques using specific surfactants, can reduce the oil-water interfacial tension (IFT) by 3-4 orders of magnitude. The present work is aimed at the characterization of an oil-in-water nanoemulsion in terms of its phase behavior, morphology, and interfacial energy; its ability to reduce the interfacial tension; and an understanding of the mechanisms of mobilization and displacement of entrapped oil blobs by lowering interfacial tension at both the macroscopic and microscopic levels. In order to investigate the efficiency of the oil-in-water nanoemulsion in enhanced oil recovery (EOR), experiments were performed to characterize the emulsion in terms of its physicochemical properties and the size distribution of the dispersed oil droplets in the water phase. A synthetic mineral oil and a series of surfactants were used to prepare the oil-in-water emulsions. Characterization of the emulsion shows that it follows pseudo-plastic behaviour and that the drop size of the dispersed oil phase follows a lognormal distribution. Flooding experiments were also carried out in a sandpack system to evaluate the effectiveness of the nanoemulsion as a displacing fluid for enhanced oil recovery. Substantial additional recoveries (more than 25% of the original oil in place) over conventional water flooding were obtained in the present investigation.

Keywords: Nanoemulsion, characterization, enhanced oil recovery, particle size distribution.

278 Corrosion Mitigation in Gas Facilities Piping through the Use of Fusion Bond Epoxy Coated Pipes and Corrosion Resistant Alloy Girth Welds

Authors: Saad Alkhaldi, Fadi Ghammas, Tariq Alghamdi, Stefano Alexandirs

Abstract:

The operating conditions and corrosive nature of the process fluid in the Haradh and Hawiyah areas are subjecting facility piping to undesirable corrosion phenomena. Therefore, production headers inside remote headers have been internally cladded with high alloy material to mitigate the corrosion damage mechanism. Corrosion mitigation in the jump-over lines, constructed between the existing flowlines and the newly constructed facilities to provide operational flexibility, is proposed. This corrosion mitigation system includes the application of fusion bond epoxy (FBE) coating on the internal surface of the pipe and depositing corrosion-resistant alloy (CRA) weld layers at pipe and fittings ends to protect the carbon steel material. In addition, high alloy CRA weld material is used to deposit the girth weld between the 90-degree elbows and mating internally coated segments. A rigorous testing and qualification protocol was established prior to actual adoption at the Haradh and Hawiyah Field Gas Compression Program, currently being executed by Saudi Aramco. The proposed mitigation system, aimed at applying the cladding at the ends of the internally FBE coated pipes/elbows, will resolve field joint coating challenges, eliminate the use of approximately 1700 breakout flanges, and prevent the potential hydrocarbon leaks.

Keywords: Corrosion, FBE coated sour service, cost savings.

277 Design of a Satellite Solar Panel Deployment Mechanism Using the Brushed DC Motor as Rotational Speed Damper

Authors: Hossein Ramezani Ali-Akbari

Abstract:

This paper presents an innovative method to control the rotational speed of a satellite solar panel during its deployment phase. A brushed DC motor is utilized in the passive, spring-driven deployment mechanism to reduce the deployment speed. In order to use the DC motor as a damper, its terminals are connected to an external resistance in a closed circuit, which means that no external power supply is present in the circuit. The working principle of this method is based on the back electromotive force (back EMF) of the DC motor when an external torque (here the torque produced by the torsional springs) is coupled to the DC motor's shaft. In effect, the DC motor acts as an electric generator: a current flows in the circuit and a back EMF is produced. Based on Lenz's law, the generated current produces a torque which acts opposite to the applied external torque, and as a result the deployment speed of the solar panel decreases. The main advantage of this method is that an intended damping coefficient can be set for the system by changing the external resistance. To produce sufficient current, a gearbox is assembled to the DC motor, which magnifies the number of turns experienced by the motor. The coupled electro-mechanical equations of the system have been derived and solved, and the obtained results are presented. A full-scale prototype of the deployment mechanism has been built and tested. The potential application of brushed DC motors as rotational speed dampers has been successfully demonstrated.
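A minimal sketch of the coupled electro-mechanical behavior, under a quasi-static armature-current assumption and with illustrative (non-flight) parameters, is given below; it reproduces the effective damping coefficient n²·kt·ke/R that changing the external resistance tunes.

```python
import numpy as np
from scipy.integrate import solve_ivp

J = 0.8            # panel inertia about the hinge, kg*m^2 (assumed)
k_s = 0.5          # torsion spring stiffness, N*m/rad (assumed)
theta_f = np.pi/2  # fully deployed hinge angle, rad
kt = ke = 0.03     # motor torque / back-EMF constants, N*m/A and V*s/rad (assumed)
n = 100            # gearbox ratio: motor turns per hinge turn (assumed)
R = 10.0           # motor winding + external resistance, ohm (assumed)

def rhs(t, y):
    theta, omega = y
    spring = k_s * (theta_f - theta)            # driving torque from the torsion spring
    i = n * ke * omega / R                      # quasi-static armature current from back EMF
    damping = n * kt * i                        # opposing torque reflected through the gearbox
    return [omega, (spring - damping) / J]

sol = solve_ivp(rhs, (0, 60), [0.0, 0.0], max_step=0.05)
print("effective damping coefficient n^2*kt*ke/R =", n**2 * kt * ke / R, "N*m*s/rad")
print("hinge angle after 60 s: %.3f rad (target %.3f)" % (sol.y[0, -1], theta_f))
```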

Keywords: Back electromotive force, brushed DC motor, rotational speed damper, satellite solar panel deployment mechanism.

276 Bio-Surfactant Production and Its Application in Microbial EOR

Authors: A. Rajesh Kanna, G. Suresh Kumar, Sathyanaryana N. Gummadi

Abstract:

There are various sources of energy available worldwide, and among them crude oil plays a vital role. Oil recovery is achieved using conventional primary and secondary recovery methods. In order to recover the remaining residual oil, technologies such as Enhanced Oil Recovery (EOR), also known as tertiary recovery, are utilized. Among EOR methods, microbial enhanced oil recovery (MEOR) is a technique that improves oil recovery by injection of a bio-surfactant produced by microorganisms. The bio-surfactant can retrieve unrecoverable oil from the cap rock that is held by high capillary forces. A bio-surfactant is a surface-active agent that can reduce the interfacial tension and the viscosity of the oil, so that the oil can be recovered to the surface as its mobility is increased. Research in this area has shown promising results; besides, the method is eco-friendly and cost-effective compared with other EOR techniques. In our research, bio-surfactant was produced on a laboratory scale using the strain Pseudomonas putida (MTCC 2467) and injected into a simple designed sand-packed column that resembles an actual petroleum reservoir. The experiment was conducted in order to determine the efficiency of the produced bio-surfactant in oil recovery. The column was made of plastic, 10 cm in length and 2.5 cm in diameter, and was packed with fine sand. The sand was first saturated with brine and then with oil. Water flooding followed by bio-surfactant injection was carried out to determine the amount of oil recovered. Further, the injected bio-surfactant volume was varied to check how effectively oil recovery can be achieved. A comparative study was also done by injecting Triton X-100, a chemical surfactant. Since the bio-surfactant reduced the surface and interfacial tension, oil could be easily recovered from the porous sand-packed column.

Keywords: Bio-surfactant, bacteria, interfacial tension, sand column.

275 Using the Minnesota Multiphasic Personality Inventory-2 and Mini Mental State Examination-2 in Cognitive Behavioral Therapy: Case Studies

Authors: Cornelia-Eugenia Munteanu

Abstract:

From a psychological perspective, psychopathology is the area of clinical psychology that has at its core psychological assessment and psychotherapy. In day-to-day clinical practice, psychodiagnosis and psychotherapy are used independently, according to their intended purpose and their specific methods of application. The paper explores how the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and Mini Mental State Examination-2 (MMSE-2) psychological tools contribute to enhancing the effectiveness of cognitive behavioral psychotherapy (CBT). This combined approach, psychotherapy in conjunction with assessment of personality and cognitive functions, is illustrated by two cases, a severe depressive episode with psychotic symptoms and a mixed anxiety-depressive disorder. The order in which CBT, MMPI-2, and MMSE-2 were used in the diagnostic and therapeutic process was determined by the particularities of each case. In the first case, the sequence started with psychotherapy, followed by the administration of blue form MMSE-2, MMPI-2, and red form MMSE-2. In the second case, the cognitive screening with blue form MMSE-2 led to a personality assessment using MMPI-2, followed by red form MMSE-2; reapplication of the MMPI-2 due to the invalidation of the first profile, and finally, psychotherapy. The MMPI-2 protocols gathered useful information that directed the steps of therapeutic intervention: a detailed symptom picture of potentially self-destructive thoughts and behaviors otherwise undetected during the interview. The memory loss and poor concentration were confirmed by MMSE-2 cognitive screening. This combined approach, psychotherapy with psychological assessment, aligns with the trend of adaptation of the psychological services to the everyday life of contemporary man and paves the way for deepening and developing the field.

Keywords: Assessment, cognitive behavioral psychotherapy, MMPI-2, MMSE-2, psychopathology.

274 Application of Voltage Stability Indices for Proper Placement of STATCOM under Load Increase Scenario

Authors: A. S. Telang, P. P. Bedekar

Abstract:

In today’s world, electrical energy has become an indispensable component of all aspects of modern human life. Reliability, security and stability are the key aspects of any power system. Failure to meet any of these three aspects results in a great impediment to modern life. Modern power systems are being subjected to heavily stressed conditions leading to voltage stability problems. If the voltage stability problems are not mitigated through proper voltage stability assessment methods, cascading events may occur, which may lead to voltage collapse or blackout events. Modern FACTS devices such as the STATCOM are one of the measures to overcome blackout problems. As these devices are very costly, they must be installed properly at suitable locations, mostly at weak buses. Line voltage stability indices such as FVSI, Lmn and LQP play an important role in the identification of a weak bus. This paper presents an evaluation of these line stability indices for the assessment of reliable information about the closeness of the power system to voltage collapse. PSAT is a user-friendly MATLAB toolbox whose continuation power flow (CPF) feature has been extensively used for the placement of the STATCOM to assess stability. The novelty of the present research work lies in the fact that the active and reactive loads have been changed simultaneously at all the load buses under consideration. MATLAB code has been developed for this purpose and tested successfully on various standard IEEE test systems. The results for the standard IEEE 14-bus test system, specifically, are presented in this paper.
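For reference, the commonly cited forms of these line indices (values approaching 1.0 indicate proximity to voltage collapse) can be computed as below; the per-unit line data are illustrative, not IEEE 14-bus values, and the exact expressions used in the paper may differ slightly.

```python
import numpy as np

def fvsi(z, x, v_i, q_j):
    """Fast Voltage Stability Index: 4*Z^2*Qj / (Vi^2 * X)."""
    return 4 * z**2 * q_j / (v_i**2 * x)

def lmn(x, v_i, q_j, theta, delta):
    """Line stability index Lmn: 4*X*Qj / (Vi*sin(theta - delta))^2."""
    return 4 * x * q_j / (v_i * np.sin(theta - delta))**2

def lqp(x, v_i, p_i, q_j):
    """Line stability factor LQP: 4*(X/Vi^2)*(Qj + X*Pi^2/Vi^2)."""
    return 4 * (x / v_i**2) * (q_j + x * p_i**2 / v_i**2)

# Illustrative per-unit line data (hypothetical, not from the test system).
r, x = 0.02, 0.08
z = np.hypot(r, x)
v_i, p_i, q_j = 1.0, 0.6, 0.25
theta = np.arctan2(x, r)         # line impedance angle
delta = np.deg2rad(5.0)          # voltage angle difference between the two buses

print("FVSI =", round(fvsi(z, x, v_i, q_j), 3))
print("Lmn  =", round(lmn(x, v_i, q_j, theta, delta), 3))
print("LQP  =", round(lqp(x, v_i, p_i, q_j), 3))
```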

Keywords: Voltage stability analysis, voltage collapse, PSAT, CPF, VSI, FVSI, Lmn, LQP.

273 Reducing CO2 Emission Using EDA and Weighted Sum Model in Smart Parking System

Authors: Rahman Ali, Muhammad Sajjad, Farkhund Iqbal, Muhammad Sadiq Hassan Zada, Mohammed Hussain

Abstract:

Emission of carbon dioxide (CO2) has adversely affected the environment. One of the major sources of CO2 emission is transportation. In the last few decades, the increase in the mobility of people using vehicles has enormously increased the emission of CO2 into the environment. To reduce CO2 emission, a sustainable transportation system is required, in which smart parking is one of the important measures that needs to be established. To contribute to the issue of reducing the amount of CO2 emission, this research proposes a smart parking system. A cloud-based solution is provided to drivers that automatically searches for and recommends the most preferred parking slots. To determine the preferences of the parking areas, this methodology exploits a number of unique parking features, which ultimately results in the selection of a parking area that leads to the minimum level of CO2 emission from the current position of the vehicle. To realize the methodology, a scenario-based implementation is considered. During the implementation, a mobile application with GPS signals, vehicles with a number of vehicle features, and a list of parking areas with parking features are used together with sorting, multi-level filtering, exploratory data analysis (EDA), the Analytical Hierarchy Process (AHP) and the weighted sum model (WSM) to rank the parking areas and recommend to drivers the top-k most preferred parking areas. In the EDA process, “2020testcar-2020-03-03”, a freely available dataset, is used to estimate the CO2 emission of a particular vehicle. To evaluate the system, the results of the proposed system are compared with the conventional approach, which reveals that the proposed methodology outperforms the conventional one in reducing the emission of CO2 into the atmosphere.
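The ranking step can be sketched as below with the weighted sum model over hypothetical parking-area features; the feature set, weights and values are assumptions for illustration, not the paper's dataset.

```python
import numpy as np

# Columns: distance from vehicle (km), expected CO2 per trip (kg), free slots, fee per hour
parking = {
    "P1": [1.2, 0.28, 12, 2.0],
    "P2": [0.6, 0.14, 3, 3.5],
    "P3": [2.5, 0.55, 40, 1.0],
    "P4": [0.9, 0.21, 8, 2.5],
}
benefit = np.array([False, False, True, False])   # only "free slots" is more-is-better
weights = np.array([0.25, 0.45, 0.20, 0.10])      # hypothetical priorities (CO2 dominates)

names = list(parking)
M = np.array([parking[p] for p in names], dtype=float)
# Linear min-max normalization, inverted for cost-type features.
norm = (M - M.min(axis=0)) / (M.max(axis=0) - M.min(axis=0))
norm = np.where(benefit, norm, 1.0 - norm)
wsm = norm @ weights                              # weighted sum model score per parking area

top_k = 3
for rank, i in enumerate(np.argsort(-wsm)[:top_k], start=1):
    print(rank, names[i], round(float(wsm[i]), 3))
```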

Keywords: CO2 emission, IoT, EDA, Weighted Sum Model, WSM, regression, smart parking system.
