Search results for: web application architecture
3882 Application of Moringa Oleifera Seed in Removing Colloids from Turbid Wastewater
Authors: Zemmouri Hassiba, Lounici Hakim, Mameri Nabil
Abstract:
Dried crushed seeds of Moringa oleifera contain an effective soluble protein, a natural cationic polyelectrolyte that causes coagulation. The present study aims to investigate the performance of Moringa oleifera seed extract as a natural coagulant in the clarification of secondary treated wastewater highly charged with colloids. A series of jar tests was undertaken using raw wastewater from the secondary decanter of the Reghaia municipal wastewater treatment plant (MWWTP) located east of Algiers, Algeria. The coagulation-flocculation performance of Moringa oleifera was evaluated through supernatant residual turbidity. Various influencing parameters, namely Moringa oleifera dosage and pH, have been considered. Tests on Reghaia wastewater, having an initial turbidity of 129 NTU, showed a removal of 69.45% of residual turbidity with only 1.5 mg/l of Moringa oleifera. This removal capability encourages the use of this bioflocculant for the treatment of turbid waters. Based on this result, the coagulant seed extract of Moringa oleifera is well suited to clarifying municipal wastewater by removing turbidity. Indeed, Moringa oleifera, a natural resource available locally (in the south of Algeria), coupled with its non-toxicity, biocompatibility and biodegradability, may be a very interesting alternative to the conventional coagulants used so far.
Keywords: coagulation flocculation, colloids, moringa oleifera, secondary wastewater
Procedia PDF Downloads 314
3881 Application of Monitoring of Power Generation through GPRS Network in Rural Residences of Cabo Frio/RJ
Authors: Robson C. Santos, David D. Oliveira, Matheus M. Reis, Gerson G. Cunha, Marcos A. C. Moreira
Abstract:
The project demonstrates the construction of a solar power generation system whose "grid-tie" inverter equipment converts the direct current generated by the solar panels into alternating current with the same frequency and voltage parameters as the utility distribution network. The energy generated is quantified by a smart metering module that transmits the information at specified intervals to a microcontroller via a GSM modem. The modem makes the measured data available on the internet, using cellular networks and antennas. Monitoring, fault detection and maintenance are performed by a supervisory station. The board types employed, the inverter selection, and studies of the control equipment and devices are described. The article covers and explores the global trend of implementing smart electrical energy distribution networks and the incentive to use renewable solar energy. There is also the possibility of the excess energy produced by the system being purchased by the local power utility. This project was implemented in residences in the rural community of the municipality of Cabo Frio/RJ. Data could be observed through daily measurements during the month of November 2013.
Keywords: rural residence, supervisory, smart grid, solar energy
Procedia PDF Downloads 595
3880 Image Segmentation Techniques: Review
Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo
Abstract:
Image segmentation is the process of dividing an image into several sections, such as the foreground object and the background. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has addressed color images. Most image segmentation algorithms or techniques vary based on the input data and the application, and nearly all of them are unsuited to noisy environments. Much of the existing work uses the Markov Random Field (MRF), which is computationally heavy but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as easing the processing of an image, interpreting the contents of an image, and simplifying image analysis. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed over the years. The techniques include convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, thresholding techniques, and so on. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications of image segmentation and potential future developments around it. This review concludes that no single technique is perfectly suitable for segmenting all different types of images, but the use of hybrid techniques yields more accurate and efficient results.
Keywords: clustering-based, convolution-network, edge-based, region-growing
Procedia PDF Downloads 100
3879 Application of a Generalized Additive Model to Reveal the Relations between the Density of Zooplankton and Other Variables in the West Daya Bay, China
Authors: Weiwen Li, Hao Huang, Chengmao You, Jianji Liao, Lei Wang, Lina An
Abstract:
Zooplankton are a central topic in ecology, as they make a great contribution to maintaining the balance of an ecosystem and are critical in promoting the material cycle and energy flow within ecosystems. A generalized additive model (GAM) was applied to analyze the relationships between the density (individuals per m³) of zooplankton and other variables in West Daya Bay. All data used in this analysis (the survey month, survey station (longitude and latitude), the depth of the water column, the superficial concentration of chlorophyll a, the benthonic concentration of chlorophyll a, the number of zooplankton species and the total number of zooplankton individuals) were collected through monthly scientific surveys from January to December 2016. A generalized linear model (GLM) was used to select the variables with a significant impact on the density of zooplankton, and the GAM was employed to analyze the relationship between the density of zooplankton and those significant variables. The results showed that the density of zooplankton increased with an increase in the benthonic concentration of chlorophyll a, but decreased as the depth of the water column decreased. Both a high number of zooplankton species and a high overall number of zooplankton individuals led to a higher density of zooplankton.
Keywords: density, generalized linear model, generalized additive model, the West Daya Bay, zooplankton
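The additive structure behind a GAM can be sketched with a small backfitting loop, in which each smooth term is refit to the partial residuals of the others. The data below are synthetic stand-ins for two of the survey covariates (depth, benthic chlorophyll a) with an invented response, not the study's data, and the running-mean smoother is a deliberately crude substitute for the spline smoothers a real GAM package would use.

```python
import numpy as np

def running_mean_smoother(x, y, window=11):
    """Crude 1-D smoother: moving average of y taken in order of x."""
    order = np.argsort(x)
    kernel = np.ones(window) / window
    smoothed = np.convolve(y[order], kernel, mode="same")
    out = np.empty_like(smoothed)
    out[order] = smoothed
    return out

def backfit_gam(X, y, n_iter=20):
    """Fit y ~ alpha + sum_j f_j(X[:, j]) by backfitting partial residuals."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f[:, np.arange(p) != j].sum(axis=1)
            f[:, j] = running_mean_smoother(X[:, j], partial)
            f[:, j] -= f[:, j].mean()  # center each f_j for identifiability
    return alpha, f

# Synthetic stand-ins for two covariates of zooplankton density
rng = np.random.default_rng(1)
n = 400
depth = rng.uniform(2.0, 20.0, n)   # invented water-column depth (m)
chl = rng.uniform(0.1, 5.0, n)      # invented benthic chlorophyll a (mg/m3)
density = 100.0 + 8.0 * depth + 30.0 * np.log(chl) + rng.normal(0.0, 5.0, n)

X = np.column_stack([depth, chl])
alpha, f = backfit_gam(X, density)
pred = alpha + f.sum(axis=1)
r2 = 1.0 - np.sum((density - pred) ** 2) / np.sum((density - density.mean()) ** 2)
```

Plotting each fitted `f[:, j]` against its covariate is how GAM studies like this one visualize the (possibly nonlinear) effect of each variable on density.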
Procedia PDF Downloads 156
3878 Cost Analysis of Optimized Fast Network Mobility in IEEE 802.16e Networks
Authors: Seyyed Masoud Seyyedoshohadaei, Borhanuddin Mohd Ali
Abstract:
To support group mobility, the NEMO Basic Support Protocol has been standardized as an extension of Mobile IP that enables an entire network to change its point of attachment to the Internet. Using NEMO in IEEE 802.16e (WiMAX) networks causes latency in the handover procedure and affects the seamless communication of real-time applications. To decrease handover latency and service disruption time, an integrated scheme named Optimized Fast NEMO (OFNEMO) was introduced by the authors of this paper. OFNEMO uses a pre-established multi-tunnel concept, cross-function optimization and cross-layer design. In this paper, an analytical model is developed to evaluate the total cost, consisting of the signaling and packet delivery costs, of OFNEMO compared with RFC 3963. Results show that OFNEMO increases the probability of predictive mode compared with RFC 3963 due to its smaller handover latency. Even though OFNEMO needs extra signaling to pre-establish multiple tunnels, it has a lower total cost thanks to its optimized algorithm. OFNEMO can minimize handover latency to support real-time applications in moving networks.
Keywords: fast mobile IPv6, handover latency, IEEE 802.16e, network mobility
Procedia PDF Downloads 198
3877 Physicochemical and Functional Characteristics of Hemp Protein Isolate
Authors: El-Sohaimy Sobhy A., Androsova Natalia, Toshev Abuvali Djabarovec
Abstract:
The conditions for isolating protein from hemp seeds were optimized in the current work, and the physicochemical and functional properties of the hemp protein isolate were evaluated for its potential application in food manufacturing. Edestin is the most predominant protein in the protein profile, with a molecular weight of 58.1 kDa, besides albumin with a molecular weight of 31.5 kDa. The FTIR spectrum detected the absorption peaks of amide I at 1750 and 1600 cm⁻¹, which pointed to C=O stretching, while N-H stretching appeared at 1650-1580 cm⁻¹. The peak at 3250 cm⁻¹ was related to N-H stretching of primary aliphatic amines (3400-3300 cm⁻¹), and N-H stretching for secondary amines appeared at 3350-3310 cm⁻¹. Hemp protein isolate (HPI) showed a high content of arginine (15.52 g/100 g), phenylalanine + tyrosine (9.63 g/100 g), methionine + cysteine (5.49 g/100 g), leucine + isoleucine (5.21 g/100 g) and valine (4.53 g/100 g). It contains moderate levels of threonine (3.29 g/100 g) and lysine (2.50 g/100 g), with the limiting amino acid being tryptophan (0.22 g/100 g HPI). HPI showed high water-holding capacity (4.5 ± 2.95 ml/g protein) and oil-holding capacity (2.33 ± 1.88 ml/g) values. The foaming capacity of HPI increased with increasing pH, reaching its maximum value at pH 11 (67.23 ± 3.20%). The highest emulsion ability index of HPI was noted at pH 9 (91.3 ± 2.57 m²/g), with low stability (19.15 ± 2.03).
Keywords: Cannabis sativa ssp., protein isolate, isolation conditions, amino acid composition, chemical properties, functional properties
Procedia PDF Downloads 183
3876 d-Block Metal Nanoparticles Confined in Triphenylphosphine Oxide Functionalized Core-Crosslinked Micelles for the Application in Biphasic Hydrogenation
Authors: C. Joseph Abou-Fayssal, K. Philippot, R. Poli, E. Manoury, A. Riisager
Abstract:
The use of soluble polymer-supported metal nanoparticles (MNPs) has received significant attention for the ease of catalyst recovery and recycling. Of particular interest are MNPs supported on polymers that are either soluble in water or form stable colloidal dispersions in water, as this allows the advantages of the aqueous biphasic protocol to be combined with the catalytic performance of MNPs. The objective is to achieve good confinement of the catalyst in the nanoreactor cores and thus better catalyst recovery, in order to overcome the previously witnessed MNP extraction. Inspired by previous results, we are interested in the design of polymeric nanoreactors functionalized with ligands able to solidly anchor metal nanoparticles, in order to control the activity and selectivity of the developed nanocatalysts. The nanoreactors are core-crosslinked micelles (CCM) synthesized by reversible addition-fragmentation chain transfer (RAFT) polymerization. Varying the nature of the core-linked functionalities allows us to obtain differently stabilized metal nanoparticles and thus compare their performance in the catalyzed aqueous biphasic hydrogenation of model substrates. Particular attention is given to catalyst recyclability.
Keywords: biphasic catalysis, metal nanoparticles, polymeric nanoreactors, catalyst recovery, RAFT polymerization
Procedia PDF Downloads 106
3875 The Post-Thawing Quality of Boer Goat Semen after Freezing by Mr. Frosty System Using Commercial Diluter
Authors: Gatot Ciptadi, Mudawamah, R. P. Putra, S. Wahjuningsih, A. M. Munazaroh
Abstract:
The success rate of artificial insemination (AI), particularly in the field at the farmer level, is highly dependent on the post-thawing quality of the sperm. The objective of this research was to determine the effect of a freezing method (-1 °C/minute) using the Mr. Frosty system with commercial diluents on the post-thawing quality of Boer goat semen. The method used was a completely randomized design (CRD) with 4 treatments of commercial diluent percentage (v/v). Semen was cryopreserved at two final temperatures: –45 °C (freezer) and –196 °C (liquid nitrogen). Results showed that different commercial diluents influenced the viability, motility and abnormality of Boer semen. Pre-freezing viability, motility and abnormality were 88.67 ± 4.16%, 66.33 ± 1.53% and 4.67 ± 0.57%, respectively. Meanwhile, post-thawing quality was considered good, exceeding the standard of at least 40% motility (51.0 ± 6.5%). The percentage of commercial diluents had a highly significant influence (P<0.01). The best diluent ratio was 1:4 (v/v) for both final storage temperatures. However, sperm frozen and conserved at –196 °C were better than at –45 °C (i.e., motility 39.3 ± 3.94% vs. 51.0 ± 6.5%). It was concluded that the Mr. Frosty system is a feasible method for freezing semen for practical purposes.
Keywords: sperm quality, goat, viability, diluter
Procedia PDF Downloads 263
3874 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media, as it can attach the user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize the existing places based on the same individuals visiting them. Then, we combine the Twitter data from the tracking results with questionnaire data to capture the Twitter user profile. To do so, we used frequency distribution analysis to determine the percentage of visitors. To validate the hypothesis, we compared the results with the local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and questionnaire data. Thus, integrating the Twitter data and survey data can reveal the profile of social media users.
Keywords: geolocation, Twitter, distribution analysis, human mobility
Procedia PDF Downloads 316
3873 Computational Analysis and Daily Application of the Key Neurotransmitters Involved in Happiness: Dopamine, Oxytocin, Serotonin, and Endorphins
Authors: Hee Soo Kim, Ha Young Kyung
Abstract:
Happiness and pleasure are a result of dopamine, oxytocin, serotonin, and endorphin levels in the body. In order to increase these four neurochemical levels, it is important to associate daily activities with their corresponding neurochemical releases. This includes setting goals, maintaining social relationships, laughing frequently, and exercising regularly. The likelihood of experiencing happiness increases when all four neurochemicals are released at optimal levels. The achievement of happiness is important because it increases health, productivity, and the ability to overcome adversity. To understand how emotions are processed, electrical brain waves, brain structure, and neurochemicals must be analyzed. This research uses Chemcraft and Avogadro to determine the theoretical and chemical properties of the four neurochemical molecules. Each neurochemical molecule's thermodynamic stability is calculated to assess the efficiency of the molecules. The study found that among dopamine, oxytocin, serotonin, and alpha-, beta-, and gamma-endorphin, beta-endorphin has the lowest optimized energy, 388.510 kJ/mol. Beta-endorphin, a neurotransmitter involved in mitigating pain and stress, is thus the most thermodynamically stable and efficient of the molecules involved in the process of happiness. Through examining such properties of happiness neurotransmitters, the science of happiness is better understood.
Keywords: happiness, neurotransmitters, positive psychology, dopamine, oxytocin, serotonin, endorphins
Procedia PDF Downloads 155
3872 Laser-Based Microfabrication of a Microheater Chip for Cell Culture
Authors: Daniel Nieto, Ramiro Couceiro
Abstract:
Microfluidic chips have demonstrated significant application potential in microbiological processing and chemical reactions, with the goal of developing monolithic, compact, chip-sized multifunctional systems. Heat generation and thermal control are critical in some biochemical processes. This paper presents a laser direct-write technique for the rapid prototyping and manufacturing of microheater chips and its applicability to perfusion cell culture outside a cell incubator. The aim of the microheater is to take over the role of conventional incubators for cell culture, facilitating microscopic observation and other online monitoring activities during culture, and to make cell culture operation portable. Microheaters (5 mm × 5 mm) have been successfully fabricated on soda-lime glass substrates covered with a 120 nm thick aluminum layer. Experimental results show that the microheaters exhibit good temperature rise and decay characteristics, with localized heating at targeted spatial domains. These microheaters were suitable for a maximum long-term operating temperature of 120 °C and were validated for long-term operation at 37 °C for 24 hours. Results demonstrated that the physiology of the SW480 colon adenocarcinoma cell line cultured on the developed microheater chip was consistent with that of an incubator.
Keywords: laser microfabrication, microheater, bioengineering, cell culture
Procedia PDF Downloads 299
3871 Advances in Design Decision Support Tools for Early-Stage Energy-Efficient Architectural Design: A Review
Authors: Maryam Mohammadi, Mohammadjavad Mahdavinejad, Mojtaba Ansari
Abstract:
The main driving forces behind the increasing movement towards the design of High-Performance Buildings (HPB) are building codes and rating systems that address the various components of the building and their impact on the environment and energy conservation through methods such as prescriptive or simulation-based approaches. The methods and tools developed to meet these needs, which are often based on building performance simulation tools (BPST), have limitations in terms of compatibility with the integrated design process (IDP) and HPB design, as well as use by architects in the early stages of design (when the most important decisions are made). To overcome these limitations, efforts have been made in recent years to develop design decision support systems, which are often based on artificial intelligence. Numerous needs and steps for designing and developing a Decision Support System (DSS) that complies with the early stages of energy-efficient architectural design, consisting of combinations of different methods in an integrated package, have been listed in the literature. While various review studies have examined each of these techniques (such as optimization, sensitivity and uncertainty analysis, etc.) and their integration with specific targets, this article is a critical and holistic review of the research that leads to the development of applicable systems or the introduction of a comprehensive framework for developing models that comply with the IDP. Information resources such as Science Direct and Google Scholar were searched using specific keywords, and the results are divided into two main categories: simulation-based DSSs and meta-simulation-based DSSs. The strengths and limitations of different models are highlighted, two general conceptual models are introduced for each category, and the degree of compliance of these models with the IDP framework is discussed.
The research shows a movement towards Multi-Level of Development (MOD) models that are well integrated with the early stages of integrated design (the schematic design and design development stages), are heuristic, hybrid and meta-simulation-based, and rely on big real-world data (such as Building Energy Management System data or web data). Obtaining, using and combining these data with simulation data to create models that handle higher uncertainty, are more dynamic, and are more sensitive to context and culture, as well as models that can generate economical, energy-efficient design scenarios using local data (to be more harmonized with circular economy principles), are important research areas in this field. The results of this study are a roadmap for researchers and developers of these tools.
Keywords: integrated design process, design decision support system, meta-simulation based, early stage, big data, energy efficiency
Procedia PDF Downloads 163
3870 Energy Analysis of Sugarcane Production: A Case Study in Metehara Sugar Factory in Ethiopia
Authors: Wasihun Girma Hailemariam
Abstract:
Energy is one of the key elements required for every agricultural activity, especially for large-scale agricultural production such as sugarcane cultivation, which is mostly used to produce sugar and bioethanol. In such resource-intensive activities, an energy analysis of the production system and a search for alternatives that can reduce the energy inputs of the sugarcane production process are steps toward better resource management. The purpose of this study was to determine the input energy (direct and indirect) per hectare in the sugarcane production sector of the Metehara sugar factory in Ethiopia. The total energy consumption of the production system was 61,642 MJ/ha-yr. This total input energy is a cumulative value of the different inputs (direct and indirect) in the production system. The contribution of these different inputs is discussed, and a scenario of substituting the most influential input with an alternative that can replace its nutrient content is considered. In this study, the most influential input on energy consumption was the application of organic fertilizer, which accounted for 50% of the total. Filter cake, a residue from sugar production in the factory, was used to substitute the organic fertilizer, and the resulting reduction in the energy consumption of sugarcane production is discussed.
Keywords: energy analysis, organic fertilizer, resource management, sugarcane
Procedia PDF Downloads 161
3869 Numerical Investigation on Optimizing Fatigue Life in a Lap Joint Structure
Authors: P. Zamani, S. Mohajerzadeh, R. Masoudinejad, K. Farhangdoost
Abstract:
The riveting process is one of the important methods of fastening lap joints in aircraft structures, and an important application of riveting is the construction of aircraft fuselage structures. Failure of aircraft lap joints depends directly on the stress field in the joint. In this paper, a 3D finite element analysis is carried out in order to optimize the residual stress field in a riveted lap joint and to estimate its fatigue life. A number of experiments are then designed and analyzed using design of experiments (DOE), and the Taguchi method is used to select an optimized case among the different levels of each factor. In addition, the factor that most affects the residual stress field is investigated. The optimized case provides the maximum residual stress field. The fatigue life of the optimized joint is estimated by the Paris-Erdogan law. Stress intensity factors (SIFs) are calculated using both finite element analysis and an experimental formula, taking into account the effects of the residual stress field, geometry, and secondary bending. A good agreement is found between the results of the two methods. Comparison between the optimized fatigue life and the fatigue life of other joints shows an improvement in the joint's life.
Keywords: fatigue life, residual stress, riveting process, stress intensity factor, Taguchi method
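The Paris-Erdogan life estimate used in studies like this one can be sketched by numerically integrating da/dN = C(ΔK)^m from an initial to a critical crack length. The material constants, crack sizes, and geometry factor below are illustrative assumptions (roughly aluminum-like orders of magnitude), not the paper's values.

```python
import math

def paris_life_cycles(delta_sigma, a0, ac, C, m, Y=1.12, steps=20000):
    """Integrate the Paris-Erdogan law da/dN = C * (dK)^m, with
    dK = Y * d_sigma * sqrt(pi * a), from crack length a0 to ac.
    Returns the number of load cycles."""
    n_cycles = 0.0
    da = (ac - a0) / steps
    a = a0
    for _ in range(steps):
        delta_k = Y * delta_sigma * math.sqrt(math.pi * a)
        n_cycles += da / (C * delta_k ** m)  # dN = da / (C * dK^m)
        a += da
    return n_cycles

# Hypothetical values (illustrative only):
# d_sigma in MPa, crack lengths in m, C in (m/cycle)/(MPa*sqrt(m))^m
N = paris_life_cycles(delta_sigma=100.0, a0=1e-3, ac=10e-3, C=1e-11, m=3.0)
```

In the paper's setting, the SIF range ΔK would come from the finite element or experimental SIF calculation (including residual stress and secondary bending) rather than the simple Y·Δσ·√(πa) handbook form used here.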
Procedia PDF Downloads 454
3868 BIM Model and Virtual Prototyping in Construction Management
Authors: Samar Alkindy
Abstract:
Purpose: The BIM model has been used to support the planning of different construction projects in the industry by showing the different stages of the construction process. The model has been instrumental in identifying some of the common errors in the construction process through the spatial arrangement. The continuous use of the BIM model in the construction industry has resulted in various radical changes, such as virtual prototyping. Construction virtual prototyping is a highly advanced technology that incorporates a BIM model with realistic graphical simulations and facilitates the simulation of the project before the product is built. The paper presents virtual prototyping in the construction industry by examining its application, challenges and benefits for a construction project. Methodology: Case studies were conducted on four major construction projects that incorporate virtual construction prototyping in several stages of the construction project. Furthermore, interviews were administered with the project managers, engineers and planning managers. Findings: Data collected through this methodological approach show a positive response to virtual construction prototyping in construction, especially concerning communication and visualization. Furthermore, the use of virtual prototyping has increased collaboration and efficiency between the construction experts handling a project. During the planning stage, virtual prototyping has increased accuracy, reduced planning time, and reduced the amount of rework during the implementation stage. Although virtual prototyping is a new concept in the construction industry, the findings indicate that the approach will benefit the management of construction projects.
Keywords: construction operations, construction planning, process simulation, virtual prototyping
Procedia PDF Downloads 234
3867 Analytical Derivatives: Importance in Environment and Water Analysis/Cycle
Authors: Adesoji Sodeinde
Abstract:
Analytical derivatives have recently undergone explosive growth in the area of separation techniques, as well as in the detectability of certain compounds and concentrated ions. The gloomy and depressing scenario that characterized the application of analytical derivatives in the areas of water analysis, the water cycle and the environment should not be allowed to continue unabated. Thanks to technological advancement in chemical/biochemical analysis, separation techniques are widely used in medical and forensic areas and to measure and assess the environmental and socio-economic impact of alternative control strategies. This technological improvement is duly established in comparisons between separation/detection techniques that bring about vital results in forensics (for example, gas-liquid chromatography provides the evidence given in a court of law during the prosecution of drunk drivers). Water quality analysis, pH and water temperature analysis can be performed in the field, and the concentration of dissolved free amino acids (DFAA) can also be detected through separation techniques. Some important derivatives/ions used in separation techniques are: water analysis - total water hardness (EDTA to determine Ca and Mg ions); gas-liquid chromatography - carrier gases such as helium (He) or nitrogen (N₂); water cycle - animal bone charcoal, activated carbon and ultraviolet (UV) light.
Keywords: analytical derivative, environment, water analysis, chemical/biochemical analysis
Procedia PDF Downloads 341
3866 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network
Authors: Nasrin Bakhshizadeh, Ashkan Forootan
Abstract:
A major problem affecting quality control in industrial polymerization is the lack of suitable on-line measurement tools for evaluating polymer properties such as the melt and density indices. In the conventional approach, the polymerization is controlled manually by taking samples, measuring the quality of the polymer in the lab, and recording the results. This method is highly time-consuming and leads to the production of large amounts of off-specification product. The online application for estimating melt index and density proposed in this study is a neural network based on the input-output data of a polyethylene production plant. The temperature, the level of the reactors' bed, the mass flow rates of ethylene, hydrogen and butene-1, and the molar concentrations of ethylene, hydrogen and butene-1 are used to establish the neural model of the process. The neural network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulated results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, is able to successfully predict those quality properties.
Keywords: polyethylene, polymerization, density, melt index, neural network
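A minimal sketch of the kind of network described, one hidden layer trained by back-propagation, is given below on synthetic stand-in data. Plain full-batch gradient descent is used here instead of Levenberg-Marquardt, and the inputs and the target relation are invented for illustration; they are not plant data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for plant inputs (e.g. temperature, bed level, flows)
X = rng.uniform(0.0, 1.0, (500, 3))
# Invented smooth relation playing the role of the melt-index target
y = (0.5 * X[:, 0] + 0.3 * np.sin(2.0 * X[:, 1]) + 0.2 * X[:, 2] ** 2)[:, None]

# One hidden layer (tanh), trained by plain full-batch back-propagation
W1 = rng.normal(0.0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    err = h @ W2 + b2 - y               # prediction error
    gW2 = h.T @ err / len(X)            # gradients of 0.5 * MSE
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)  # back-propagate through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

In an industrial soft-sensor setting, the same structure would be trained on historical plant input-output records, with one such network per predicted property (density, melt index).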
Procedia PDF Downloads 145
3865 Aerodynamic Design of a Light Long Range Blended Wing Body Unmanned Vehicle
Authors: Halison da Silva Pereira, Ciro Sobrinho Campolina Martins, Vitor Mainenti Leal Lopes
Abstract:
Long-range performance is a goal of aircraft configuration optimization. The Blended Wing Body (BWB) is presented in much of the literature as the most aerodynamically efficient design for a fixed-wing aircraft. Because of its high weight-to-thrust ratio, the BWB is the ideal configuration for many Unmanned Aerial Vehicle (UAV) missions in geomatics applications. In this work, a BWB aerodynamic design for a typical light geomatics payload is presented. Aerodynamic non-dimensional coefficients are predicted using low-Reynolds-number computational techniques (a 3D panel method), and wing parameters such as aspect ratio, taper ratio, wing twist and sweep are optimized for high cruise performance and flight quality. The methodology of this work is a summary of tailless aircraft wing design and its application, with appropriate computational schemes, to light UAVs subjected to low-Reynolds-number flows. It leads to conclusions such as the higher performance and flight quality of thicker airfoils in the airframe body and the benefits of using aerodynamic twist rather than purely geometric twist.
Keywords: blended wing body, low Reynolds number, panel method, UAV
Procedia PDF Downloads 589
3864 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact
Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed
Abstract:
The Bayesian network (BN) is one of the most efficient classification methods and is widely used in several fields (e.g., medical diagnostics, risk analysis, and bioinformatics research). A BN is a probabilistic graphical model that represents a formalism for reasoning under uncertainty, and this classification method has a high performance rate in extracting new knowledge from data. The construction of the model consists of two phases: structure learning and parameter learning. For the structure learning problem, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating expert knowledge into the structure learning process allows the highest accuracy to be obtained. In this paper, we propose a hybrid approach combining an improved K2 algorithm, called K2 for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, proves that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
Keywords: Bayesian network, classification, expert knowledge, structure learning, surface water analysis
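The score-and-search idea behind K2 can be sketched as follows, using the Cooper-Herskovits score and a greedy parent search over a fixed node ordering on synthetic binary data. This is a simplified illustration of plain K2, not the authors' K2PC variant or their expert-knowledge integration.

```python
import itertools
import math
import numpy as np

def k2_family_log_score(data, child, parents, arity=2):
    """Cooper-Herskovits K2 log-score for one node given a parent set."""
    r = arity
    score = 0.0
    for cfg in itertools.product(range(r), repeat=len(parents)):
        mask = np.ones(len(data), dtype=bool)
        for p, v in zip(parents, cfg):
            mask &= data[:, p] == v
        counts = np.bincount(data[mask, child], minlength=r)
        n_ij = counts.sum()
        score += math.lgamma(r) - math.lgamma(n_ij + r)   # (r-1)!/(N_ij+r-1)!
        score += sum(math.lgamma(c + 1) for c in counts)  # prod_k N_ijk!
    return score

def k2_search(data, order, max_parents=2, arity=2):
    """Greedy K2: for each node, keep adding the parent (earlier in the
    ordering) that most improves the score, until no improvement."""
    parents = {i: [] for i in order}
    for idx, child in enumerate(order):
        current = k2_family_log_score(data, child, parents[child], arity)
        while len(parents[child]) < max_parents:
            scored = [(k2_family_log_score(data, child, parents[child] + [c], arity), c)
                      for c in order[:idx] if c not in parents[child]]
            if not scored:
                break
            best, best_c = max(scored)
            if best <= current:
                break
            parents[child].append(best_c)
            current = best
    return parents

# Synthetic binary data with true structure 0 -> 1 -> 2
rng = np.random.default_rng(7)
x0 = rng.integers(0, 2, 2000)
x1 = (x0 ^ (rng.random(2000) < 0.1)).astype(int)  # noisy copy of x0
x2 = (x1 ^ (rng.random(2000) < 0.1)).astype(int)  # noisy copy of x1
data = np.column_stack([x0, x1, x2])

learned = k2_search(data, order=[0, 1, 2])
```

A hybrid approach like the paper's would constrain or seed this search with expert-specified edges instead of relying on the score alone.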
Procedia PDF Downloads 129
3863 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs
Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar
Abstract:
The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbon from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast or decision where there is significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, an algorithm for simulation has been developed in MATLAB; the program performs simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, minimum, maximum, most likely, etc.). It also prompts the user for the desired probability at which reserves are to be calculated. The algorithm so developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting have been imported to generate better outcomes. With PyQt Designer, code for a simple graphical user interface has also been written. The graph so plotted is then validated against results from the U.S. DOE MonteCarlo software.
Keywords: simulation, probability, confidence interval, sensitivity analysis
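As an illustration of the kind of estimate described above, the following NumPy sketch runs a volumetric Monte Carlo for original oil in place on assumed input distributions. The distribution shapes and values are placeholders, not the case-study data set used with the DOE tool.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Input distributions (illustrative values only):
area = rng.triangular(400, 600, 900, n)            # drainage area, acres
thickness = rng.triangular(20, 35, 60, n)          # net pay, ft
porosity = rng.normal(0.18, 0.02, n).clip(0.05, 0.35)
sw = rng.normal(0.30, 0.05, n).clip(0.05, 0.80)    # water saturation
bo = rng.triangular(1.10, 1.20, 1.35, n)           # formation volume factor, RB/STB

# Volumetric OOIP in stock-tank barrels (7758 bbl per acre-ft)
ooip = 7758 * area * thickness * porosity * (1 - sw) / bo

# P90 is the value exceeded with 90% probability (10th percentile of the sample)
p90, p50, p10 = np.percentile(ooip, [10, 50, 90])
print(f"P90 = {p90:,.0f} STB, P50 = {p50:,.0f} STB, P10 = {p10:,.0f} STB")
```

The P90/P50/P10 triple is exactly the stochastic reserves statement the abstract argues for: instead of one deterministic number, the user reads off the reserve volume at any desired confidence level.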
Procedia PDF Downloads 385
3862 An Interactive Platform Displaying Mixed Reality Media
Authors: Alfred Chen, Cheng Chieh Hsu, Yu-Pin Ma, Meng-Jie Lin, Fu Pai Chiu, Yi-Yan Sie
Abstract:
This study attempts to construct a human-computer interactive platform system that mainly consists of an augmented hardware system, a software system, a display table, and mixed media. This system provides human-computer interaction services through an interactive platform for the tourism industry. A well-designed interactive platform, integrating augmented reality and mixed media, has the potential to enhance museum display quality and diversity. Besides, it will create a comprehensive and creative display mode for most museums and historical heritage sites. Therefore, it is essential to let the public understand what the platform is, how it functions, and most importantly how one builds an interactive augmented platform. Hence the authors elaborate the construction process of the platform in detail. Three issues are considered: 1) the theory and application of augmented reality, 2) the hardware and software applied, and 3) the mixed media presented. In order to describe how the platform works, the Courtesy Door of Tainan Confucius Temple has been selected as the case study. As a result, a developed interactive platform is presented, showing the physical entity object along with virtual mixed media such as text, images, animation, and video. This platform provides diversified and effective information to its users.
Keywords: human-computer interaction, mixed reality, mixed media, tourism
Procedia PDF Downloads 491
3861 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation
Authors: Rizwan Rizwan
Abstract:
This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL utilizes three distinct architectures to ensure privacy is never compromised. It involves training individual edge devices and aggregating their models on a server without sharing raw data. This approach not only provides secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. To address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme combined with an adaptive gradient descent strategy and a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background knowledge attacks.
Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats
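A minimal sketch of one such privacy-preserving aggregation round is shown below, assuming a linear model and FedAvg-style averaging with per-client update clipping and Gaussian noise. The clipping norm and noise multiplier are illustrative, and this is a generic construction, not the paper's exact scheme.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local gradient descent on a linear model (squared loss)."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def dp_federated_round(w_global, clients, clip=1.0, noise_mult=0.5, rng=None):
    """One FedAvg round: clip each client's model delta, average, add Gaussian noise."""
    rng = rng if rng is not None else np.random.default_rng()
    deltas = []
    for X, y in clients:
        delta = local_update(w_global, X, y) - w_global
        # Clip the update so its L2 norm is at most `clip` (bounds sensitivity);
        # only this clipped delta ever leaves the client, never the raw (X, y).
        delta = delta / max(1.0, np.linalg.norm(delta) / clip)
        deltas.append(delta)
    avg = np.mean(deltas, axis=0)
    # Gaussian noise scaled to the clipping bound and the number of clients
    noise = rng.normal(0.0, noise_mult * clip / len(clients), size=avg.shape)
    return w_global + avg + noise
```

Running several rounds on synthetic client data drives the global model toward the clients' shared optimum, while every transmitted update is norm-bounded and noised; the noise multiplier is the knob that trades convergence accuracy against the strength of the privacy guarantee.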
Procedia PDF Downloads 34
3860 Fundamental Theory of the Evolution Force: Gene Engineering utilizing Synthetic Evolution Artificial Intelligence
Authors: L. K. Davis
Abstract:
The effects of the evolution force are observable in nature at all structural levels, ranging from small molecular systems to conversely enormous biospheric systems. However, the evolution force and the work associated with the formation of biological structures have yet to be described mathematically or theoretically. In addressing this conundrum, we consider evolution from a unique perspective and in doing so introduce the "Fundamental Theory of the Evolution Force" (FTEF). We utilized synthetic evolution artificial intelligence (SYN-AI) to identify genomic building blocks and to engineer 14-3-3 ζ docking proteins by transforming gene sequences into time-based DNA codes derived from protein hierarchical structural levels. The aforementioned served as templates for random DNA hybridizations and genetic assembly. The application of hierarchical DNA codes allowed us to fast-forward evolution while dampening the effect of point mutations. Natural selection was performed at each hierarchical structural level, and mutations were screened using Blosum 80 mutation frequency-based algorithms. Notably, SYN-AI engineered a set of three architecturally conserved docking proteins that retained the motion and vibrational dynamics of native Bos taurus 14-3-3 ζ.
Keywords: 14-3-3 docking genes, synthetic protein design, time-based DNA codes, writing DNA code from scratch
Procedia PDF Downloads 118
3859 Evaluation of UI for 3D Visualization-Based Building Information Applications
Authors: Monisha Pattanaik
Abstract:
In scenarios where users have to work with large hierarchical data structures combined with visualizations (for example, construction 3D models, manufacturing equipment models, Gantt charts, building plans), the data structures have a high density, consisting of multiple parent nodes up to 50 levels deep together with their siblings and descendants, and therefore convey an immediate feeling of complexity. With customers moving to consumer-grade enterprise software, it is crucial to make sophisticated features available on touch devices and smaller screen sizes. This paper evaluates a UI component that allows users to scroll through all density levels using a slider overlay on top of the hierarchy table, performing several actions to focus on one set of objects at any point in time. This overlay component also solves the problem of excessive horizontal scrolling of the entire table on a fixed pane for a hierarchical table. The component can be customized to navigate through parents only, siblings only, or a specific part of the hierarchy. The evaluation of the UI component was carried out by end users of the application and Human-Computer Interaction (HCI) experts to test its usability, with statistical results and recommendations for handling complex hierarchical data visualizations.
Keywords: building information modeling, digital twin, navigation, UI component, user interface, usability, visualization
Procedia PDF Downloads 140
3858 Learning to Transform, Transforming to Learn: An Exploration of Teacher Professional Learning in the 4Cs (Communication, Collaboration, Creativity and Critical Reflection) in the Primary (K-6) Setting
Authors: Susan E Orlovich
Abstract:
Ongoing, effective teacher professional learning is acknowledged as a critical influence on teacher practice. However, it is unclear whether the elements of effective professional learning result in transformed teacher practice in the classroom. This research project is interested in 4C teacher professional learning. The professional learning practices that assist teachers in transforming their practice to integrate the 4C capabilities seldom feature in the academic literature. The 4Cs are a shorthand way of representing the concepts of communication, collaboration, creativity, and critical reflection, and refer to the capabilities needed for deeper learning, personal growth, and effective participation in society. The New South Wales curriculum review (2020) acknowledges that identifying, teaching, and assessing the 4C capabilities are areas of challenge for teachers. However, it also recognises that it is essential for teachers to build the confidence and capacity to understand, teach and assess the capabilities necessary for learners to thrive in the 21st century. This qualitative research project explores the professional learning experiences of sixteen teachers in four different primary (K-6) settings in Sydney, Australia, who are learning to integrate, teach and assess the 4Cs. The project draws on the Theory of Practice Architecture as a framework to analyse and interpret teachers' experiences at each site. The sixteen participants are teachers from four primary settings and include early career teachers, experienced teachers, and teachers in leadership roles (including the principal). In addition, some of the participants are teachers learning within a Community of Practice (CoP), as their school setting is engaged in a 4C professional learning Community of Practice.
Qualitative and arts-informed research methods are utilised to examine the cultural-discursive, social-political, and material-economic practice arrangements of each site, to explore how these arrangements may have shaped the professional learning experiences of teachers, and, in turn, how they influence the teaching of the 4Cs in the setting. The research is in the data analysis stage (October 2022), with preliminary findings pending. The research objective is to investigate the elements of the professional learning experiences undertaken by teachers to teach the 4Cs in the primary setting. The lens of practice architectures theory is used to identify the influence of the practice architectures on critical praxis at each site and to examine how the practice arrangements enable or constrain the teaching of the 4C capabilities. This research aims to offer deep insight into the practice arrangements that may enable or constrain teacher professional learning in the 4Cs. Such insight may contribute to a better understanding of the practices that enable teachers to transform their practice and achieve the integration, teaching, and assessment of the 4C capabilities.
Keywords: 4Cs, communication, collaboration, creativity, critical reflection, teacher professional learning
Procedia PDF Downloads 110
3857 Analysis of Two Methods to Estimation Stochastic Demand in the Vehicle Routing Problem
Authors: Fatemeh Torfi
Abstract:
Estimation of stochastic demand in physical distribution in general, and efficient transport route management in particular, is emerging as a crucial factor in the urban planning domain. It is particularly important in municipalities such as Tehran, where sound demand management calls for a realistic analysis of the routing system. The methodology involved critically investigating a fuzzy least-squares linear regression approach (FLLR) to estimate the stochastic demands in the vehicle routing problem (VRP), bearing in mind the customers' preference order. An FLLR method is proposed for solving the VRP with stochastic demands. An approximate-distance fuzzy least-squares (ADFL) estimator is applied to original data taken from a case study. The SSR values of the ADFL estimator and the real demand are obtained and then compared to the SSR values of the nominal demand and the real demand. Empirical results showed that the proposed methods are viable for solving problems under circumstances of vague and imprecise performance ratings. The results further proved that the application of the ADFL estimator was a realistic and efficient way to face the stochastic demand challenges in vehicle routing system management and to solve relevant problems.
Keywords: fuzzy least-squares, stochastic, location, routing problems
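To make the least-squares machinery concrete, here is a minimal sketch of fitting a linear model to demand observations expressed as symmetric triangular fuzzy numbers (center, spread). This decoupled formulation, where the squared distance over the lower, modal and upper points separates into one ordinary least-squares fit for centers and one for spreads, is a generic fuzzy least-squares construction, not the authors' ADFL estimator; the data values are invented.

```python
import numpy as np

def fuzzy_least_squares(x, centers, spreads):
    """Fit a linear model to symmetric triangular fuzzy observations (c_i, s_i).

    Minimizing the sum of squared distances between observed and predicted
    lower, modal and upper points decouples into two ordinary least-squares
    problems: one for the centers, one for the spreads.
    """
    A = np.column_stack([np.ones_like(x), x])
    coef_c, *_ = np.linalg.lstsq(A, centers, rcond=None)
    coef_s, *_ = np.linalg.lstsq(A, spreads, rcond=None)
    return coef_c, coef_s

def predict(coef_c, coef_s, x):
    """Return the (lower, center, upper) triangular fuzzy prediction at x."""
    c = coef_c[0] + coef_c[1] * x
    s = np.abs(coef_s[0] + coef_s[1] * x)   # spreads must stay non-negative
    return c - s, c, c + s
```

Each predicted demand then carries its own uncertainty band, which is the property that lets a routing model reason about vague customer demands rather than a single crisp number.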
Procedia PDF Downloads 437
3856 GC-MS Analysis of Essential Oil From Satureja Hispidula: A Medicinal Plant from Algeria
Authors: Habiba Rechek, Ammar Haouat, Ratiba Mekkiou, Diana C. G. A. Pinto, Artur M. S. Silva
Abstract:
Satureja hispidula is an aromatic and medicinal plant belonging to the Lamiaceae family and native to Algeria, just like mint or thyme. Although it is less known to the general public than its more famous cousins, this species has many therapeutic properties that have been exploited for centuries in the traditional medicine of some regions. For generations, Satureja hispidula has been used in traditional medicine to treat various ailments, including respiratory diseases and diabetes. Its aroma, often described as close to that of mint, gives it a special interest in aromatherapy. Given the growing interest in the beneficial properties of plant-derived essential oils, the aim of this study is to analyze the chemical composition of S. hispidula essential oil by gas chromatography coupled with mass spectrometry (GC-MS). Identifying the main constituents of the essential oil will allow a better understanding of its chemical nature and an exploration of its potential for culinary and therapeutic applications. The study of the essential oil of S. hispidula reveals a composition of 83 compounds, with menthone, pulegone and piperitone as the main constituents. This GC-MS analysis provides valuable information about the chemical nature of this oil. However, more in-depth studies are needed to explore the potentially health-enhancing properties of this essential oil.
Keywords: satureja hispidula, GC-MS, essential oil, menthone, pulegone
Procedia PDF Downloads 33
3855 The Role of QX-314 and Capsaicin in Producing Long-Lasting Local Anesthesia in the Animal Model of Trigeminal Neuralgia
Authors: Ezzati Givi M., Ezzatigivi N., Eimani H.
Abstract:
Trigeminal neuralgia (TN) consists of painful attacks often triggered by everyday activities, which cause impairment and disability. The first line of treatment is pharmacotherapy; however, the occurrence of many side effects limits its application. Acute pain relief is crucial for titrating oral drugs and buying time for neurosurgical intervention. This study aimed to examine the long-term anesthetic effect of QX-314 and capsaicin in trigeminal neuralgia using an animal model. TN was simulated by surgical constriction of the infraorbital nerve in rats. After seven days, anesthesia infiltration was performed, and the duration of mechanical allodynia was compared. Thirty-five male Wistar rats were randomly divided into seven groups as follows: control (normal saline); lidocaine (2%); QX-314 (30 mM); lidocaine (2%) + QX-314 (15 mM); lidocaine (2%) + QX-314 (22 mM); lidocaine (2%) + QX-314 (30 mM); and lidocaine (2%) + QX-314 (30 mM) + capsaicin (1 μg). QX-314 in combination with lidocaine significantly increased the duration of anesthesia in a dose-dependent manner. The combination of lidocaine + QX-314 + capsaicin significantly increased the duration of anesthesia in trigeminal neuralgia. In the present study, we demonstrated that the combination of QX-314 with lidocaine and capsaicin produced long-lasting, reversible local anesthesia and was superior to lidocaine alone in the duration of trigeminal neuropathic pain blockade.
Keywords: trigeminal neuralgia, capsaicin, lidocaine, long-lasting
Procedia PDF Downloads 118
3854 Application of Residual Correction Method on Hyperbolic Thermoelastic Response of Hollow Spherical Medium in Rapid Transient Heat Conduction
Authors: Po-Jen Su, Huann-Ming Chou
Abstract:
In this article, we use the residual correction method to deal with transient thermoelastic problems in a hollow spherical region when the continuum medium possesses spherically isotropic thermoelastic properties. Based on linear thermoelastic theory, the equations of hyperbolic heat conduction and thermoelastic motion were combined to establish a thermoelastic dynamic model that accounts for the deformation acceleration effect and the non-Fourier effect under transient thermal shock. The approximate solutions for the temperature and displacement distributions are obtained using the residual correction method based on the maximum principle, in combination with the finite difference method, making it easier and faster to obtain upper and lower bounds on the exact solutions. The proposed method is found to be an effective numerical method with satisfactory accuracy. Moreover, the results show that the transient thermal shock induced by deformation acceleration is enhanced by non-Fourier heat conduction, with an increased peak stress. This influence on the stress grows with the thermal relaxation time.
Keywords: maximum principle, non-Fourier heat conduction, residual correction method, thermo-elastic response
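The hyperbolic (non-Fourier) conduction behavior underlying the model can be illustrated with a simple explicit finite-difference scheme for the Cattaneo-Vernotte equation, tau*T_tt + T_t = alpha*T_xx, in one Cartesian dimension under a sudden boundary thermal shock. The geometry, parameter values, and scheme are illustrative; the paper's spherical formulation and residual correction machinery are not reproduced here.

```python
import numpy as np

# 1D Cattaneo-Vernotte (non-Fourier) heat conduction:
#   tau * T_tt + T_t = alpha * T_xx
alpha, tau = 1.0, 0.5          # diffusivity, relaxation time (illustrative)
nx, L = 101, 1.0
dx = L / (nx - 1)
c = np.sqrt(alpha / tau)       # finite thermal wave speed
dt = 0.5 * dx / c              # CFL-limited time step

T_old = np.zeros(nx)           # temperature at time level n-1
T = np.zeros(nx)               # temperature at time level n
T[0] = T_old[0] = 1.0          # sudden thermal shock at the left face

a = tau / dt**2 + 1.0 / (2 * dt)   # coefficient of T^{n+1}
b = tau / dt**2 - 1.0 / (2 * dt)   # coefficient of T^{n-1}
nsteps = 60
for _ in range(nsteps):
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    # central differences in time for both T_tt and T_t, solved for T^{n+1}
    T_new = (alpha * lap + 2 * (tau / dt**2) * T - b * T_old) / a
    T_new[0], T_new[-1] = 1.0, 0.0     # fixed-temperature boundaries
    T_old, T = T, T_new
```

Unlike the parabolic Fourier case, the disturbance propagates as a damped wave with finite speed c = sqrt(alpha/tau): grid points ahead of the front remain undisturbed, which is the non-Fourier effect the abstract refers to.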
Procedia PDF Downloads 427
3853 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs
Authors: Lokesh Varshney, R. K. Saket
Abstract:
This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. Parallel capacitors with a calculated minimum capacitance across the terminals of the induction motor operating as a SEIG with unregulated shaft speed were connected during the experimental study. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine operating as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation
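The reliability functions named above are related in a standard way: F(t) = 1 - R(t) and h(t) = f(t)/R(t). The sketch below writes them down for a Weibull failure-time model and cross-checks the survivor function against Monte Carlo sampling; the shape and scale parameters are illustrative, not fitted to the authors' machine data.

```python
import numpy as np

# Weibull failure-time model (illustrative parameters):
beta, eta = 2.0, 1000.0   # shape, scale (e.g., hours)

def failure_density(t):       # f(t): failure density function
    return (beta / eta) * (t / eta) ** (beta - 1) * np.exp(-(t / eta) ** beta)

def cumulative_failure(t):    # F(t): probability of failure by time t
    return 1.0 - np.exp(-(t / eta) ** beta)

def survivor(t):              # R(t) = 1 - F(t): probability of success at time t
    return np.exp(-(t / eta) ** beta)

def hazard(t):                # h(t) = f(t) / R(t): instantaneous failure rate
    return (beta / eta) * (t / eta) ** (beta - 1)

# Monte Carlo check: empirical survival fraction from sampled failure times
rng = np.random.default_rng(7)
samples = eta * rng.weibull(beta, size=100_000)
t = 800.0
empirical_R = np.mean(samples > t)
print(f"R({t:.0f}) analytic = {survivor(t):.4f}, Monte Carlo = {empirical_R:.4f}")
```

With shape beta > 1 the hazard rises with time (wear-out behavior), which is the kind of curve the graphical evaluation in the paper would exhibit for an aging machine.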
Procedia PDF Downloads 561