Search results for: fast Fourier algorithms
1872 High Pressure Multiphase Flow Experiments: The Impact of Pressure on Flow Patterns Using an X-Ray Tomography Visualisation System
Authors: Sandy Black, Calum McLaughlin, Alessandro Pranzitelli, Marc Laing
Abstract:
Multiphase flow structures of two-phase multicomponent fluids were experimentally investigated in a large diameter high-pressure pipeline at up to 130 bar at TÜV SÜD’s National Engineering Laboratory Advanced Multiphase Facility. One of the main objectives of the experimental test campaign was to evaluate the impact of pressure on multiphase flow patterns, as much of the existing information is based on low-pressure measurements. The experiments were performed in horizontal and vertical orientations in both 4-inch and 6-inch pipework using nitrogen, Exxsol™ D140 oil, and a 6% aqueous solution of NaCl at incremental pressures from 10 bar to 130 bar. To visualise the detailed structure of the flow over the entire cross-section of the pipe, a fast response X-ray tomography system was used. A wide range of superficial velocities, from 0.6 m/s to 24.0 m/s for gas and from 0.04 m/s to 6.48 m/s for liquid, was examined to evaluate different flow regimes. The results illustrated the suppression of instabilities between the gas and the liquid at the measurement location, and intermittent or slug flow was observed less frequently as the pressure was increased. CFD simulations at low and high pressure were able to successfully predict the likelihood of intermittent flow; however, further tuning is necessary to predict the slugging frequency. The dataset generated is unique, as limited datasets exist above 100 bar, and is of considerable value to multiphase flow specialists and numerical modellers.
Keywords: computational fluid dynamics, high pressure, multiphase, X-ray tomography
Procedia PDF Downloads 149
1871 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) quantitative evaluation using, e.g., ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured using the device for which the algorithm is targeted, for example a mobile phone, and to disseminate the data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on a Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
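As a rough illustration of the evaluation metric named above, ATE can be computed as the RMSE of the per-pose translational error between time-associated ground-truth and estimated positions. The sketch below is a minimal version that assumes the trajectories are already associated and expressed in the same frame (the usual rigid-body alignment step is omitted); all names are illustrative, not from the paper's software.

```python
import math

def absolute_trajectory_error(gt, est):
    """RMSE of translational error between time-associated ground-truth
    and estimated positions (assumes both are in the same frame)."""
    assert len(gt) == len(est)
    sq = [sum((g - e) ** 2 for g, e in zip(p, q)) for p, q in zip(gt, est)]
    return math.sqrt(sum(sq) / len(sq))

gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.1, 0.0), (1.0, -0.1, 0.0), (2.0, 0.1, 0.0)]
print(round(absolute_trajectory_error(gt, est), 3))  # 0.1
```

In practice a tool such as evo or the TUM benchmark scripts would perform the association and alignment before this RMSE step.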
Procedia PDF Downloads 81
1870 Chemical Composition and Characteristics of Organic Solvent Extracts from the Omani Seaweeds Melanothamnus Somalensis and Gelidium Omanense
Authors: Abdullah Al-Nassri, Ahmed Al-Alawi
Abstract:
Seaweeds are classified into three groups: red, green, and brown. Each group consists of several types that differ in composition. Even at the species level, there are differences in some constituents, although the general composition is the same. Environmental conditions, availability of nutrients, and maturity stage are the main reasons for these composition differences. In this study, two red seaweed species, Melanothamnus somalensis and Gelidium omanense, were collected in September 2021 from Sadh (Dhofar governorate, Oman). Five organic solvents were used sequentially to achieve extraction, applied in the following order: hexane, dichloromethane, ethyl acetate, acetone, and methanol. Preparative HPLC (PrepLC) was performed to fractionate the extracts. The chemical composition was measured; total phenols, flavonoids, and tannins were also investigated. The structure of the extracts was analyzed by Fourier-transform infrared spectroscopy (FTIR). The seaweeds showed marked differences in chemical composition, total phenolic content (TPC), total flavonoid content (TFC), and total tannin content (TTC). Gelidium omanense showed higher moisture, lipid, and carbohydrate content (9.8 ± 0.15 %, 2.29 ± 0.09 %, and 70.15 ± 0.42 %, respectively) than Melanothamnus somalensis (6.85 ± 0.01 %, 2.05 ± 0.12 %, and 52.7 ± 0.36 %, respectively). However, Melanothamnus somalensis showed higher ash and protein content (27.68 ± 0.40 % and 52.7 ± 0.36 %, respectively) than Gelidium omanense (8.07 ± 0.39 % and 9.70 ± 0.22 %, respectively). Melanothamnus somalensis also showed higher element and mineral content, especially sodium and potassium. This is attributed to the jelly-like structure of Melanothamnus somalensis, which allows storage of more solutes than the leafy-like structure of Gelidium omanense. Furthermore, Melanothamnus somalensis had higher TPC than Gelidium omanense in all fractions except the hexane fraction. Except with hexane, TFC in the other solvents’ extracts was significantly different between Gelidium omanense and Melanothamnus somalensis. In all fractions except the dichloromethane and ethyl acetate fractions, there were no significant differences in TTC between Gelidium omanense and Melanothamnus somalensis. FTIR spectra showed variation between fractions, which is an indication of different functional groups.
Keywords: chemical composition, organic extract, Omani seaweeds, biological activity, FTIR
Procedia PDF Downloads 75
1869 Parallelizing the Hybrid Pseudo-Spectral Time Domain/Finite Difference Time Domain Algorithms for the Large-Scale Electromagnetic Simulations Using Message Passing Interface Library
Authors: Donggun Lee, Q-Han Park
Abstract:
Due to its coarse grid, the Pseudo-Spectral Time Domain (PSTD) method has advantages over the Finite Difference Time Domain (FDTD) method in terms of memory requirement and operation time. However, since its efficiency of parallelization is much lower than that of FDTD, PSTD is not a useful method for large-scale electromagnetic simulations on a parallel platform. In this paper, we propose a parallelization technique for the hybrid PSTD-FDTD (HPF) method, which combines the efficient parallelizability of FDTD with the speed and low memory requirement of PSTD. The parallelization cost of the HPF method is exactly the same as that of parallel FDTD, yet it occupies much less memory and runs faster. Experiments in distributed memory systems have shown that the parallel HPF method saves up to 96% of the operation time and reduces the memory requirement by 84%. Also, by combining the OpenMP library with the MPI library, we further reduced the operation time of the parallel HPF method by 50%.
Keywords: FDTD, hybrid, MPI, OpenMP, PSTD, parallelization
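For context on why the FDTD half of the hybrid scheme parallelizes so cheaply: each field update touches only neighbouring cells, so MPI domain decomposition needs to exchange just one boundary cell per step and per rank. A minimal serial 1D sketch in normalized units (Courant number 0.5, illustrative source; not the paper's code) shows the locality of the stencil:

```python
import math

def fdtd_1d(steps, n=200, src=100):
    """Minimal 1D FDTD loop (normalized units, Courant number 0.5).
    Each update only reads adjacent cells, so a distributed version
    would exchange a single boundary cell between ranks per step."""
    ez = [0.0] * n  # electric field
    hy = [0.0] * n  # magnetic field
    for t in range(steps):
        for i in range(n - 1):
            hy[i] += 0.5 * (ez[i + 1] - ez[i])
        for i in range(1, n):
            ez[i] += 0.5 * (hy[i] - hy[i - 1])
        ez[src] += math.exp(-((t - 30) ** 2) / 100.0)  # soft Gaussian source
    return ez

fields = fdtd_1d(100)
print(len(fields))  # 200
```

The PSTD half, by contrast, evaluates spatial derivatives with global FFTs, which is what degrades its parallel efficiency and motivates the hybrid split.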
Procedia PDF Downloads 151
1868 Ethical Considerations of Disagreements Between Clinicians and Artificial Intelligence Recommendations: A Scoping Review
Authors: Adiba Matin, Daniel Cabrera, Javiera Bellolio, Jasmine Stewart, Dana Gerberi (librarian), Nathan Cummins, Fernanda Bellolio
Abstract:
OBJECTIVES: Artificial intelligence (AI) tools are becoming more prevalent in healthcare settings, particularly for diagnostic and therapeutic recommendations, with an expected surge in the coming years. The bedside use of this technology opens the possibility of disagreements between the recommendations of AI algorithms and clinicians’ judgment. There is a paucity of literature analyzing the nature and possible outcomes of these potential conflicts, particularly related to ethical considerations. The goal of this scoping review is to identify, analyze, and classify current themes and potential strategies addressing ethical conflicts originating from disagreement between AI and human recommendations. METHODS: A protocol was written prior to the initiation of the study. Relevant literature was searched by a medical librarian for the terms artificial intelligence, healthcare, and liability, ethics, or conflict. The search was run in 2021 in Ovid Cochrane Central Register of Controlled Trials, Embase, Medline, IEEE Xplore, Scopus, and Web of Science Core Collection. Articles describing the role of AI in healthcare that mentioned conflict between humans and AI were included in the primary search. Two investigators, working independently and in duplicate, screened titles and abstracts and reviewed the full text of potentially eligible studies. Data were abstracted into tables and reported by themes. We followed the methodological guidelines of the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). RESULTS: Of 6846 titles and abstracts, 225 full texts were selected, and 48 articles were included in this review. 23 articles were included as original research and review papers, and 25 as editorials and commentaries with similar themes. There was a lack of consensus in the included articles on who would be held liable for mistakes incurred by following AI recommendations. There appears to be a dichotomy in the perceived ethical consequences depending on whether the negative outcome results from a human-versus-AI conflict or from a deviation from the standard of care. Themes identified included transparency versus opacity of recommendations, data bias, liability for outcomes, regulatory frameworks, and the overall scope of artificial intelligence in healthcare. A relevant issue identified was clinicians’ concern about the “black box” nature of these recommendations and their ability to judge the appropriateness of AI guidance. CONCLUSION: AI clinical tools are being rapidly developed and adopted, and the use of this technology will create conflicts between AI algorithms and healthcare workers, with various outcomes. In turn, these conflicts may have legal and ethical implications. There is limited consensus about ethical and legal liability for outcomes originating from these disagreements. This scoping review identified the importance of framing the problem in terms of whether the conflict involves a deviation from the standard of care, informed by the themes of transparency/opacity, data bias, legal liability, absent regulatory frameworks, and understanding of the technology. Finally, limited recommendations to mitigate ethical conflicts between AI and humans have been identified. Further work is necessary in this field.
Keywords: ethics, artificial intelligence, emergency medicine, review
Procedia PDF Downloads 100
1867 Evaluation of Microwave-Assisted Pretreatment for Spent Coffee Grounds
Authors: Shady S. Hassan, Brijesh K. Tiwari, Gwilym A. Williams, Amit K. Jaiswal
Abstract:
Waste materials from a wide range of agro-industrial processes may be used as substrates for microbial growth and, subsequently, the production of a range of high value products and bioenergy. In addition, the utilization of these agro-residues in bioprocesses has the dual advantage of providing alternative substrates and solving their disposal problems. Spent coffee grounds (SCG) are a by-product (45%) of coffee processing. SCG is a lignocellulosic material composed mainly of cellulose, hemicelluloses, and lignin; a pretreatment process is therefore required to facilitate efficient enzymatic hydrolysis of such carbohydrates. In this context, microwave pretreatment of lignocellulosic biomass without the addition of harsh chemicals represents a green technology; moreover, microwave treatment has a high heating efficiency and is easy to implement. Thus, microwave pretreatment of SCG without the addition of harsh chemicals was investigated as a green way to enhance enzymatic hydrolysis. In the present work, microwave pretreatment experiments were conducted on SCG at varying power levels (100, 250, 440, 600, and 1000 W) for 60 s. As microwave power increases up to a certain level (which varies with the biomass), the reducing sugar yield increases; beyond this level, it starts to decrease. Microwave pretreatment of SCG for 60 s followed by enzymatic hydrolysis resulted in total reducing sugars of 91.6 ± 7.0 mg/g of biomass (at a microwave power of 100 W). Fourier transform infrared spectroscopy (FTIR) was employed to investigate changes in the functional groups of the biomass after pretreatment, while high-performance liquid chromatography (HPLC) was employed for the determination of glucose. Pretreatment of lignocellulose using microwaves was found to be an effective and energy-efficient technology to improve saccharification and glucose yield. Energy performance will be evaluated for the microwave pretreatment, and the enzyme hydrolysate will be used as a media component substitute for the production of ethanol and other high value products.
Keywords: lignocellulose, microwave, pretreatment, spent coffee grounds
Procedia PDF Downloads 422
1866 2D Hexagonal Cellular Automata: The Complexity of Forms
Authors: Vural Erdogan
Abstract:
We created two-dimensional hexagonal cellular automata to obtain complexity using simple rules similar to those of Conway’s Game of Life. Considering the Game of Life rules, Wolfram’s work on life-like structures, and John von Neumann’s self-replication, self-maintenance, and self-reproduction problems, we developed 2-state and 3-state hexagonal growing algorithms that reach large populations from random initial states. Unlike the Game of Life, we used a six-neighbourhood cellular automaton instead of eight or four neighbourhoods. First simulations examined whether we were able to obtain oscillators, blinkers, and gliders. Inspired by Wolfram’s 1D cellular automata complexity and life-like structures, we simulated 2D synchronous, discrete, deterministic cellular automata to reach life-like forms with 2-state cells. We explain how the life-like formations and the oscillators contribute to initiating self-maintenance together with self-reproduction and self-replication. After comparing simulation results, we developed the algorithm a step further: appending a new state to the same algorithm that we used for reaching life-like structures led us to experiment with new branching and fractal forms. All these studies tried to demonstrate that complex life forms might come from uncomplicated rules.
Keywords: hexagonal cellular automata, self-replication, self-reproduction, self-maintenance
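A synchronous 2-state hexagonal CA of the kind described can be sketched in a few lines with axial coordinates, where each cell has exactly six neighbours. The birth/survival counts below are illustrative placeholders, not the paper's rules:

```python
from collections import Counter

def hex_neighbours(q, r):
    """Six neighbours of a hex cell in axial coordinates."""
    return [(q + 1, r), (q - 1, r), (q, r + 1),
            (q, r - 1), (q + 1, r - 1), (q - 1, r + 1)]

def step(alive, born={2}, survive={2, 3}):
    """One synchronous update of a 2-state hexagonal CA.
    alive: set of live cells. Birth/survival thresholds are
    illustrative, not the rules used in the study."""
    counts = Counter(n for c in alive for n in hex_neighbours(*c))
    return {c for c, k in counts.items()
            if (c in alive and k in survive) or (c not in alive and k in born)}

cells = {(0, 0), (1, 0), (0, 1)}
cells = step(cells)
print(len(cells))  # 6: the triangle survives and sprouts three new cells
```

Counting only cells adjacent to at least one live cell (via the `Counter`) keeps each step proportional to the population size, which matters when growth reaches large populations from random initial states.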
Procedia PDF Downloads 158
1865 Formulation and Evaluation of Mouth Dissolving Tablet of Ketorolac Tromethamine by Using Natural Superdisintegrants
Authors: J. P. Lavande, A. V. Chandewar
Abstract:
The mouth dissolving tablet is a rapidly growing and highly accepted drug delivery system. This study was aimed at the development of a Ketorolac Tromethamine mouth dissolving tablet (MDT), which can disintegrate or dissolve rapidly once placed in the mouth. Conventional Ketorolac Tromethamine tablets require water for swallowing and have limitations such as a low disintegration rate and low solubility. The Ketorolac Tromethamine mouth dissolving tablet formulations consist of superdisintegrants, namely heat-modified karaya gum and co-treated heat-modified agar, with microcrystalline cellulose (MCC) as filler. The tablets were evaluated for weight variation, friability, hardness, in vitro disintegration time, wetting time, in vitro drug release profile, and content uniformity. The results showed low weight variation, good hardness, acceptable friability, and fast wetting times. Tablets in all batches disintegrated within 15-50 sec. The formulations containing the superdisintegrants heat-modified karaya gum and heat-modified agar showed better performance in disintegration and drug release profile.
Keywords: mouth dissolving tablet, Ketorolac tromethamine, disintegration time, heat modified karaya gum, co-treated heat modified agar
Procedia PDF Downloads 287
1864 The Response of 4-Hydroxybenzoic Acid on Kv1.4 Potassium Channel Subunit Expressed in Xenopus laevis Oocytes
Authors: Fatin H. Mohamad, Jia H. Wong, Muhammad Bilal, Abdul A. Mohamed Yusoff, Jafri M. Abdullah, Jingli Zhang
Abstract:
Kv1.4 is a Shaker-related member of the voltage-gated potassium channel family that is associated with the cardiac action potential but can also be found in the Schaffer collaterals and dentate gyrus. It has two inactivation mechanisms: the fast N-type and the slow C-type. Kv1.4 produces rapid current inactivation, and this A-type current makes Kv1.4 a target in antiepileptic drug (AED) selection. In this study, 4-hydroxybenzoic acid, which can be found naturally in bamboo shoots, was tested for its enhancement of the potassium current of the Kv1.4 channel expressed in Xenopus laevis oocytes using the two-microelectrode voltage clamp method. Currents obtained were recorded and analyzed with pClamp software, and statistical analysis was done with Student's t-test. The ratio of final to peak amplitude is an index of the activity of the Kv1.4 channel: the lower the ratio, the greater the function of Kv1.4. The decrease in this ratio with 1 µM 4-hydroxybenzoic acid (n = 7), compared with 0.1% DMSO (vehicle), was a mean of 47.62% (SE = 13.76%, P = 0.026, statistically significant), indicating more opening of Kv1.4 channels under 4-hydroxybenzoic acid. In conclusion, 4-hydroxybenzoic acid can enhance the function of Kv1.4 potassium channels, which is regarded as one of the mechanisms of antiepileptic treatment.
Keywords: antiepileptic, Kv1.4 potassium channel, two-microelectrode voltage clamp, Xenopus laevis oocytes, 4-hydroxybenzoic acid
Procedia PDF Downloads 364
1863 A Polynomial Time Clustering Algorithm for Solving the Assignment Problem in the Vehicle Routing Problem
Authors: Lydia Wahid, Mona F. Ahmed, Nevin Darwish
Abstract:
The vehicle routing problem (VRP) consists of a group of customers that needs to be served. Each customer has a certain demand of goods. A central depot having a fleet of vehicles is responsible for supplying the customers with their demands. The problem is composed of two subproblems: The first subproblem is an assignment problem where the number of vehicles that will be used as well as the customers assigned to each vehicle are determined. The second subproblem is the routing problem in which for each vehicle having a number of customers assigned to it, the order of visits of the customers is determined. Optimal number of vehicles, as well as optimal total distance, should be achieved. In this paper, an approach for solving the first subproblem (the assignment problem) is presented. In the approach, a clustering algorithm is proposed for finding the optimal number of vehicles by grouping the customers into clusters where each cluster is visited by one vehicle. Finding the optimal number of clusters is NP-hard. This work presents a polynomial time clustering algorithm for finding the optimal number of clusters and solving the assignment problem.
Keywords: vehicle routing problems, clustering algorithms, Clarke and Wright Saving Method, agglomerative hierarchical clustering
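The general idea of the assignment step can be illustrated with a toy agglomerative scheme (a sketch of the concept, not the paper's algorithm): greedily merge the two nearest clusters whose combined demand still fits one vehicle; the clusters that remain when no feasible merge exists define the number of vehicles. All names are illustrative.

```python
import math

def cluster_customers(customers, demands, capacity):
    """Greedy capacity-constrained agglomerative grouping.
    customers: list of (x, y) positions; demands: parallel list of loads.
    Returns a list of clusters (each a list of customer indices)."""
    clusters = [[i] for i in range(len(customers))]

    def dist(a, b):
        # Euclidean distance between cluster centroids
        ca = [sum(customers[i][k] for i in a) / len(a) for k in (0, 1)]
        cb = [sum(customers[i][k] for i in b) / len(b) for k in (0, 1)]
        return math.hypot(ca[0] - cb[0], ca[1] - cb[1])

    while True:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # only consider merges that respect vehicle capacity
                if sum(demands[k] for k in clusters[i] + clusters[j]) <= capacity:
                    d = dist(clusters[i], clusters[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
        if best is None:       # no feasible merge left
            return clusters
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]

pts = [(0, 0), (0, 1), (10, 10), (10, 11)]
print(len(cluster_customers(pts, [1, 1, 1, 1], 2)))  # 2 vehicles
```

Each cluster would then be handed to the routing subproblem (e.g. ordering the visits within the cluster); the nested pairwise scan keeps the whole procedure polynomial in the number of customers.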
Procedia PDF Downloads 397
1862 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison
Authors: Saugata Bose, Ritambhra Korpal
Abstract:
The internet has increased copy-paste scenarios amongst students as well as researchers, leading to different levels of plagiarized documents. For this reason, much research has focused on detecting plagiarism automatically. In this paper, an initiative is discussed where Natural Language Processing (NLP) techniques and supervised machine learning algorithms have been combined to detect plagiarized texts. The major emphasis is on constructing a framework which successfully detects external plagiarism in monolingual texts. To detect the plagiarism, an n-gram frequency comparison approach has been implemented to construct the model framework. The framework is based on 120 characteristics which were extracted while pre-processing the documents using the NLP approach. Afterwards, filter metrics were applied to select the most relevant characteristics, and then a supervised classification learning algorithm was used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score.
Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram
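The core of an n-gram frequency comparison is simple: slide a window of n words over the suspicious document and count how many of its n-grams also occur in the source. The sketch below shows one such overlap feature of the kind the 120-characteristic set could build on; it is illustrative, not the paper's exact metric.

```python
def ngrams(text, n=3):
    """Word n-grams of a document, lowercased."""
    words = text.lower().split()
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

def overlap_score(doc, source, n=3):
    """Fraction of the suspicious document's word n-grams that also
    occur in the source document (a simple plagiarism feature)."""
    d, s = ngrams(doc, n), set(ngrams(source, n))
    if not d:
        return 0.0
    return sum(1 for g in d if g in s) / len(d)

src = "the quick brown fox jumps over the lazy dog"
sus = "the quick brown fox sleeps all day"
print(round(overlap_score(sus, src), 2))  # 0.4
```

Thresholding scores like this per passage is one way such a framework could grade documents into discrete levels of plagiarism before the supervised classifier refines the decision.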
Procedia PDF Downloads 360
1861 Antibacterial and Antioxidant Properties of Total Phenolics from Waste Orange Peels
Authors: Kanika Kalra, Harmeet Kaur, Dinesh Goyal
Abstract:
Total phenolics were extracted from waste orange peels by solvent extraction and an alkali hydrolysis method. The most efficient solvents for extracting phenolic compounds from the waste biomass were methanol (60%) > dimethyl sulfoxide > ethanol (60%) > distilled water. The extraction yields were significantly impacted by the solvents (ethanol, methanol, and dimethyl sulfoxide) due to their varying polarity and concentrations. Extraction using 60% methanol yielded the highest phenolics (in terms of gallic acid equivalent (GAE) per gram of biomass) in orange peels. The alkali hydrolyzed extract from orange peels contained 7.58±0.33 mg GAE g⁻¹. Using the solvent extraction technique, it was observed that 60% methanol is comparatively the best-suited solvent for extracting polyphenolic compounds, giving the maximum yield of 4.68 ± 0.47 mg GAE g⁻¹ in orange peel extracts. DPPH radical scavenging activity and reducing power of the orange peel extracts were assessed: the 60% methanolic extract showed the highest antioxidant activity (85.50±0.009% for DPPH), and the dimethyl sulfoxide (DMSO) extract gave the highest reducing power (1.75±0.01%). Characterization of the polyphenolic compounds was done using Fourier-transform infrared (FTIR) spectroscopy. Solvent and alkali hydrolysed extracts were evaluated for antibacterial activity using the agar well diffusion method against Gram-positive Bacillus subtilis MTCC441 and Gram-negative Escherichia coli MTCC729. The methanolic extract at a 300 µl concentration showed an inhibition zone of around 16.33±0.47 mm against Bacillus subtilis, whereas for Escherichia coli it was comparatively smaller. A broth-based turbidimetric assay revealed the antibacterial effect of different volumes of orange peel extracts against both organisms.
Keywords: orange peels, total phenolic content, antioxidant, antibacterial
Procedia PDF Downloads 77
1860 Effect of Citric Acid on Hydrogen-Bond Interactions and Tensile Retention Properties of Citric Acid Modified Thermoplastic Starch Biocomposites
Authors: Da-Wei Wang, Liang Yang, Xuan-Long Peng, Mei-Chuan Kuo, Jen-Taut Yeh
Abstract:
The tensile retention and waterproof properties of thermoplastic starch (TPS) resins were significantly enhanced by modifying with proper amounts of citric acid (CA) and by melt-blending with poly(lactic acid) (PLA), although no distinguished chemical reaction occurred between CA and starch molecules. As evidenced by Fourier transform infrared spectroscopy and Solid-state 13C Nuclear Magnetic Resonance analyses, disruption of intra and interhydrogen-bondings within starch molecules did occur during the modification processes of CA modified TPS (i.e. TPS100CAx) specimens. The tensile strength (σf) retention values of TPS specimens reduced rapidly from 27.8 to 20.5 and 0.4 MPa, respectively, as the conditioning time at 20°C/50% relative humidity (RH) increased from 0 to 7 and 70 days, respectively. While the elongation at break (εf) retention values of TPS specimens increased rapidly from 5.9 to 6.5 and 34.8%, respectively, as the conditioning time increased from 0 to 7 and 70 days. After conditioning at 20°C/50% RH for 70 days, the σf and εf retention values of the best prepared (TPS100CA0.1)30PLA70 specimen are equivalent to 85% and 167% of its initial σf and εf values, respectively, and are more than 105 times higher but 48% lower than those of TPS specimens conditioned at 20°C/50% RH for the same amount of time. Demarcated diffraction peaks, new melting endotherms of recrystallized starch crystals and distinguished ductile characteristics with drawn debris were found for many conditioned TPS specimens, however, only slight retrogradation effect and much less drawn debris was found for most conditioned TPS100CAx and/or (TPS100CA0.1)xPLAy specimens. 
The significantly improved waterproof and tensile retention properties, and the relatively unchanged retrogradation effect, found for most conditioned TPS100CAx and/or (TPS100CA0.1)xPLAy specimens are apparently due to the efficient blocking of the moisture-absorbing hydroxyl groups (free or hydrogen bonded) by hydrogen-bonding of CA with starch molecules during their modification processes.
Keywords: thermoplastic starch, hydrogen-bonding, water proof, strength retention
Procedia PDF Downloads 308
1859 A Construction Scheduling Model by Applying Pedestrian and Vehicle Simulation
Authors: Akhmad F. K. Khitam, Yi Tai, Hsin-Yun Lee
Abstract:
In modern research on construction management, the goals of scheduling are not only to finish the project within the limited duration but also to reduce the impact on people and the environment. In particular, the considerable social cost of the impact on pedestrians and vehicles should be estimated in the total performance of a construction project. However, the site environment differs greatly between projects, and these interactions affect the requirements and goals of scheduling; it is difficult for schedule planners to quantify them. Therefore, this study uses 3D dynamic simulation technology to plan the schedules of construction engineering projects that affect current space users (i.e., pedestrians and vehicles). The proposed model can help the project manager find the optimal schedule to minimize the inconvenience brought to the space users. A roadwork project and a building renovation project were analyzed to reflect practical engineering and operational situations. The study then integrates appropriate optimization algorithms and computer technology to establish a decision support model that can generate a near-optimal schedule solution for project planners.
Keywords: scheduling, simulation, optimization, pedestrian and vehicle behavior
Procedia PDF Downloads 144
1858 The Impact of Public Charging Infrastructure on the Adoption of Electric Vehicles
Authors: Shaherah Jordan, Paula Vandergert
Abstract:
The discussion on public charging infrastructure is usually framed around the ‘chicken-egg’ challenge of consumers being reluctant to purchase without the necessary infrastructure and policymakers being reluctant to invest in the infrastructure without the demand. However, public charging infrastructure may be more crucial to electric vehicle (EV) adoption than previously thought. Historically, access to residential charging was thought to be a major factor in the potential for growth of the EV market, as it offered a guaranteed place for a vehicle to be charged. The purpose of this study is to understand how the built environment may encourage the uptake of EVs by seeking a correlation between EV ownership and public charging points in an urban and densely populated city such as London. Using a statistical approach with data from the Department for Transport and Zap-Map, a statistically significant correlation was found between the total number of public charging points (slow, fast, and rapid) and the number of EV registrations per borough, with the strongest correlation found between EV registrations and rapid chargers. This research does not explicitly prove a cause-and-effect relationship between public charging points and EVs, but it challenges some of the previous literature indicating that public charging infrastructure is not as important as home charging. Furthermore, the study provides strong evidence that public charging points play a functional and psychological role in the adoption of EVs and supports the notion that the built environment can influence human behaviour.
Keywords: behaviour change, electric vehicles, public charging infrastructure, transportation
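The per-borough correlation analysis described is, at its core, a Pearson coefficient over two count series. The sketch below reproduces that calculation with hypothetical toy numbers, not the study's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

charging_points = [10, 25, 40, 55, 80]        # hypothetical per-borough counts
ev_registrations = [120, 260, 410, 500, 790]  # hypothetical registrations
print(round(pearson(charging_points, ev_registrations), 3))
```

A significance test on the coefficient (e.g. a t-test on r with n-2 degrees of freedom) would then establish whether the observed association is statistically significant, as the study reports.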
Procedia PDF Downloads 218
1857 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes
Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar
Abstract:
Most existing skyline query algorithms have focused on querying static points in static databases; however, with the expanding number of sensors, wireless communications, and mobile applications, the demand for continuous skyline queries has increased. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. Since skyline query computation is based on checking the domination of skyline points over all dimensions, both the static and dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points which will not be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance, and efficiency of our algorithm over other existing approaches.
Keywords: continuous query processing, dynamic database, moving object, skyline queries
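The domination check at the heart of any skyline computation treats all dimensions uniformly, which is exactly what lets static and dynamic attributes share one test. A minimal baseline sketch (illustrative, not the paper's incremental algorithm, which avoids this quadratic scan):

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly
    better in at least one (here, smaller values are better)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Points not dominated by any other point; static and dynamic
    attributes are simply extra tuple dimensions."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 9), (3, 3), (4, 4), (5, 1), (6, 6)]
print(skyline(pts))  # [(1, 9), (3, 3), (5, 1)]
```

A continuous variant re-evaluates `dominates` only for points whose dynamic dimensions changed, updating the previous result rather than recomputing it from scratch, which is the pruning-plus-update idea the abstract describes.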
Procedia PDF Downloads 213
1856 Palyno-Morphological Characteristics of Gymnosperm Flora of Pakistan and Its Taxonomic Implications with Light Microscope and Scanning Electron Microscopy Methods
Authors: Raees Khan, Sheikh Z. Ul Abidin, Abdul S. Mumtaz, Jie Liu
Abstract:
The present study assesses the gymnosperm pollen flora of Pakistan using Light Microscopy (LM) and Scanning Electron Microscopy (SEM) for its taxonomic significance in the identification of gymnosperms. Pollen of 35 gymnosperm species (12 genera and five families) was collected from various distributional sites of gymnosperms in Pakistan. LM and SEM were used to investigate different palyno-morphological characteristics. Five pollen types (i.e., inaperturate, monolete, monoporate, vesiculate-bisaccate, and polyplicate) were observed. In equatorial view, seven pollen shapes were observed: ten species were sub-angular, nine triangular, six perprolate, three rhomboidal, three semi-angular, two rectangular, and two prolate. In polar view, five shapes were observed: ten species were spheroidal, nine angular, eight interlobate, six circular, and two elliptic. Eighteen species have rugulate and 17 species have faveolate ornamentation; eighteen species have verrucate and 17 have gemmate-type sculpturing. The data were analysed through cluster analysis. The study showed that these palyno-morphological features have significant value in the classification and identification of gymnosperms. Based on these features, a taxonomic key was proposed for the accurate and fast identification of gymnosperms of Pakistan.
Keywords: gymnosperms, palynology, Pakistan, taxonomy
Procedia PDF Downloads 223
1855 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data
Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone
Abstract:
The aim of this work was to evaluate how well the Random Forest (RF) and Support Vector Machine (SVM) algorithms can support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we explored their ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. Estimated total lesion load (ml) and number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR images. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. WM and CSF signals together with 6 motion parameters were regressed out from the time series. We applied independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with 21 components. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language.
The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM to obtain a rank of the most predictive variables. We then built two new classifiers using only the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable in both cases was the sensori-motor network I. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient was found to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum, and working memory networks. These findings, in accordance with the early manifestation of motor/sensorial deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine
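The workflow above (train on all features, rank features, retrain on the top feature) can be sketched in miniature. The original analysis used R with RF and RBF-SVM; the pure-Python stand-in below replaces them with a nearest-centroid classifier and a crude between-class separation score, so only the pipeline shape, not the actual models, is reproduced, and the tiny dataset is synthetic.

```python
def nearest_centroid_fit(X, y):
    """Per-class mean vectors: a simplistic stand-in for RF/SVM."""
    by_class = {}
    for row, label in zip(X, y):
        by_class.setdefault(label, []).append(row)
    return {c: [sum(col) / len(col) for col in zip(*rows)]
            for c, rows in by_class.items()}

def feature_ranking(X, y):
    """Rank features by between-class centroid separation (a crude
    stand-in for RF Gini importance or SVM-RFE)."""
    model = nearest_centroid_fit(X, y)
    c0, c1 = model.values()
    scores = [abs(a - b) for a, b in zip(c0, c1)]
    return sorted(range(len(scores)), key=lambda i: -scores[i])

# Synthetic dataset: feature 0 is discriminative, feature 1 is noise.
X = [[0.1, 5.0], [0.2, 4.8], [0.9, 5.1], [1.0, 4.9]]
y = [0, 0, 1, 1]
ranking = feature_ranking(X, y)
print(ranking[0])  # → 0 (the discriminative feature ranks first)
```

In the study, the analogous ranking step singled out the sensori-motor network I, on which the reduced classifiers were retrained.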
Procedia PDF Downloads 241
1854 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison
Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo
Abstract:
This work belongs to a research line of computer science involving the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, for use in the control of electronic devices. Affective computing research, in turn, applies human emotions to the HCI process, helping to reduce user frustration. This paper presents the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware comprises the sensing stage and analog-to-digital conversion. The interface software comprises algorithms for pre-processing of the signal, time and frequency analysis, and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal were tested separately, using a publicly accessible database, along with a comparison among classifiers to determine the best-performing one.
Keywords: affective computing, interface, brain, intelligent interaction
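The frequency-analysis stage can be illustrated with a plain DFT and a band-power feature, a common first step in EEG-based classification. This is a generic sketch, not the paper's implementation: the sampling rate, band edges, and test tones below are assumptions.

```python
import cmath
import math

def dft(signal):
    """Plain O(n^2) discrete Fourier transform (an FFT would be used in practice)."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def band_power(signal, fs, lo, hi):
    """Sum of squared spectral magnitudes over the [lo, hi] Hz band."""
    spectrum = dft(signal)
    n = len(signal)
    return sum(abs(spectrum[k]) ** 2 for k in range(1, n // 2)
               if lo <= k * fs / n <= hi)

fs = 128  # Hz, assumed sampling rate
t = [i / fs for i in range(fs)]                       # one second of data
alpha = [math.sin(2 * math.pi * 10 * x) for x in t]   # 10 Hz "alpha" tone
beta = [math.sin(2 * math.pi * 20 * x) for x in t]    # 20 Hz tone
print(band_power(alpha, fs, 8, 13) > band_power(beta, fs, 8, 13))  # → True
```

Band powers like these become the feature vectors that the compared classifiers consume.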
Procedia PDF Downloads 393
1853 A Monocular Measurement for 3D Objects Based on Distance Area Number and New Minimize Projection Error Optimization Algorithms
Authors: Feixiang Zhao, Shuangcheng Jia, Qian Li
Abstract:
High-precision measurement of a target's position and size is one of the hotspots in the field of vision inspection. This paper proposes a three-dimensional object positioning and measurement method using a monocular camera and GPS, namely Distance Area Number-New Minimize Projection Error (DAN-NMPE). Our algorithm contains two parts: DAN, a picture-sequence algorithm, and NMPE, a projection-error minimization algorithm; together they greatly improve the measurement accuracy of the target's position and size. Comprehensive experiments validate the effectiveness of the proposed method on a self-made traffic sign dataset. The results show that, with a laser point cloud as the ground truth, the size and position errors of the traffic signs measured by this method are ±5% and 0.48±0.3 m, respectively. In addition, we compared it with the current mainstream method that uses a monocular camera to locate and measure traffic signs. DAN-NMPE attains significant improvements over existing state-of-the-art methods, improving the measurement accuracy of size and position by 50% and 15.8%, respectively.
Keywords: monocular camera, GPS, positioning, measurement
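The projection-error-minimization principle can be illustrated with a pinhole camera model: the imaged width of an object of real width s at distance Z under focal length f (in pixels) is f·s/Z, and the size estimate is the candidate minimizing the squared reprojection error. DAN-NMPE's internals are not reproduced here; the focal length, distance, observed width, and candidate sign sizes below are invented for illustration.

```python
def projection_error(real_size, Z, f, observed_px):
    """Squared difference between predicted and observed pixel width."""
    return (f * real_size / Z - observed_px) ** 2

def estimate_size(Z, f, observed_px, candidates):
    """Pick the candidate size minimizing the reprojection error."""
    return min(candidates, key=lambda s: projection_error(s, Z, f, observed_px))

f = 800.0           # focal length in pixels (assumed)
Z = 20.0            # camera-to-sign distance in metres (assumed, e.g. from GPS)
observed_px = 36.0  # measured sign width in the image (assumed)
candidates = [0.6, 0.75, 0.9, 1.05]  # hypothetical standard sign widths, metres
print(estimate_size(Z, f, observed_px, candidates))  # → 0.9
```

A continuous optimizer over both size and pose, rather than a discrete candidate search, would be the realistic counterpart of this sketch.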
Procedia PDF Downloads 148
1852 Purification of Bilge Water by Adsorption
Authors: Fatiha Atmani, Lamia Djellab, Nacera Yeddou Mezenner, Zohra Bensaadi
Abstract:
Generally, bilge waters can be briefly defined as saline and greasy wastewaters. The oil and grease mix with the sea water, which affects many marine species. Bilge water is a complex mixture of various compounds such as solvents, surfactants, fuel, lubricating oils, and hydraulic oils. It results mainly from leakage from the machinery and from fresh-water washdowns, which are allowed to drain to the lowest inner part of the ship's hull. Several physicochemical methods are used for bilge water treatment, such as biodegradation, electrochemical treatment, and electro-coagulation/flotation. The research presented here discusses adsorption as a method to treat bilge water, with eggshells studied as the adsorbent. The influence of operating parameters such as contact time, temperature, and adsorbent dose (0.2-2 g/l) on the removal efficiency of chemical oxygen demand (COD) and turbidity was analyzed. The bilge wastewater used for this study was supplied by Harbour Bouharoune. COD removal increased from 26.7% to 68.7% as the adsorbent dose increased from 0.2 to 2 g. The kinetics of adsorption by eggshells were fast, reaching 55% of the total adsorption capacity in ten minutes (T = 20°C, pH = 7.66, m = 2 g/L). Turbidity decreased with time, and a removal efficiency of 95% was achieved at the end of the 90 min reaction. The adsorption process was found to be effective for the purification of bilge water, and a pseudo-second-order kinetic model fitted the COD removal data.
Keywords: adsorption, bilge water, eggshells, kinetics, equilibrium
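The pseudo-second-order fit mentioned at the end can be reproduced via its standard linearized form, t/q_t = 1/(k2·qe²) + t/qe, where the slope gives 1/qe and the intercept gives 1/(k2·qe²). The uptake data below are synthetic (generated from assumed qe and k2), not the paper's measurements.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def pseudo_second_order_fit(t, q):
    """Return (qe, k2) from the linearized form t/q = 1/(k2 qe^2) + t/qe."""
    ys = [ti / qi for ti, qi in zip(t, q)]
    slope, intercept = linear_fit(t, ys)
    qe = 1.0 / slope
    k2 = slope ** 2 / intercept   # since intercept = 1/(k2 qe^2)
    return qe, k2

# Synthetic uptake curve generated with qe = 10 mg/g, k2 = 0.05 g/(mg·min)
qe_true, k2_true = 10.0, 0.05
t = [5, 10, 20, 40, 60, 90]
q = [k2_true * qe_true**2 * ti / (1 + k2_true * qe_true * ti) for ti in t]
qe, k2 = pseudo_second_order_fit(t, q)
print(round(qe, 2), round(k2, 3))  # → 10.0 0.05
```

The same linearized regression applied to the measured COD uptake would yield the study's fitted equilibrium capacity and rate constant.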
Procedia PDF Downloads 358
1851 Using 3-Glycidoxypropyltrimethoxysilane Functionalized Silica Nanoparticles to Improve Flexural Properties of E-Glass/Epoxy Grid-Stiffened Composite Panels
Authors: Reza Eslami-Farsani, Hamed Khosravi, Saba Fayazzadeh
Abstract:
Lightweight and efficient structures aim to enhance the efficiency of components in various industries. Toward this end, composites are among the most widely used materials because of their durability, high strength and modulus, and low weight. One type of advanced composite is the grid-stiffened composite (GSC) structure, which has been extensively considered in the aerospace, automotive, and aircraft industries. GSC structures are among the top candidates for replacing some of the traditional components used in these industries. Although there are a good number of published surveys on the design aspects and fabrication of GSC structures, to our knowledge little systematic work has been reported on modifying their materials to improve their properties. Matrix modification using nanoparticles is an effective method to enhance the flexural properties of fibrous composites. In the present study, a silane coupling agent (3-glycidoxypropyltrimethoxysilane, 3-GPTS) was introduced onto the silica (SiO2) nanoparticle surface, and its effects on the three-point flexural response of isogrid E-glass/epoxy composites were assessed. Based on the Fourier transform infrared (FTIR) spectra, it was inferred that the 3-GPTS coupling agent was successfully grafted onto the surface of the SiO2 nanoparticles after modification. The flexural test revealed improvements of 16%, 14%, and 36% in stiffness, maximum load, and energy absorption, respectively, for the isogrid specimen filled with 3 wt.% 3-GPTS/SiO2 compared to the neat one. It is worth mentioning that in these structures, considerable energy absorption was observed after the primary failure at the load peak. The 3-GPTS functionalization thus had a positive effect on the flexural behavior of the multiscale isogrid composites.
In conclusion, this study suggests that the addition of modified silica nanoparticles is a promising method to improve the flexural properties of grid-stiffened fibrous composite structures.
Keywords: isogrid-stiffened composite panels, silica nanoparticles, surface modification, flexural properties, energy absorption
Procedia PDF Downloads 252
1850 Offset Dependent Uniform Delay Mathematical Optimization Model for Signalized Traffic Network Using Differential Evolution Algorithm
Authors: Tahseen Saad, Halim Ceylan, Jonathan Weaver, Osman Nuri Çelik, Onur Gungor Sahin
Abstract:
A new offset-dependent uniform delay mathematical optimization problem is derived as the main objective of this study and solved using a differential evolution algorithm, in order to control the coordination problem, which depends on offset selection, and to estimate the uniform delay based on the offset choice in a traffic signal network. The assumption is a periodic sinusoidal function for the arrival and departure patterns. The cycle time is optimized at the entry links, and the optimized value is used in the non-entry links as a common cycle time. The offset optimization algorithm is used to calculate the uniform delay at each link. The results are illustrated using a case study and are compared with the canonical uniform delay model derived by Webster and with the Highway Capacity Manual's model. The findings show the new model reduces the total uniform delay to almost half that of the conventional models. The mathematical objective function is robust, and the algorithm convergence time is fast.
Keywords: area traffic control, traffic flow, differential evolution, sinusoidal periodic function, uniform delay, offset variable
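The offset search can be illustrated with a compact differential evolution loop (roughly DE/rand/1/bin). The objective below is a placeholder sinusoidal delay with a known minimum at an offset of 30 s; the paper's actual offset-dependent uniform delay model is not reproduced, and all parameter values (population size, F, CR, bounds) are illustrative.

```python
import math
import random

def delay(offsets, cycle=60.0):
    """Placeholder per-link delay, minimized when each offset is 30 s."""
    return sum(1 - math.cos(2 * math.pi * (o - 30) / cycle) for o in offsets)

def differential_evolution(obj, dim, bounds, np_=20, F=0.7, CR=0.9,
                           gens=200, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(np_)]
    for _ in range(gens):
        for i in range(np_):
            # mutate with three distinct donors, crossover per dimension
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(hi, max(lo, a[d] + F * (b[d] - c[d])))
                     if rng.random() < CR else pop[i][d] for d in range(dim)]
            if obj(trial) <= obj(pop[i]):   # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=obj)

best = differential_evolution(delay, dim=3, bounds=(0.0, 60.0))
print([round(o) for o in best])  # offsets near 30 s, the delay minimum
```

In the study's setting, `delay` would be replaced by the derived uniform delay expression over the signal network's links.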
Procedia PDF Downloads 281
1849 Evaluating Urban Land Expansion Using Geographic Information System and Remote Sensing in Kabul City, Afghanistan
Authors: Ahmad Sharif Ahmadi, Yoshitaka Kajita
Abstract:
With massive population expansion and fast economic development in the last decade, urban land has increasingly expanded and formed extensive informal development territory in Kabul city. This paper investigates integrated urbanization trends in Kabul city since the formation of the basic structure of the present city, using GIS and remote sensing. The study explores the spatial and temporal differences in urban land expansion and land use categories across two time intervals, 1964-1978 and 1978-2008. Furthermore, the goal of this paper is to understand the extent of urban land expansion and the factors driving it in Kabul city. Many factors, such as population expansion, the return of refugees from neighboring countries, and the significant economic growth of the city, affected urban land expansion. Across the study area, the urban land expansion rate, population expansion rate, and economic growth rate were compared to analyze the relationship of these driving forces with urban land expansion. Based on urban land change data detected by interpreting land use maps, it was found that across the entire study area the urban territory expanded by 14 times between 1964 and 2008.
Keywords: GIS, Kabul city, land use, urban land expansion, urbanization
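The reported 14-fold expansion between 1964 and 2008 implies a compound annual growth rate, which can be computed directly; this arithmetic sketch is our own illustration, not a calculation from the paper.

```python
def annual_growth_rate(ratio, years):
    """Compound annual growth rate implied by a total expansion ratio."""
    return ratio ** (1.0 / years) - 1.0

# 14-fold urban territory growth over the 44 years from 1964 to 2008
rate = annual_growth_rate(14.0, 2008 - 1964)
print(f"{rate:.1%}")  # → 6.2%
```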
Procedia PDF Downloads 341
1848 A Plan of Smart Management for Groundwater Resources
Authors: Jennifer Chen, Pei Y. Hsu, Yu W. Chen
Abstract:
Groundwater resources play a vital role in regional water supply because over one-third of total demand is satisfied by groundwater. Because over-pumping might cause environmental impacts such as land subsidence, sustainable management of groundwater resources is required. In this study, a blueprint of smart management for groundwater resources is proposed and planned. The framework of the smart management can be divided into two major parts: hardware and software. First, an internet of groundwater (IoG), inspired by the internet of things (IoT), is proposed to observe the migration of groundwater usage and the associated response in groundwater levels. Second, algorithms based on data mining and signal analysis are proposed to achieve the goal of providing highly efficient management of groundwater. The entire blueprint is a 4-year plan, and this year is the first year. We have finished the installation of 50 flow meters and 17 observation wells. An underground hydrological model is proposed to determine the drawdown caused by the measured pumpages. Besides, an alternative to the flow meter is also proposed to decrease the installation cost of the IoG: an accelerometer with 3G remote transmission is proposed to detect the on and off states of groundwater pumping.
Keywords: groundwater management, internet of groundwater, underground hydrological model, alternative of flow meter
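The accelerometer-based on/off detection can be sketched as a variance threshold on vibration samples: a running pump vibrates, so its accelerometer readings have high variance. The threshold and readings below are illustrative assumptions, not calibrated values from the project.

```python
def variance(samples):
    """Population variance of a window of accelerometer samples."""
    m = sum(samples) / len(samples)
    return sum((x - m) ** 2 for x in samples) / len(samples)

def pump_is_on(window, threshold=0.01):
    """A vibrating pump shows high accelerometer variance (units: g)."""
    return variance(window) > threshold

idle = [0.001, -0.002, 0.000, 0.001, -0.001]   # pump off: near-zero vibration
running = [0.30, -0.25, 0.28, -0.31, 0.27]     # pump on: strong oscillation
print(pump_is_on(idle), pump_is_on(running))  # → False True
```

The detected on/off transitions, transmitted over 3G, would then substitute for the flow-meter timestamps in the IoG data stream.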
Procedia PDF Downloads 382
1847 Social Media Creating Communication Gap among Individuals
Authors: Muneeza Anwar, Muniba Raza, Zunahs Khalid
Abstract:
The study discusses the communication gap that has been created by the excessive use of social networking websites such as Facebook, WhatsApp, and Viber. In this growing world of technology, rising awareness of social media among people has also increased its usage. The objective of this study is to measure the ways the internet is affecting communication among individuals through social media, and to check whether this is affecting society in a positive manner. The study covers both the theoretical and practical aspects of communication gaps among individuals arising through social media, and was conducted to check whether social networking websites are the main cause of such gaps. In this world of fast-growing technology, every day there is a new invention affecting the lives of people both directly and indirectly. Moreover, with the usage of technology, people keep others updated about themselves and about events happening around them by creating events, uploading pictures, and checking in at different places, creating awareness among people who would otherwise not know what is happening. From the study, we deduced how social media is affecting individuals' lives. The findings suggest that although social media creates communication gaps among people, it also bridges them, showing that social media is one of the causes of the communication gap among individuals. Although the communication gap has increased on a daily basis, on average it has remained the same: people communicate on social networking websites, but communication in person eventually decreases.
Keywords: communication gaps, usage of social networking websites, interaction with friends and family, social media
Procedia PDF Downloads 486
1846 Maintenance Objective-Based Asset Maintenance Maturity Model
Authors: James M. Wakiru, Liliane Pintelon, Peter Muchiri, Peter Chemweno
Abstract:
The fast-changing business and operational environment is forcing organizations to adopt asset performance management strategies, not only to reduce costs but also to maintain operational and production policies while addressing demand. To attain optimal asset performance management, a framework is essential that ensures a continuous and systematic approach to analyzing an organization's current maturity level and expected improvement regarding asset maintenance processes, strategies, technologies, capabilities, and systems. Moreover, while addressing maintenance-intensive organizations, this framework should consider the diverse (and often dynamic) business, operational, and technical context an organization is in, and realistically prescribe or relate the appropriate tools and systems the organization can potentially employ at the respective level to improve and attain its maturity goals. This paper proposes an asset maintenance maturity model that assesses the current capabilities, strengths, and weaknesses of the maintenance processes an organization is using and analyzes gaps for improvement by structuring set levels of achievement. At the epicentre of the proposed framework is the utilization of the maintenance objectives selected by an organization for its various maintenance optimization programs. The framework adapts the Capability Maturity Model to assess the maintenance process maturity levels in the organization.
Keywords: asset maintenance, maturity models, maintenance objectives, optimization
Procedia PDF Downloads 233
1845 An Analysis of Non-Elliptic Curve Based Primality Tests
Authors: William Wong, Zakaria Alomari, Hon Ching Lai, Zhida Li
Abstract:
Modern-day information security depends on implementing Diffie-Hellman, which requires the generation of prime numbers. Because the number of primes is infinite, it is impractical to store prime numbers for use, and therefore primality tests are indispensable in modern-day information security. A primality test determines whether a number is prime or composite. There are two types of primality tests: deterministic and probabilistic. Deterministic tests adopt algorithms that provide a definite answer as to whether a given number is prime or composite, while probabilistic tests provide a probabilistic result with a degree of uncertainty. In this paper, we review three probabilistic tests, the Fermat primality test, the Miller-Rabin test, and the Baillie-PSW test, as well as one deterministic test, the Agrawal-Kayal-Saxena (AKS) test, and we provide an analysis of these tests. None of the tests discussed is based on elliptic curves. The analysis demonstrates that, in the majority of real-world scenarios, the Baillie-PSW test's favorability stems from its typical operational complexity of O(log³ n) and its capacity to deliver accurate results for numbers below 2^64.
Keywords: primality tests, Fermat's primality test, Miller-Rabin primality test, Baillie-PSW primality test, AKS primality test
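Of the reviewed tests, Miller-Rabin is compact enough to sketch. The following is a textbook implementation, not code from the paper: write n − 1 = d·2^s with d odd, then check each random base a for a nontrivial square root of 1. A composite n passes a given base with probability at most 1/4, so the error after k rounds is at most 4^−k (and for n < 2^64 a known fixed witness set makes the test fully deterministic, part of why Baillie-PSW-style testing is favoured there).

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test; False means definitely composite."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):          # handle tiny inputs and cheap factors
        if n % p == 0:
            return n == p
    # write n - 1 = d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)            # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # a witnesses that n is composite
    return True                     # probably prime

print(miller_rabin(2**61 - 1), miller_rabin(2**61 + 1))  # → True False
```

The Baillie-PSW test combines one base-2 Miller-Rabin round with a Lucas test, which is what gives it its strong empirical accuracy below 2^64.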
Procedia PDF Downloads 94
1844 Soil Compaction by a Forwarder in Timber Harvesting
Authors: Juang R. Matangaran, Erianto I. Putra, Iis Diatin, Muhammad Mujahid, Qi Adlan
Abstract:
Industrial plantation forests are the main producers of logs in Indonesia. Several industrial plantation forest companies have successfully planted fast-growing species, which have now entered their annual harvesting period. Heavy machines such as forwarders are used in timber harvesting to extract logs from stump to landing site. The negative impacts of using such machines are the loss of topsoil and soil compaction, and compacted soil is considered unfavorable for plant growth. The research objectives were to analyze the soil bulk density, rutting, and cone index of the soil caused by forwarder passes, and to analyze the relation between the number of forwarder passes and the increase in soil bulk density. A Valmet forwarder was used in this research. Soil bulk density at the soil surface and the cone index from the surface down to a depth of 50 cm were measured in the harvested area. The results showed that soil bulk density increased with the number of Valmet forwarder passes, with the maximum occurring after five passes. The cone index tended to increase from the surface to the 50 cm depth. Rut formation and high soil bulk density indicated that soil compaction occurred during the forwarder operation.
Keywords: bulk density, forwarder Valmet, plantation forest, soil compaction, timber harvesting
Procedia PDF Downloads 150
1843 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements
Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker
Abstract:
Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts; complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex-shaped workpieces. Variations between the nominal model and the actual geometry can lead to changes in operations in computer-aided process planning (CAPP) that must be managed for an adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by providing objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, such decisions depend on the experience-based knowledge of humans (e.g., process planners) and result in subjective decisions, leading to variability in workpiece quality and potential failure in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes; especially in applications like the aerospace industry, standardization and certification of processes are important aspects. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, are used for modeling and executing inspection workflows.
Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, is carried out by function blocks. One advantage of this approach is its flexibility in designing workflows and adapting algorithms specific to the application domain. In general, it is checked whether a geometrical adaptation is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g., design data. Furthermore, appropriate logics and decision criteria have to be considered for different product lifecycle phases; for example, tolerances for geometric deviations differ in type and size for new-part production compared to repair processes. In addition to function blocks, appropriate referencing systems are important: they need to support exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, containing new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
Keywords: adaptive, CAx, function blocks, turbomachinery
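The function-block idea can be sketched as a chain of event-driven steps that consume and pass on a shared inspection record, each step firing when the previous one completes. All names, deviations, and tolerances below are illustrative; this is a minimal sketch of the pattern, not the authors' IEC-style implementation.

```python
from dataclasses import dataclass, field

@dataclass
class InspectionRecord:
    deviations_mm: list      # measured geometric deviations vs nominal CAD
    tolerance_mm: float      # allowed deviation for adaptive machining
    log: list = field(default_factory=list)

def align_block(rec):
    """Function block: position measurement data in the reference system."""
    rec.log.append("measurement data aligned to reference system")
    return rec

def tolerance_block(rec):
    """Function block: check geometrical criteria against the tolerance."""
    worst = max(abs(d) for d in rec.deviations_mm)
    rec.log.append(f"max deviation {worst} mm vs tolerance {rec.tolerance_mm} mm")
    rec.adaptable = worst <= rec.tolerance_mm   # the objective decision
    return rec

def run_workflow(rec, blocks):
    for block in blocks:     # each block fires when the previous completes
        rec = block(rec)
    return rec

rec = run_workflow(InspectionRecord([0.05, -0.12, 0.08], tolerance_mm=0.15),
                   [align_block, tolerance_block])
print(rec.adaptable)  # → True
```

Swapping the block list (e.g., looser repair tolerances versus tighter new-part tolerances) is what gives the workflow approach its flexibility across lifecycle phases.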
Procedia PDF Downloads 300