Search results for: permittivity measurement techniques
8193 Monitoring the Drying and Grinding Process during Production of Celitement through a NIR-Spectroscopy Based Approach
Authors: Carolin Lutz, Jörg Matthes, Patrick Waibel, Ulrich Precht, Krassimir Garbev, Günter Beuchle, Uwe Schweike, Peter Stemmermann, Hubert B. Keller
Abstract:
Online measurement of product quality is a challenging task in cement production, especially in the production of Celitement, a novel environmentally friendly hydraulic binder. The mineralogy and chemical composition of clinker in ordinary Portland cement production are measured by X-ray diffraction (XRD) and X-ray fluorescence (XRF), where only crystalline constituents can be detected. But only a small part of the Celitement components can be measured via XRD, because most constituents have an amorphous structure. This paper describes the development of algorithms suitable for on-line monitoring of the final processing step of Celitement based on NIR data. For calibration, intermediate products were dried at different temperatures and ground for variable durations. The products were analyzed using XRD and thermogravimetric analyses together with NIR spectroscopy to investigate the dependency between the drying and milling processes on the one side and the NIR signal on the other. As a result, different characteristic parameters have been defined. A short overview of the Celitement process and the challenging tasks of the online measurement and evaluation of product quality will be presented. Subsequently, methods for the systematic development of near-infrared calibration models and the determination of the final calibration model will be introduced. The application of the model to experimental data illustrates that NIR spectroscopy allows for a quick and sufficiently exact determination of crucial process parameters.
Keywords: calibration model, celitement, cementitious material, NIR spectroscopy
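The abstract does not name the regression technique behind its calibration models; partial least squares (PLS) regression is a common choice for NIR calibration, so the following is a minimal, hypothetical sketch of fitting and cross-validating such a model against a reference process parameter. The data and the number of latent variables are placeholders, not values from the paper.

```python
# Hypothetical sketch: PLS calibration of NIR spectra against a reference
# process parameter (e.g., residual water content from thermogravimetry).
# All data are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))   # 60 spectra x 200 wavelength channels
y = X[:, 50] * 0.8 + rng.normal(scale=0.1, size=60)  # synthetic reference values

pls = PLSRegression(n_components=5)  # latent-variable count is a tuning choice
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
r = np.corrcoef(y, y_cv)[0, 1]
print(f"RMSECV = {rmsecv:.3f}, r = {r:.3f}")
```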
Procedia PDF Downloads 503
8192 Kinetic and Mechanistic Study on the Degradation of Typical Pharmaceutical and Personal Care Products in Water by Using Carbon Nanodots/C₃N₄ Composite and Ultrasonic Irradiation
Authors: Miao Yang
Abstract:
PPCPs (pharmaceutical and personal care products) in water, as environmental pollutants, have become an issue of increasing concern. Therefore, techniques for the degradation of PPCPs have become a hotspot in the water pollution control field. Since common degradation techniques for PPCPs have several disadvantages, such as low degradation efficiency for certain PPCPs (ibuprofen and carbamazepine), this proposal adopts a combined technique using a CDs (carbon nanodots)/C₃N₄ composite and ultrasonic irradiation to mitigate or overcome these shortcomings. A significant scientific problem in the study of PPCP degradation is that the mechanism involving the PPCPs, the major reactants, and the interfacial active sites is not yet clear. This work aims to solve this problem using both theoretical and experimental methodologies. Firstly, optimized parameters will be obtained by evaluating the kinetics and oxidation efficiency under different conditions. The competition between H₂O₂ and PPCPs for HO• will be elucidated, after which the degradation mechanism of PPCPs by the synergy of the CDs/C₃N₄ composite and ultrasonic irradiation will be proposed. Finally, a sonolysis-adsorption-catalysis coupling mechanism will be established, which provides the theoretical basis and technical support for developing new, efficient degradation techniques for PPCPs in the future.
Keywords: carbon nanodots/C₃N₄, pharmaceutical and personal care products, ultrasonic irradiation, hydroxyl radical, heterogeneous catalysis
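The abstract does not state which kinetic model it will evaluate; sonocatalytic degradation studies commonly report pseudo-first-order kinetics, ln(C₀/C) = k·t, so the following is a hedged illustration of extracting a rate constant from concentration-time data. All values are made up.

```python
# Illustrative only: fitting a pseudo-first-order rate constant k from
# concentration-time data, a common way to compare degradation kinetics.
# The abstract does not specify its kinetic model; values are synthetic.
import numpy as np

t = np.array([0, 10, 20, 30, 40, 60])              # minutes
c = np.array([1.0, 0.78, 0.61, 0.47, 0.37, 0.22])  # normalized concentration C/C0

k, intercept = np.polyfit(t, np.log(c[0] / c), 1)  # slope of ln(C0/C) vs t
print(f"pseudo-first-order k = {k:.4f} 1/min")
```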
Procedia PDF Downloads 183
8191 Fabrication Characteristics and Mechanical Behaviour of Fly Ash-Alumina Reinforced Zn-27Al Alloy Matrix Hybrid Composite Using Stir-Casting Technique
Authors: Oluwagbenga B. Fatile, Felix U. Idu, Olajide T. Sanya
Abstract:
This paper reports the viability of developing Zn-27Al alloy matrix hybrid composites reinforced with alumina, graphite, and fly ash (a solid waste byproduct of coal combustion in thermal power plants). This research work was aimed at developing a low-cost, high-performance Zn-27Al matrix composite with low density. Alumina particulates (Al₂O₃) and graphite, with 0, 2, 3, 4, and 5 wt% fly ash additions, were utilized to prepare a 10 wt% reinforcing phase with the Zn-27Al alloy as matrix, using a two-step stir casting method. Density measurement, estimated percentage porosity, tensile testing, microhardness measurement, and optical microscopy were used to assess the performance of the composites produced. The results show that the hardness, ultimate tensile strength, and percent elongation of the hybrid composites decrease with increasing fly ash content. Maximum decreases in hardness and ultimate tensile strength of 13.72% and 15.25%, respectively, were observed for the composite grade containing 5 wt% fly ash. The percentage elongation of the composite sample without fly ash is 8.9%, which is comparable with that of the sample containing 2 wt% fly ash, at 8.8%. The fracture toughness of the fly ash containing composites was, however, superior to that of the composites without fly ash, with the 5 wt% fly ash composite exhibiting the highest fracture toughness. The results show that fly ash can be utilized as a complementary reinforcement in ZA-27 alloy matrix composites to reduce cost.
Keywords: fly ash, hybrid composite, mechanical behaviour, stir-cast
Procedia PDF Downloads 338
8190 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions
Authors: Hannah F. Opayinka, Adedayo A. Adepoju
Abstract:
This study is an extension of a prior study on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions, in which the modified m-out-of-n (mmoon) bootstrap was proposed as an alternative to the existing moon technique. In this study, both the moon and mmoon techniques were applied to two real income datasets, which followed lognormal and Pareto distributions, respectively, with finite variances. The performances of the two techniques were compared using the Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed the moon bootstrap in terms of smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
Keywords: bootstrap, income data, lognormal distribution, Pareto distribution
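The mmoon modification itself is not described in the abstract, but the baseline m-out-of-n bootstrap it extends is simple: resample m < n observations per replicate and rescale the variability back to sample size n. A minimal sketch for the standard error of the sample mean, with an illustrative (not prescribed) choice of m:

```python
# A minimal sketch of the baseline m-out-of-n (moon) bootstrap, estimating
# the standard error of the sample mean of heavy-tailed income data.
# The mmoon modification is not specified in the abstract and is not
# reproduced here; m = int(n ** 0.7) is an illustrative choice only.
import numpy as np

rng = np.random.default_rng(42)
income = rng.pareto(a=2.5, size=1000) * 20_000  # synthetic Pareto-like incomes

n = income.size
m = int(n ** 0.7)   # subsample size, m < n
B = 2000            # bootstrap replicates

means = np.array([rng.choice(income, size=m, replace=True).mean()
                  for _ in range(B)])
# rescale the subsample-mean variability back to sample size n
se_moon = means.std(ddof=1) * np.sqrt(m / n)
print(f"moon bootstrap SE of the mean: {se_moon:.2f}")
```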
Procedia PDF Downloads 190
8189 Innovative Acoustic Emission Techniques for Concrete Health Monitoring
Authors: Rahmat Ali, Beenish Khan, Aftabullah, Abid A. Shah
Abstract:
This research investigates a wide range of events using acoustic emission (AE) sensors on concrete cubes subjected to different stress conditions during loading and unloading. A total of 27 specimens were prepared and tested, comprising 18 cubic (6"×6"×6") and nine cylindrical (4"×8") specimens molded from three batches of concrete using w/c ratios of 0.40, 0.50, and 0.60. The compressive strength of the concrete was determined from the cylinder specimens. The deterioration of the concrete was evaluated using the occurrence of the Felicity and Kaiser effects at each stress condition. It was found that acoustic emission hits usually increased as damage increased. Additionally, the correlation between the AE techniques and the applied load was determined by plotting the normalized values. The influence of the w/c ratio on the sensitivity of the AE technique in detecting concrete damage was also investigated.
Keywords: acoustic emission, concrete, felicity ratio, sensors
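The Felicity ratio named in the keywords is conventionally defined as the load at which AE activity resumes on reloading divided by the previous maximum load; the Kaiser effect holds when that ratio is at least 1. A small worked example under that standard definition (the loads are illustrative):

```python
# Worked example of the standard Felicity ratio: the load at AE onset in a
# reload cycle divided by the previous maximum load. A ratio >= 1 indicates
# the Kaiser effect; < 1 indicates damage. Load values are illustrative.
def felicity_ratio(ae_onset_load: float, previous_max_load: float) -> float:
    return ae_onset_load / previous_max_load

previous_max = 120.0  # kN, peak of the previous loading cycle
onset = 96.0          # kN, load at which AE activity resumed on reload
fr = felicity_ratio(onset, previous_max)
print(f"Felicity ratio = {fr:.2f} -> "
      f"{'Kaiser effect holds' if fr >= 1 else 'damage indicated'}")
```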
Procedia PDF Downloads 364
8188 Recommender Systems Using Ensemble Techniques
Authors: Yeonjeong Lee, Kyoung-jae Kim, Youngtae Kim
Abstract:
This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by precisely reflecting users' preferences. The proposed model consists of two steps. In the first step, this study uses logistic regression, decision trees, and artificial neural networks to predict customers who have a high likelihood of purchasing products in each product group. Then, this study combines the results of each predictor using multi-model ensemble techniques such as bagging and bumping. In the second step, this study uses market basket analysis to extract association rules for co-purchased products. Finally, through the above two steps, the system selects customers who have a high likelihood of purchasing products in each product group and recommends appropriate products from the same or different product groups to them. We tested the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we surveyed user satisfaction with the product lists recommended by the proposed system and with randomly selected product lists. The results show that the proposed system may be useful in real-world online shopping stores.
Keywords: product recommender system, ensemble technique, association rules, decision tree, artificial neural networks
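The first step (three classifiers whose outputs are combined) can be sketched in a few lines of scikit-learn. This is an illustrative reconstruction, not the authors' code: soft voting stands in here for the bagging/bumping combination, and the features, data, and threshold are synthetic placeholders.

```python
# Illustrative sketch of the paper's first step: three per-product-group
# purchase-likelihood classifiers combined into one ensemble. Soft voting
# stands in for the bagging/bumping combination; data are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logit", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(max_depth=5)),
        ("ann", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)),
    ],
    voting="soft",  # average the predicted purchase probabilities
)
ensemble.fit(X, y)
likely_buyers = ensemble.predict_proba(X)[:, 1] > 0.8  # high-likelihood customers
print(f"{likely_buyers.sum()} customers flagged for recommendation")
```

The second step would then mine association rules over the flagged customers' baskets to choose which products to recommend.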
Procedia PDF Downloads 297
8187 Evaluating the Validity of CFD Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements
Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck
Abstract:
This research presents a validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from a previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The estimated results of the numerical model showed maximum average percent errors of approximately 53% and 37% for wind incidents from the North and Northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the geometric mean bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow
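FB, MG, and NMSE are standard air-quality model-evaluation metrics. Assuming their usual definitions (the paper's exact conventions are not quoted in the abstract), a short sketch of computing all three from paired observed/predicted concentrations:

```python
# The three validation metrics named in the abstract, under their standard
# air-quality model-evaluation definitions (assumed, not quoted from the
# paper). Co = observed, Cp = predicted concentrations.
import numpy as np

def fb(co, cp):    # fractional bias; 0 is perfect, positive = underprediction
    return 2.0 * (co.mean() - cp.mean()) / (co.mean() + cp.mean())

def mg(co, cp):    # geometric mean bias; 1 is perfect (requires positive data)
    return np.exp(np.log(co).mean() - np.log(cp).mean())

def nmse(co, cp):  # normalized mean square error; 0 is perfect
    return np.mean((co - cp) ** 2) / (co.mean() * cp.mean())

co = np.array([1.2, 0.8, 2.5, 1.9, 0.6])  # illustrative observations
cp = np.array([1.0, 0.9, 2.1, 2.2, 0.5])  # illustrative predictions
print(f"FB={fb(co, cp):.3f}  MG={mg(co, cp):.3f}  NMSE={nmse(co, cp):.3f}")
```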
Procedia PDF Downloads 140
8186 Review of Research on Effectiveness Evaluation of Technology Innovation Policy
Authors: Xue Wang, Li-Wei Fan
Abstract:
Technology innovation has become the driving force of social and economic development and transformation. The guidance and support of public policies is an important condition for achieving technology innovation goals, and policy effectiveness evaluation is instructive for policy learning and adjustment. This paper reviews existing studies that systematically evaluate the effectiveness of policy-driven technological innovation. We used 167 articles from the WOS and CNKI databases as samples to clarify the measurement of technological innovation indicators and to analyze the classification and application of policy evaluation methods. In general, technology innovation input and technological output are the two main aspects of technological innovation index design. Among these, technological patents are the focus of research: the number of patents reflects the scale of technological innovation, while the quality of patents reflects the value of innovation from multiple aspects. As for policy evaluation methods, statistical analysis methods are applied to the formulation, selection, and after-effect evaluation of policies, analyzing the effect of policy implementation both qualitatively and quantitatively. Bibliometric methods are mainly based on public policy texts, discriminating inter-government relationships and the multi-dimensional value of a policy. Decision analysis focuses on the establishment and measurement of comprehensive evaluation index systems for public policy. Economic analysis methods focus on the performance and output of technological innovation to test policy effects. Finally, this paper puts forward prospects for future research directions.
Keywords: technology innovation, index, policy effectiveness, evaluation of policy, bibliometric analysis
Procedia PDF Downloads 75
8185 An Energy Transfer Fluorescent Probe System for Glucose Sensor at Biomimetic Membrane Surface
Authors: Hoa Thi Hoang, Stephan Sass, Michael U. Kumke
Abstract:
Concanavalin A (conA) is a protein that has been widely used in sensor systems based on its specific binding to α-D-glucose or α-D-mannose. For glucose sensing with conA, fluorescence-based techniques, either intensity-based or lifetime-based, are used. In this research, liposomes made from phospholipids were used as a biomimetic membrane system. In a first step, novel building blocks containing perylene-labeled glucose units were added to the system and used to decorate the surface of the liposomes. Upon binding of rhodamine-labeled conA to the glucose units at the biomimetic membrane surface, a Förster resonance energy transfer (FRET) system can be formed, which combines the unique fluorescence properties of perylene (e.g., high fluorescence quantum yield, no triplet formation) with its high hydrophobicity for efficient anchoring in membranes, forming a novel probe for the investigation of sugar-driven binding reactions at biomimetic surfaces. Two glucose-labeled perylene derivatives were synthesized with different spacer lengths between the perylene and glucose units in order to probe the binding of conA. The binding interaction was fully characterized using high-end fluorescence techniques: steady-state and time-resolved fluorescence techniques (e.g., fluorescence depolarization) in combination with single-molecule fluorescence spectroscopy techniques (fluorescence correlation spectroscopy, FCS) were used to monitor the interaction with conA. Based on the fluorescence depolarization, the rotational correlation times, and the alteration in the diffusion coefficient (determined by FCS), the binding of conA to the liposomes carrying the probe was studied. Moreover, single-pair FRET experiments using pulsed interleaved excitation were used to characterize the binding of conA to the liposomes in detail at the single-molecule level, avoiding averaging-out effects.
Keywords: concanavalin A, FRET, sensor, biomimetic membrane
Procedia PDF Downloads 310
8184 Nature as a Human Health Asset: An Extensive Review
Authors: C. Sancho Salvatierra, J. M. Martinez Nieto, R. García Gonzalez-Gordon, M. I. Martinez Bellido
Abstract:
Introduction: Nature could act as an asset for human health, protecting against possible diseases and promoting both physical and mental health. Goals: This paper aims to determine which natural elements present evidence of a positive influence on human health, on which particular aspects, and how. It also aims to determine the best biomarkers to measure such influence. Method: A systematic literature review was carried out. First, a general free-text search was performed in databases such as Scopus, PubMed, and PsycINFO. Secondly, a specific search was performed combining keywords in order of increasing complexity. The snowballing technique was also used, and the databases of the CSIC (the Spanish National Research Council) were consulted. Of the 130 articles obtained and reviewed, 80 referred to natural elements that influenced health. These 80 articles were classified and tabulated according to the natural elements found, the health aspects studied, the health measurement parameters used, and the measurement techniques used. In this classification, the results of the studies were codified according to whether they were positive, negative, or neutral, both for the elements of nature and for the aspects of health studied. Finally, the results of the 80 selected studies were summarized and categorized according to the elements of nature that showed the greatest positive influence on health and the biomarkers that had shown greater reliability in measuring said influence. Results: Of the 80 articles studied, 24 (30.0%) were reviews and 56 (70.0%) were original research articles. Among the 24 reviews, 18 (75%) found positive effects of natural elements on health, and 6 (25%) found both positive and negative effects. Of the 56 original articles, 47 (83.9%) showed positive results, 3 (5.4%) both positive and negative, 4 (7.1%) negative effects, and 2 (3.6%) found no effects. The results reflect positive effects of different elements of nature on the following pathologies: diabetes, high blood pressure, stress, attention deficit hyperactivity disorder, and psychotic, anxiety, and affective disorders. They also show positive effects on the following areas: immune system, social interaction, recovery after illness, mood, decreased aggressiveness, concentrated attention, cognitive performance, restful sleep, vitality, and sense of well-being. Among the elements of nature studied, those that show the greatest positive influence on health are forest immersion, natural views, daylight, outdoor physical activity, active transport, vegetation biodiversity, natural sounds, and green residences. The biomarkers that have shown greater reliability for measuring the effects of natural elements are the levels of cortisol (both in blood and saliva), vitamin D, serotonin, and melatonin, as well as blood pressure, heart rate, muscle tension, and skin conductance. Conclusions: Nature is an asset for health, well-being, and quality of life. Awareness, education, and health promotion programs are needed based on the elements that nature brings us, which in turn generate proactive attitudes in the population towards the protection and conservation of nature. Studies related to this subject in Spain are very scarce. Acknowledgements: This study has been promoted and partially financed by the Environmental Foundation Jaime González-Gordon.
Keywords: health, green areas, nature, well-being
Procedia PDF Downloads 281
8183 Role of Natural Language Processing in Information Retrieval; Challenges and Opportunities
Authors: Khaled M. Alhawiti
Abstract:
This paper aims to analyze the role of natural language processing (NLP). It discusses this role in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial concerns. There are various complexities involved in processing natural language text so that it can satisfy the needs of decision makers. This paper begins with a description of the qualities of NLP practices, then focuses on the challenges in natural language processing and discusses major NLP techniques. The last section describes opportunities and challenges for future research.
Keywords: data retrieval, information retrieval, natural language processing, text structuring
Procedia PDF Downloads 344
8182 Urban Analysis of the Old City of Oran and Its Building after an Earthquake
Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir
Abstract:
The city of Oran, like any other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then attempts to deduce the differences between the old city before and after the earthquake. The analysis has as a specific objective to tap into the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions on the site of the old citadel may present elements of resistance to seismic effects. In-city observations of these structures showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.
Keywords: earthquake, citadel, performance, traditional techniques, constructions
Procedia PDF Downloads 307
8181 Enhanced Acquisition Time of a Quantum Holography Scheme within a Nonlinear Interferometer
Authors: Sergio Tovar-Pérez, Sebastian Töpfer, Markus Gräfe
Abstract:
This work proposes a technique that decreases the detection acquisition time of quantum holography schemes to one-third, which opens up the possibility of imaging moving objects. Since its invention, quantum holography with undetected photons has gained interest in the scientific community, mainly due to its ability to tailor the detected wavelengths according to the needs of the scheme implementation. While this wavelength flexibility grants the scheme a wide range of possible applications, an important matter had yet to be addressed. Since the scheme uses digital phase-shifting techniques to retrieve the information of the object from the interference pattern, it is necessary to acquire a set of at least four images of the interference pattern with well-defined phase steps to recover the full object information. Hence, the imaging method requires longer acquisition times to produce well-resolved images, and as a consequence, the measurement of moving objects remains out of the reach of the imaging scheme. This work presents the use and implementation of a spatial light modulator together with a digital holographic technique called quasi-parallel phase shifting. This technique uses the spatial light modulator to build a structured phase image consisting of a chessboard pattern containing the different phase steps for digitally calculating the object information. Depending on the reduction in the number of needed frames, the acquisition time is reduced by a significant factor. This technique opens the door to implementing the scheme for moving objects; in particular, its application to imaging living specimens comes one step closer.
Keywords: quasi-parallel phase shifting, quantum imaging, quantum holography, quantum metrology
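The four-image requirement comes from standard four-step phase shifting: interferograms taken at phase steps 0, π/2, π, and 3π/2 yield the object phase as φ = atan2(I₄ − I₂, I₁ − I₃). A small sketch under that textbook convention (the paper's own step convention is not stated):

```python
# Textbook four-step phase shifting: recover the object phase from four
# interferograms taken at phase steps 0, pi/2, pi, 3*pi/2. This illustrates
# why at least four frames are normally needed; the paper's quasi-parallel
# variant packs these steps into one chessboard-structured frame instead.
import numpy as np

x = np.linspace(0, 1, 256)
phi = 2 * np.pi * x                    # synthetic object phase to recover
A, B = 1.0, 0.5                        # background and modulation amplitude

I1, I2, I3, I4 = (A + B * np.cos(phi + d)
                  for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2))

phi_rec = np.arctan2(I4 - I2, I1 - I3)        # wrapped phase in (-pi, pi]
err = np.angle(np.exp(1j * (phi_rec - phi)))  # wrapped reconstruction error
print(f"max |phase error| = {np.abs(err).max():.2e} rad")
```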
Procedia PDF Downloads 117
8180 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds
Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi
Abstract:
The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females, and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amount can be reduced through proper evaluation and detection techniques. The available techniques for the determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS), and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and the requirement for pretreatment steps. Moreover, these techniques are laboratory-bound, and samples are required in large amounts for analysis. In view of the above facts, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency, and easy-to-operate procedures. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining considerable attention among researchers. The bioelement present in such a system makes the developed sensor selective towards the analyte of interest. Nanomaterials provide a large surface area, high electron communication, enhanced catalytic activity, and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.
Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors
Procedia PDF Downloads 276
8179 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry
Authors: Dhanuj M. Gandikota
Abstract:
Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist in current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but its usage is limited in both coverage and cost, requiring manual deployment to map out large forested areas. While aerial laser scanning (ALS) remains a reliable avenue of LIDAR active remote sensing, ALS is also cost-restrictive in its deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability for the accurate construction of vegetation 3D point clouds: it provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called deep learning (DL) that shows promise in recent research on 3D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained from TLS, we deconstructed the point clouds of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions. We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate the full 3D tree point clouds (the truth values are given by the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the measurement of error from the original TLS point clouds and through the error in extraction of key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results of this research additionally demonstrate the supplemental performance gain of using minimal locally sourced bio-inventory metric information as an input in ML systems to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry
Procedia PDF Downloads 105
8178 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques
Authors: Faisal Alshuwaier, Ali Areshey
Abstract:
Text mining techniques are generally applied for classifying text and for finding fuzzy relations and structures in data sets. This research provides a range of text mining capabilities. One common application is text classification and event extraction, which encompasses deducing specific knowledge concerning incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism, which is based on text classification and optimal fuzzy relationship extraction. Furthermore, the work presented in this paper explains the application of fuzzy relationship extraction and the branch-and-bound method to simplify texts.
Keywords: extraction, max-prod, fuzzy relations, text mining, memberships, classification
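The "max-prod" keyword refers to the max-product composition of fuzzy relations, (R ∘ S)(x, z) = maxᵧ R(x, y)·S(y, z). A minimal sketch of that operation on membership matrices (the matrices themselves are illustrative, not from the paper):

```python
# Max-product (max-prod) composition of two fuzzy relations, the operation
# named in the keywords: (R o S)(x, z) = max over y of R(x, y) * S(y, z).
# Membership values are illustrative.
import numpy as np

R = np.array([[0.8, 0.3],   # fuzzy relation between terms and concepts
              [0.5, 0.9]])
S = np.array([[0.7, 0.2],   # fuzzy relation between concepts and classes
              [0.4, 1.0]])

# pairwise products R[x, y] * S[y, z], then maximum over the shared index y
composed = np.max(R[:, :, None] * S[None, :, :], axis=1)
print(composed)  # [[0.56 0.3 ], [0.36 0.9 ]]
```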
Procedia PDF Downloads 585
8177 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions
Authors: Marcelo Dias Carvalho, Leticia Ishikawa
Abstract:
Supply chain risk management refers to a set of strategies used by companies to avoid supply chain disruption caused by damage at production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use the techniques of the Toyota Production System, which in some ways runs counter to better management of supply chain risks. This paper studies key events at some multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. The result of a good balance of these actions is the reduction of losses, increased customer trust in the company, and better preparedness to face the general risks of a supply chain.
Keywords: just in time, lean manufacturing, supply chain disruptions, supply chain management
Procedia PDF Downloads 341
8176 Design and Implementation of Smart Watch Textile Antenna for Wi-Fi Bio-Medical Applications in Millimetric Wave Band
Authors: M. G. Ghanem, A. M. M. A. Allam, Diaa E. Fawzy, Mehmet Faruk Cengiz
Abstract:
This paper is devoted to the design and implementation of a smartwatch textile antenna for Wi-Fi bio-medical applications in millimetric wave bands. The antenna is implemented on a leather textile-based substrate to be embedded in a smartwatch, enabling the watch to pick up Wi-Fi signals without needing to be connected to a mobile phone through Bluetooth. It operates at 60 GHz, the WiGig (Wireless Gigabit Alliance) band, with a wide bandwidth for higher-rate applications. It could also be implemented over many stratified layers of body tissue for use in the diagnosis of diseases such as diabetes and cancer. The structure is designed and simulated using the CST Studio Suite program. The wearable patch antenna has an octagonal shape and is implemented on leather, which acts as a flexible substrate with a size of 5.632 x 6.4 x 2 mm³, a relative permittivity of 2.95, and a loss tangent of 0.006. Feeding is carried out using a differential feed (discrete port in CST). The work provides five antenna implementations: an antenna without a ground plane; an antenna with a ground plane added at the back to increase the gain; an antenna with the substrate dimensions increased to 15 x 30 mm² to resemble a real watch size; an antenna with layers of skin and fat added under the ground plane to study the effect of human body tissues on the antenna performance; and, finally, the whole structure bent. The corresponding simulated peak realized gains are 5.68, 7.28, 6.15, 3.03, and 4.37 dB, respectively. The antenna with a ground plane exhibits high gain, while adding the human tissue layers degrades the gain because of absorption by the body. The bent structure contributes to a higher gain.
Keywords: bio medical engineering, millimetric wave, smart watch, textile antennas, Wi-Fi
Procedia PDF Downloads 124
8175 Estimation of PM10 Concentration Using Ground Measurements and Landsat 8 OLI Satellite Image
Authors: Salah Abdul Hameed Saleh, Ghada Hasan
Abstract:
The aim of this work is to produce an empirical model for the determination of particulate matter (PM10) concentration in the atmosphere using the visible bands of a Landsat 8 OLI satellite image over Kirkuk city, Iraq. The suggested algorithm is based on the aerosol optical reflectance model. The reflectance model is a function of the optical properties of the atmosphere, which can be related to its concentrations. PM10 concentration measurements were collected using a Particle Mass Profiler and Counter in a Single Handheld Unit (Aerocet 531) simultaneously with the Landsat 8 OLI image acquisition date. The PM10 measurement locations were recorded with a handheld global positioning system (GPS). The obtained reflectance values for the visible bands (coastal aerosol, blue, green, and red) of the Landsat 8 OLI image were correlated with the in-situ measured PM10. The feasibility of the proposed algorithms was investigated based on the correlation coefficient (R) and root-mean-square error (RMSE) compared with the PM10 ground measurement data. The proposed multispectral model was chosen based on the highest correlation coefficient (R) and the lowest root-mean-square error (RMSE) with respect to the PM10 ground data. The outcomes of this research show that the visible bands of Landsat 8 OLI are capable of estimating PM10 concentration with an acceptable level of accuracy.
Keywords: air pollution, PM10 concentration, Landsat 8 OLI image, reflectance, multispectral algorithms, Kirkuk area
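Empirical PM10-reflectance models of this kind are typically multiple regressions of band reflectances against coincident ground PM10, scored by R and RMSE as in the abstract. The paper's functional form and coefficients are not given, so the sketch below assumes a linear model and uses synthetic data throughout.

```python
# Hedged sketch of an empirical multispectral PM10 model: a linear
# regression of visible-band reflectances against ground PM10, scored by
# R and RMSE. The paper's actual functional form and coefficients are not
# given; all data here are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
refl = rng.uniform(0.05, 0.30, size=(40, 4))  # coastal/blue/green/red reflectance
pm10 = 300 * refl[:, 1] + 150 * refl[:, 2] + rng.normal(0, 5, 40)  # ug/m3

model = LinearRegression().fit(refl, pm10)
pred = model.predict(refl)

r = np.corrcoef(pm10, pred)[0, 1]
rmse = np.sqrt(np.mean((pm10 - pred) ** 2))
print(f"R = {r:.3f}, RMSE = {rmse:.2f} ug/m3")
```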
Procedia PDF Downloads 443
8174 Development of Evolutionary Algorithm by Combining Optimization and Imitation Approach for Machine Learning in Gaming
Authors: Rohit Mittal, Bright Keswani, Amit Mithal
Abstract:
This paper provides an overview of the computational intelligence techniques used to develop computer games, especially car racing games. To give a deeper sense of the artificial intelligence involved, the paper is divided into sections on optimization, imitation, innovation, and the combined approach of optimization and imitation. The paper is mainly concerned with the combined approach, which covers different aspects of using fitness measures and supervised learning techniques to imitate aspects of behavior. The main achievement of this paper lies in modelling player behaviour and evolving new game content, such as racing tracks, for single-car racing on a single track.
Keywords: evolution algorithm, genetic, optimization, imitation, racing, innovation, gaming
Procedia PDF Downloads 650
8173 Glucose Measurement in Response to Environmental and Physiological Challenges: Towards a Non-Invasive Approach to Study Stress in Fishes
Authors: Tomas Makaras, Julija Razumienė, Vidutė Gurevičienė, Gintarė Sauliutė, Milda Stankevičiūtė
Abstract:
Stress responses represent an animal's natural reactions to various challenging conditions and can be used as a welfare indicator. Despite the wide use of glucose measurements in stress evaluation, there are some inconsistencies in its acceptance as a stress marker, especially in comparison with non-invasive cortisol measurements in fish under stress. To meet this challenge and to test the reliability and applicability of glucose measurement in practice, different environmental/anthropogenic exposure scenarios were simulated in this study to provoke chemical-induced stress in fish (14-day exposure to landfill leachate), followed by a 14-day stress recovery period; under the cumulative effect of the leachate, fish were subsequently exposed to the pathogenic oomycete Saprolegnia parasitica to represent a possible infection. S. parasitica is endemic to all freshwater habitats worldwide and is partly responsible for the decline of natural freshwater fish populations. Brown trout (Salmo trutta fario) and sea trout (Salmo trutta trutta) juveniles were chosen because a large amount of literature on physiological stress responses in these species is available. Glucose content in fish was analysed by applying invasive and non-invasive glucose measurement procedures to different test media: fish blood, gill tissue, and fish-holding water. The results indicated that the quantity of glucose released into the holding water of stressed fish increased considerably (approx. 3.5- to 8-fold) and remained substantially higher (approx. 2- to 4-fold) than the control level throughout the stress recovery period, suggesting that the fish did not recover from the chemical-induced stress. The circulating levels of glucose in blood and gills decreased over time in fish exposed to the different stressors. However, the gill glucose level in fish showed a decrease similar to the control levels measured at the same time points, which was found to be insignificant. The data analysis showed that concentrations of β-D-glucose measured in the gills of fish treated with S. parasitica differed significantly from the control recovery group but did not differ from the leachate recovery group, showing that the presence of S. parasitica in the water had no additive effects. In contrast, a positive correlation between blood and gill glucose was determined. Parallel trends in blood and water glucose changes suggest that water glucose measurement has much potential for predicting stress. This study demonstrated that measuring β-D-glucose in fish-holding water is not stressful, as it involves no handling or manipulation of the organism, and has critical technical advantages over current (invasive) methods, which mainly use blood samples or specific tissues. The quantification of glucose could be essential for studies examining stress physiology and for aquaculture studies interested in the assessment or long-term monitoring of fish health.
Keywords: brown trout, landfill leachate, sea trout, pathogenic oomycetes, β-D-glucose
Procedia PDF Downloads 180
8172 Isolated Iterating Fractal Independently Corresponds with Light and Foundational Quantum Problems
Authors: Blair D. Macdonald
Abstract:
Nearly one hundred years after its origin, foundational quantum mechanics remains one of the greatest unexplained mysteries for physicists today. Within this time, chaos theory and its geometry, the fractal, have developed. In this paper, the propagation behaviour under iteration of a simple fractal, the Koch snowflake, is described and analysed. From an arbitrary observation point within the fractal set, the fractal propagates forward by oscillation (the focus of this study) and, retrospectively, behind by exponential growth from a point beginning. It propagates a potentially infinite exponential oscillating sinusoidal wave of discrete triangle bits sharing many characteristics of light and quantum entities. The model's wave speed is potentially constant, offering insights into the perception and direction of time, where, to an observer travelling at the frontier of propagation, time may slow to a stop. In isolation, the fractal is a superposition of component bits where position and scale present a problem of location. In reality, this problem is experienced within fractal landscapes or fields, where 'position' is only 'known' by the addition of information or markers. The quantum 'measurement problem', 'uncertainty principle', 'entanglement', and the classical-quantum interface are addressed; these are problems of scale invariance associated with isolated fractality. The dual forward and retrospective perspectives of the fractal model offer an opportunity for unification between quantum mechanics and cosmological mathematics, observations, and conjectures. Quantum and cosmological problems may be different aspects of the one fractal geometry.
Keywords: measurement problem, observer, entanglement, unification
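The Koch iteration at the heart of the paper is easy to state in code: each segment is replaced by four segments one-third its length, with the middle pair forming a triangular bump. The generator below is an independent illustration of that construction, not the paper's propagation analysis.

```python
# Minimal Koch-curve iteration: each segment is replaced by four segments
# one-third its length, the middle pair forming a triangular bump. This
# illustrates the repeated-triangle construction the abstract analyses;
# it is independent of the paper's own propagation analysis.
import numpy as np

def koch_step(points: np.ndarray) -> np.ndarray:
    out = []
    rot = np.array([[np.cos(np.pi / 3), -np.sin(np.pi / 3)],   # +60 degrees
                    [np.sin(np.pi / 3),  np.cos(np.pi / 3)]])
    for p, q in zip(points[:-1], points[1:]):
        d = (q - p) / 3.0
        a, b = p + d, p + 2 * d
        apex = a + rot @ d              # apex of the triangular bump
        out.extend([p, a, apex, b])
    out.append(points[-1])
    return np.array(out)

curve = np.array([[0.0, 0.0], [1.0, 0.0]])
for _ in range(4):                      # four iterations: 4^4 = 256 segments
    curve = koch_step(curve)
print(f"{len(curve) - 1} segments after 4 iterations")
```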
Procedia PDF Downloads 93
8171 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock's price is likely to go up, and to identify the most important factors and their contribution to predicting that likelihood. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks likely to go up in price (portfolio 1). Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two (portfolio 2). Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up (portfolio 3). The predictive models were validated with metrics such as sensitivity (recall), specificity, and overall accuracy; all accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three portfolios and traded in the market for one month. After one month, the return for each portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio, and 8.88% for the k-means cluster portfolio, while the stock market performance was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top-performing stock portfolios that outperform the stock market.
Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system
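The three portfolio-construction routes can each be sketched in a few lines of scikit-learn. This is an illustrative reconstruction only: the features, thresholds, and ranking rules below are placeholders, since the paper's actual feature set is not given in the abstract.

```python
# Illustrative sketch of the three selection routes in the abstract:
# k-means clustering, PCA component scores, and logistic regression
# probability of the price going up. Features, thresholds, and ranking
# rules are placeholders, not the paper's.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 8))  # 200 stocks x 8 technical features (synthetic)
went_up = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 200)) > 0

Xs = StandardScaler().fit_transform(X)

# Portfolio 1: the k-means cluster with the highest mean "momentum" feature
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Xs)
best_cluster = max(range(4), key=lambda c: X[labels == c, 0].mean())
p1 = np.where(labels == best_cluster)[0]

# Portfolio 2: stocks rated high on the first two principal components
scores = PCA(n_components=2).fit_transform(Xs)
p2 = np.argsort(scores[:, 0] + scores[:, 1])[-10:]

# Portfolio 3: stocks with a high predicted probability of going up
proba = LogisticRegression(max_iter=1000).fit(Xs, went_up).predict_proba(Xs)[:, 1]
p3 = np.where(proba > 0.8)[0]

print(len(p1), len(p2), len(p3))
```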
Procedia PDF Downloads 160
8170 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application
Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam
Abstract:
Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts, and autografts, as well as artificial substitutions like bioceramics, synthetic cements, and metals. The former three techniques often come with peculiar limitations and problems, such as morbidity, limited availability, disease transmission, collateral site damage, or outright rejection by the body, as the case may be. Synthetic routes remain the only feasible alternative for the treatment of bone defects. Hydroxyapatite (HA) is very compatible and suitable for this application. However, most of the common methods for HA synthesis are expensive, complicated, or environmentally unfriendly. Interestingly, extraction of HA from bio-wastes is perceived not only to be cost-effective but also environmentally friendly. In this research, HA was synthesized from a bio-waste, namely bovine bones, through three different methods: hydrothermal chemical processing, ultrasound-assisted synthesis, and ordinary calcination. Structure and property analysis of the HA was carried out through different characterization techniques such as TGA, FTIR, and XRD. All the methods applied were able to produce HA with compositional properties similar to the biomaterials found in human calcified tissues. The calcination process was, however, observed to be more efficient, as it eliminated all the organic components from the produced HA. The HA synthesized is notable for its minimal cost and environmental friendliness and is perceived to be suitable for tissue and bone engineering applications.
Keywords: hydroxyapatite, bone, calcination, biowaste
Procedia PDF Downloads 254
8169 Rail-To-Rail Output Op-Amp Design with Negative Miller Capacitance Compensation
Authors: Muhaned Zaidi, Ian Grout, Abu Khari bin A’ain
Abstract:
In this paper, a two-stage op-amp design is considered using both Miller and negative Miller compensation techniques. The first op-amp design uses Miller compensation around the second amplification stage, whilst the second op-amp design uses negative Miller compensation around the first stage and Miller compensation around the second amplification stage. The aims of this work were to compare the gain and phase margins obtained using the different compensation techniques and to identify the ability to choose either compensation technique based on a particular set of design requirements. The two op-amp designs created are based on the same two-stage rail-to-rail output CMOS op-amp architecture, where the first stage of the op-amp consists of differential input and cascode circuits, and the second stage is a class AB amplifier. The op-amps have been designed using a 0.35 µm CMOS fabrication process.
Keywords: op-amp, rail-to-rail output, Miller compensation, negative Miller capacitance
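The textbook relation behind both techniques: a capacitance C bridging a stage of voltage gain Av appears at the stage input as C_eff = C·(1 − Av). Around an inverting stage (Av = −A) this multiplies C, which is classic Miller compensation; around a non-inverting stage (Av > 1) it yields a negative effective capacitance, which is what negative Miller compensation exploits. A worked example under this standard relation (the values are illustrative, not from the paper):

```python
# Textbook Miller-effect relation behind both compensation schemes:
# a bridge capacitance C across a stage of voltage gain Av appears at the
# stage input as C_eff = C * (1 - Av). Inverting stage (Av = -A): C is
# multiplied. Non-inverting stage (Av > 1), as exploited by negative
# Miller compensation: C_eff is negative, cancelling input capacitance.
# Values are illustrative, not from the paper.
def miller_input_cap(c_farads: float, voltage_gain: float) -> float:
    return c_farads * (1.0 - voltage_gain)

C = 1e-12  # 1 pF bridge capacitor
print(miller_input_cap(C, -20.0))  # inverting stage:     21 pF effective
print(miller_input_cap(C, +2.0))   # non-inverting stage: -1 pF (negative Miller)
```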
Procedia PDF Downloads 340
8168 Image Processing Techniques for Surveillance in Outdoor Environment
Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.
Abstract:
This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face_recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating Optical Character Recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management
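A condensed sketch of the three components is given below. The wrist-above-nose rule, the SQLite store, and the assumption of an already-cropped plate image are illustrative choices, not details from the paper; the library calls (MediaPipe pose solutions, face_recognition, pytesseract) are the ones the abstract names.

```python
# Condensed sketch of the paper's three components: a MediaPipe hand-raise
# check, face_recognition identity verification, and OCR plate extraction
# stored in a database. The threshold rule, the SQLite store, and the
# pre-cropped plate image are assumptions for illustration.
import sqlite3
import cv2
import face_recognition
import mediapipe as mp
import pytesseract

mp_pose = mp.solutions.pose

def hand_raised(bgr_frame) -> bool:
    """A hand counts as raised when a wrist landmark sits above the nose."""
    with mp_pose.Pose(static_image_mode=True) as pose:
        res = pose.process(cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB))
    if not res.pose_landmarks:
        return False
    lm = res.pose_landmarks.landmark
    nose_y = lm[mp_pose.PoseLandmark.NOSE].y  # normalized y grows downward
    return (lm[mp_pose.PoseLandmark.LEFT_WRIST].y < nose_y
            or lm[mp_pose.PoseLandmark.RIGHT_WRIST].y < nose_y)

def same_person(known_path: str, probe_path: str) -> bool:
    known = face_recognition.face_encodings(face_recognition.load_image_file(known_path))
    probe = face_recognition.face_encodings(face_recognition.load_image_file(probe_path))
    return bool(known and probe
                and face_recognition.compare_faces([known[0]], probe[0])[0])

def store_plate(plate_bgr, db_path="plates.db") -> str:
    """OCR an already-cropped plate image and persist the text."""
    text = pytesseract.image_to_string(plate_bgr, config="--psm 7").strip()
    with sqlite3.connect(db_path) as con:
        con.execute("CREATE TABLE IF NOT EXISTS plates (text TEXT)")
        con.execute("INSERT INTO plates VALUES (?)", (text,))
    return text
```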
Procedia PDF Downloads 31
8167 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges in increasing speed and data processing capacity while trying to keep its devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of this kind of facility use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or perfecting those already existing, would be a great advance in this industry. The installation of a temperature sensor matrix distributed through the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and such an installation is expensive. Therefore, other, less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation; this simplifies data collection but increases the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the resulting Burgers' equation are the key to obtaining results with greater or lesser accuracy, according to the characteristic truncation error of each scheme.
Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
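A minimal finite-difference sketch of the 1-D viscous Burgers' equation, u_t + u·u_x = ν·u_xx, illustrates the discretization choices the abstract names: forward stepping in time, a backward (upwind) difference for the convective first derivative, and a central difference for the diffusive second derivative. The grid, viscosity, and initial profile are illustrative, not the paper's setup.

```python
# Minimal finite-difference sketch of the 1-D viscous Burgers' equation
#   u_t + u * u_x = nu * u_xx
# with forward time stepping, a backward (upwind) difference for the
# convective first derivative, and a central difference for the diffusive
# second derivative. Grid, nu, and initial profile are illustrative only.
import numpy as np

nx, nt = 101, 500
dx, dt, nu = 2.0 / (nx - 1), 0.001, 0.07   # stable: nu*dt/dx^2 = 0.175 < 0.5

u = np.ones(nx)
u[int(0.5 / dx):int(1.0 / dx) + 1] = 2.0   # step-like initial condition

for _ in range(nt):
    un = u.copy()
    # backward difference for u_x, central difference for u_xx
    u[1:-1] = (un[1:-1]
               - un[1:-1] * dt / dx * (un[1:-1] - un[:-2])
               + nu * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2]))
print(f"u range after {nt} steps: [{u.min():.3f}, {u.max():.3f}]")
```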
Procedia PDF Downloads 176
8166 Nondestructive Electrochemical Testing Method for Prestressed Concrete Structures
Authors: Tomoko Fukuyama, Osamu Senbu
Abstract:
Prestressed concrete is widely used in infrastructure such as roads and bridges. However, poor grout filling and PC steel corrosion are currently major issues in prestressed concrete structures. One of the problems with nondestructive corrosion detection of PC steel is the plastic pipe that covers the steel: the insulative property of the pipe makes nondestructive diagnosis difficult. A practical technology to detect these defects is therefore necessary for the maintenance of infrastructure. The goal of this research is the development of an electrochemical technique that enables internal defects to be detected nondestructively from the surface of prestressed concrete. Ideally, the measurements should be conducted from the surface of structural members for a nondestructive diagnosis. In the present experiment, a prestressed concrete member is simplified as a layered specimen to simulate a current path between an input and an output electrode on the member surface. Specimens layered from mortar and the constituent materials of prestressed concrete (steel, polyethylene, stainless steel, or galvanized steel plates) were subjected to alternating current impedance measurement. The magnitude of the applied electric field was 0.01 V or 1 V, and the frequency range was from 10⁶ Hz to 10⁻² Hz. The frequency spectra of impedance, which relate to charge reactions activated by an electric field, were measured to clarify the effects of the material configurations and properties. In the civil engineering field, the Nyquist diagram is popular for analyzing impedance, and it is a good way to grasp electrical relaxation from the shape of the plot. However, it is not well suited to showing the influence of the measurement frequency, which is the reciprocal of the reaction time. Hence, the Bode diagram is also applied to describe charge reactions in the present paper. From the experimental results, the alternating current impedance method appears to be applicable to measurements on insulative materials and, eventually, to prestressed concrete diagnosis. At the same time, the frequency spectra of impedance show the differences between material configurations. This is because the charge mobility reflects the variety of substances, and the measuring frequency of the electric field determines the migration length of the charges under its influence. However, the technique could not distinguish differences in material thickness, from which the difficulty of identifying the amount of an air void or a layer of corrosion product by this technique is inferred.
Keywords: capacitance, conductance, prestressed concrete, susceptance
Procedia PDF Downloads 417
8165 Determination of Complexity Level in Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba Ejd
Abstract:
Today, it has been observed that the security of information along the information superhighway is often compromised by those who are not authorized to have access to such information. In order to ensure the security of information along the superhighway, such information should be encrypted by some means to conceal its real meaning. There are many encryption techniques on the market. However, some of these encryption techniques are easily decrypted by adversaries. The researchers have therefore decided to develop an encryption technique that may be more difficult to decrypt. This may be achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
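The abstract gives only the outline (split, transpose each part, swap the part order), so the following is a toy reconstruction under stated assumptions: a simple columnar transposition per part and a fixed part-order permutation stand in for the unspecified "irregular" steps and key handling.

```python
# Toy reconstruction of the described scheme: split the message into parts,
# apply a columnar transposition to each part, then swap the part order.
# The actual "irregular" transposition and key handling are not specified
# in the abstract; the column key and part permutation are assumptions.
def columnar(text: str, key: list[int]) -> str:
    cols = len(key)
    rows = -(-len(text) // cols)                 # ceiling division
    padded = text.ljust(rows * cols, "_")
    grid = [padded[r * cols:(r + 1) * cols] for r in range(rows)]
    return "".join("".join(row[c] for row in grid) for c in key)

def merged_irregular_encrypt(message: str, n_splits: int) -> str:
    size = -(-len(message) // n_splits)
    parts = [message[i:i + size] for i in range(0, len(message), size)]
    enc = [columnar(p, key=[2, 0, 3, 1]) for p in parts]  # per-part transposition
    enc.reverse()                                         # swap part positions
    return "".join(enc)

print(merged_irregular_encrypt("SECURE THE SUPERHIGHWAY", n_splits=3))
```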
Procedia PDF Downloads 348
8164 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with the rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is due to the fact that Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model was developed at Dun and Bradstreet that is focused on blending Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce the observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards from sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model was tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results of the analysis show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
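The Weight of Evidence referenced here has the standard scorecard definition WoE_bin = ln((% of goods in the bin) / (% of bads in the bin)). A minimal sketch of the direct computation that the Hybrid Model replaces with score-distribution matching (the counts are illustrative):

```python
# Standard scorecard Weight of Evidence, the quantity the abstract's Hybrid
# Model estimates by matching an ML score distribution instead of computing
# it directly from good/bad counts:
#   WoE_bin = ln( (goods in bin / total goods) / (bads in bin / total bads) )
# Counts below are illustrative.
import math

bins = [          # (good count, bad count) per score bin
    (400, 10),
    (300, 30),
    (200, 60),
    (100, 100),
]
total_good = sum(g for g, _ in bins)
total_bad = sum(b for _, b in bins)

for i, (g, b) in enumerate(bins):
    woe = math.log((g / total_good) / (b / total_bad))
    print(f"bin {i}: WoE = {woe:+.3f}")
```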
Procedia PDF Downloads 140