Search results for: wind tunnel data
23991 An Online 3D Modeling Method Based on a Lossless Compression Algorithm
Authors: Jiankang Wang, Hongyang Yu
Abstract:
This paper proposes a portable online 3D modeling method. The method first uses a depth camera to collect data and compresses the depth data with a frame-by-frame lossless compression scheme, while the color stream is encoded in the H.264 format. Once the cloud receives the color and depth images, a BundleFusion-based reconstruction method completes the 3D model. The results of this study indicate that the method is portable, operates online, and is highly efficient, giving it a wide range of application prospects.
Keywords: 3D reconstruction, BundleFusion, lossless compression, depth image
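The abstract does not specify which lossless codec is applied to the depth stream; as an illustration only, the sketch below shows a frame-by-frame lossless pipeline for 16-bit depth images using zlib in Python. The codec choice, frame size, and helper names are assumptions, not the authors' implementation.

```python
import zlib
import numpy as np

def compress_depth_frame(depth: np.ndarray) -> bytes:
    """Losslessly compress a single 16-bit depth frame (e.g., 640x480 from a depth camera)."""
    assert depth.dtype == np.uint16
    return zlib.compress(depth.tobytes(), level=6)

def decompress_depth_frame(blob: bytes, shape=(480, 640)) -> np.ndarray:
    """Recover the exact depth values; the round trip is lossless."""
    raw = zlib.decompress(blob)
    return np.frombuffer(raw, dtype=np.uint16).reshape(shape)

# Example: a synthetic frame round-trips bit-exactly.
frame = (np.random.rand(480, 640) * 4000).astype(np.uint16)  # depth in millimetres
blob = compress_depth_frame(frame)
restored = decompress_depth_frame(blob)
assert np.array_equal(frame, restored)
print(f"compressed {frame.nbytes} bytes down to {len(blob)} bytes")
```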
Procedia PDF Downloads 82
23990 An Analytical Systematic Design Approach to Evaluate Ballistic Performance of Armour Grade AA7075 Aluminium Alloy Using Friction Stir Processing
Authors: Lahari Ramya P., Sudhakar I., Madhu V., Madhusudhan Reddy G., Srinivasa Rao E.
Abstract:
Selection of suitable armour materials for defence applications is crucial for increasing the mobility of systems while maintaining safety. Armour design studies therefore seek the material with the lowest possible areal density that successfully resists a predefined threat. A number of light metals and alloys have come to the forefront, especially as substitutes for armour grade steels. AA5083 aluminium alloy, which meets the military standards imposed by the US Army, is the foremost nonferrous alloy considered as a possible replacement for steel to increase the mobility of armoured vehicles and enhance fuel economy. The growing demand for AA5083 aluminium alloy paves the way for developing supplementary aluminium alloys that maintain the same military standards. AA2xxx, AA6xxx and AA7xxx aluminium alloys have been identified as potential materials to supplement AA5083. Among these series, the heat-treatable AA7xxx aluminium alloys possess high strength and can compete with armour grade steels. Earlier investigations revealed that layering AA7xxx aluminium alloy can prevent spalling of the rear face of the armour during ballistic impact. Hence, the present investigation deals with the fabrication of a hard boron carbide layer on AA7075 aluminium alloy using friction stir processing, with the intention of blunting the projectile on initial impact while the tough backing (AA7xxx aluminium alloy) dissipates the residual kinetic energy. An analytical approach has been adopted to unfold the ballistic performance against the projectile. Penetration of the projectile into the armour has been resolved using a strain energy model, with the perforation shearing area, i.e., the projectile-armour interface, taken into account when evaluating penetration depth. The fabricated surface composites (targets) were tested as per the military standard (JIS.0108.01) in a ballistic testing tunnel at the Defence Metallurgical Research Laboratory (DMRL), Hyderabad, under standardized testing conditions. The analytical results were well validated against the experimental ones.
Keywords: AA7075 aluminium alloy, friction stir processing, boron carbide, ballistic performance, target
Procedia PDF Downloads 330
23989 H∞ Sampled-Data Control for Linear Systems with Time-Varying Delays: Application to Power System
Authors: Chang-Ho Lee, Seung-Hoon Lee, Myeong-Jin Park, Oh-Min Kwon
Abstract:
This paper investigates improved stability criteria for sampled-data control of linear systems with disturbances and time-varying delays. Based on Lyapunov-Krasovskii stability theory, delay-dependent conditions sufficient to ensure H∞ stability of the system are derived in the form of linear matrix inequalities (LMIs). The effectiveness of the proposed method is demonstrated through numerical examples.
Keywords: sampled-data control system, Lyapunov-Krasovskii functional, time delay-dependent, LMI, H∞ control
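The abstract states the criteria only informally; as a hedged orientation, the standard H∞ performance requirement that such LMI conditions certify can be written as below. The specific Lyapunov-Krasovskii functional and LMI structure used by the authors are not given in the abstract, so the system form and symbols here are assumptions.

```latex
% Delayed system with disturbance w(t), sampled input u(t_k) and controlled output z(t):
% \dot{x}(t) = A x(t) + A_d x(t-\tau(t)) + B u(t_k) + B_w w(t), \quad t \in [t_k, t_{k+1}).
% H-infinity performance level \gamma to be certified under zero initial conditions:
\begin{equation}
  \int_0^\infty z^{\top}(t)\, z(t)\, dt \;<\; \gamma^2 \int_0^\infty w^{\top}(t)\, w(t)\, dt .
\end{equation}
% A Lyapunov-Krasovskii functional V(x_t) > 0 is sought such that
% \dot{V}(x_t) + z^{\top} z - \gamma^2 w^{\top} w < 0 whenever a set of LMIs is feasible.
```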
Procedia PDF Downloads 320
23988 Logistics Information Systems in the Distribution of Flour in Nigeria
Authors: Cornelius Femi Popoola
Abstract:
This study investigated logistics information systems in the distribution of flour in Nigeria. A case study design was used and 50 staff of Honeywell Flour Mill were sampled for the study. Data generated through a questionnaire were analysed using correlation and regression analysis. The findings revealed that logistics information systems such as e-commerce, interactive telephone systems and electronic data interchange positively correlated with the distribution of flour in Honeywell Flour Mill. The findings also showed that e-commerce, interactive telephone systems and electronic data interchange jointly and positively contribute to the distribution of flour in Honeywell Flour Mill in Nigeria (R = .935; Adj. R2 = .642; F(3,47) = 14.739; p < .05). The study therefore recommends that Honeywell Flour Mill upgrade its logistics information systems to computer-to-computer communication of business transactions and documents, and adopt new technologies such as tracking-and-tracing systems (barcode scanning for packages and pallets), vehicle tracking with the Global Positioning System (GPS), vehicle performance measurement with ‘black boxes’ (containing logistics data), and Automatic Equipment Identification (AEI).
Keywords: e-commerce, electronic data interchange, flour distribution, information system, interactive telephone systems
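The joint-contribution result above is a standard multiple regression (R, adjusted R², F). As a hedged sketch of that computation only, the Python snippet below runs the same kind of model on synthetic data; the column names are assumptions, not the study's questionnaire items, and the numbers will not reproduce the reported statistics.

```python
# Hedged sketch of the joint-contribution test reported above (R, Adj. R2, F for
# three predictors). Synthetic data; variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50  # 50 sampled staff
df = pd.DataFrame({
    "ecommerce": rng.normal(3.5, 0.8, n),
    "interactive_phone": rng.normal(3.2, 0.9, n),
    "edi": rng.normal(3.8, 0.7, n),
})
df["distribution"] = (0.4 * df["ecommerce"] + 0.3 * df["interactive_phone"]
                      + 0.5 * df["edi"] + rng.normal(0, 0.5, n))

X = sm.add_constant(df[["ecommerce", "interactive_phone", "edi"]])
model = sm.OLS(df["distribution"], X).fit()
print(f"R = {np.sqrt(model.rsquared):.3f}, Adj. R2 = {model.rsquared_adj:.3f}, "
      f"F(3,{int(model.df_resid)}) = {model.fvalue:.3f}, p = {model.f_pvalue:.4f}")
```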
Procedia PDF Downloads 553
23987 Seismic Assessment of an Existing Dual-System RC Building in Madinah City
Authors: Tarek M. Alguhane, Ayman H. Khalil, M. N. Fayed, Ayman M. Ismail
Abstract:
A 15-storey RC building, studied in this paper, is representative of a modern building type constructed in Madinah City, Saudi Arabia, about 10 years ago. These buildings consist of a reinforced concrete skeleton, i.e., columns, beams and flat slabs, as well as shear walls in the stair and elevator areas, arranged to provide a resisting system for lateral (wind and earthquake) loads. In this study, the dynamic properties of the 15-storey RC building were identified using ambient motions recorded at several spatially distributed locations within the building. After updating the mathematical model of the building with the experimental results, a three-dimensional pushover analysis (nonlinear static analysis) was carried out using SAP2000 software, incorporating inelastic material properties for concrete, infill and steel. The effect of modeling the building with and without infill walls on the performance point, as well as on the capacity and demand spectra due to the earthquake design spectrum for the Madinah area, has been investigated. The response modification factor (R) for the 15-storey RC building is evaluated from the capacity and demand spectra (ATC-40). The purpose of this analysis is to evaluate the expected performance of structural systems by estimating strength and deformation demands and comparing these demands to the available capacities at the performance levels of interest. The results are summarized and discussed.
Keywords: seismic assessment, pushover analysis, ambient vibration, modal update
Procedia PDF Downloads 391
23986 Cascaded Neural Network for Internal Temperature Forecasting in Induction Motor
Authors: Hidir S. Nogay
Abstract:
In this study, two systems were created to predict the internal temperature of an induction motor. The first is a simple ANN model with two layers, ten input parameters and one output parameter. The second consists of eight ANN models connected to one another in cascade, with 17 inputs in total. The main reason for using the cascaded system in this study is to achieve more accurate estimation by increasing the number of inputs to the ANN system. The cascaded ANN system is compared with the simple conventional ANN model to demonstrate these advantages. The dataset was obtained from experimental measurements and contains 329 records; a small part of the dataset was plotted to obtain more readable graphs. 30% of the data was used for testing and validation, with test and validation data determined separately for each ANN model so that the reliability of each model could be assessed. As a result of this study, the cascaded ANN system was found to produce more accurate estimates than the conventional ANN model.
Keywords: cascaded neural network, internal temperature, inverter, three-phase induction motor
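As a hedged sketch of the cascading idea only, the snippet below compares a single MLP with a two-stage cascade in which the first model's prediction is appended to the inputs of the second. The data are synthetic and the layer sizes and feature counts are illustrative assumptions, not the eight-model architecture of the study.

```python
# Hedged sketch: a single MLP versus a simple two-stage cascade whose second
# stage receives the first stage's prediction as an extra input.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
X = rng.normal(size=(329, 10))                              # 10 input parameters, 329 records
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=329)    # internal-temperature proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

simple = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=1).fit(X_tr, y_tr)

stage1 = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=2).fit(X_tr, y_tr)
X_tr_c = np.column_stack([X_tr, stage1.predict(X_tr)])      # cascade: feed stage-1 output forward
X_te_c = np.column_stack([X_te, stage1.predict(X_te)])
stage2 = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=3).fit(X_tr_c, y_tr)

print("simple MAE :", mean_absolute_error(y_te, simple.predict(X_te)))
print("cascade MAE:", mean_absolute_error(y_te, stage2.predict(X_te_c)))
```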
Procedia PDF Downloads 345
23985 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level
Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown
Abstract:
‘Big data’ is a relatively new concept that describes data so large and complex that it exceeds the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data is viewed in Australia and internationally as highly sensitive, and strict ethical requirements must be met for its use in health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may have the effect of reducing the capacity of health data to be incorporated into the real-time demands of the big data environment. This ‘big data revolution’ is increasingly supported by national governments, who have invested significant funds in initiatives designed to develop and capitalize on big data and on methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million-dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides the record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best-practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names, to overcome important legal and privacy constraints. This paper will present the findings of the first ‘Proof of Concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for the planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations.
Keywords: data integration, data linkage, health planning, health services research
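The abstract mentions privacy-preserving record linkage without names but does not describe the mechanism. A common approach in the linkage literature encodes name q-grams into Bloom filters and compares them with a Dice coefficient; the sketch below illustrates that general idea only. It is not the Centre for Data Linkage's implementation, and the filter size, hash count, and helper names are assumptions.

```python
import hashlib

def bigrams(name: str) -> set[str]:
    s = f"_{name.lower().strip()}_"
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bloom_encode(name: str, size: int = 256, k: int = 4) -> set[int]:
    """Hash each bigram into k bit positions of a Bloom filter (bit positions kept as a set)."""
    bits = set()
    for gram in bigrams(name):
        for seed in range(k):
            digest = hashlib.sha256(f"{seed}:{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % size)
    return bits

def dice(a: set[int], b: set[int]) -> float:
    """Dice similarity of two encodings; values near 1.0 suggest the same person."""
    return 2 * len(a & b) / (len(a) + len(b))

print(dice(bloom_encode("Katherine Smith"), bloom_encode("Catherine Smith")))  # high similarity
print(dice(bloom_encode("Katherine Smith"), bloom_encode("Adrian Brown")))     # low similarity
```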
Procedia PDF Downloads 216
23984 Spatial Variability of Brahmaputra River Flow Characteristics
Authors: Hemant Kumar
Abstract:
In Hindu mythology, the Brahmaputra River is known as the son of Lord Brahma. True to its name, the Brahmaputra causes mass destruction during the monsoon season in Assam, India, a state in the north-eastern part of the country. Assam is one of the seven states of eastern India, and almost the entire Brahmaputra flow passes through it, while the other states carry its tributaries. In the present case study, a spatial analysis was performed using a number of acquired MODIS scenes. In the change-detection step, the spray (aerosol) content observed during heavy rainfall and the flooded monsoon season was identified; in particular, the analysis over the Brahmaputra outflow delineates the flooded season. The charged-particle-associated aerosol content indicates the heavy water content below the ground surface, which is validated by trend analysis of rainfall spectrum data and confirmed by in-situ sampled data from different positions along the Brahmaputra River. Further, Hyperion hyperspectral data at 30 m resolution were used to scan the sediment deposits, which were also confirmed by in-situ sampled data from different positions.
Keywords: aerosol, change detection, spatial analysis, trend analysis
Procedia PDF Downloads 147
23983 Data Mining Model for Predicting the Status of HIV Patients during Drug Regimen Change
Authors: Ermias A. Tegegn, Million Meshesha
Abstract:
Human Immunodeficiency Virus and Acquired Immunodeficiency Syndrome (HIV/AIDS) is a major cause of death in most African countries, and Ethiopia is one of the most seriously affected countries in sub-Saharan Africa. Previously in Ethiopia, having HIV/AIDS was almost equivalent to a death sentence. With the introduction of Antiretroviral Therapy (ART), HIV/AIDS has become a chronic but manageable disease. The study focused on a data mining technique to predict the future living status of HIV/AIDS patients at the time of drug regimen change, when patients develop toxicity to the ART drug combination they are currently taking. The data were taken from the University of Gondar Hospital ART program database. A hybrid methodology was followed to explore the application of data mining to the ART program dataset. Data cleaning, handling of missing values and data transformation were used to preprocess the data. The WEKA 3.7.9 data mining tool, classification algorithms, and domain expertise were utilized to address the research problem. Using four different classification algorithms (J48 classifier, PART rule induction, Naïve Bayes and neural network) and adjusting their parameters, thirty-two models were built on the preprocessed University of Gondar ART program dataset. The performance of the models was evaluated using the standard metrics of accuracy, precision, recall, and F-measure. The most effective model for predicting the status of HIV patients at drug regimen substitution is a pruned J48 decision tree, with a classification accuracy of 98.01%. This study extracts informative attributes such as ever taking Cotrim, ever taking TbRx, CD4 count, age, weight, and gender in order to predict the status of drug regimen substitution. The outcome of this study can be used as an assistive tool for clinicians to help them make more appropriate drug regimen substitutions. Future research directions are forwarded to come up with an applicable system in the area of the study.
Keywords: HIV drug regimen, data mining, hybrid methodology, predictive model
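As a hedged sketch of the modelling step only, the snippet below trains a pruned decision tree and reports the four metrics named above. scikit-learn's DecisionTreeClassifier stands in for WEKA's J48, and the feature names and synthetic records are assumptions rather than the Gondar dataset.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "ever_cotrim": rng.integers(0, 2, n),
    "ever_tbrx": rng.integers(0, 2, n),
    "cd4_count": rng.integers(50, 1200, n),
    "age": rng.integers(18, 70, n),
    "weight": rng.normal(60, 12, n),
    "gender": rng.integers(0, 2, n),
})
status = (df["cd4_count"] > 200).astype(int)   # toy rule standing in for living status

X_tr, X_te, y_tr, y_te = train_test_split(df, status, test_size=0.3, random_state=7)
tree = DecisionTreeClassifier(ccp_alpha=0.01, random_state=7).fit(X_tr, y_tr)  # ccp_alpha prunes
pred = tree.predict(X_te)
print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("F-measure:", f1_score(y_te, pred))
```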
Procedia PDF Downloads 142
23982 Internal Cycles from Hydrometric Data and Variability Detected through Hydrological Modelling Results on the Niger River over 1901-2020
Authors: Salif Koné
Abstract:
We analyze hydrometric data at the Koulikoro station on the Niger River; this basin drains 120,600 km2 and covers three countries in West Africa: Guinea, Mali, and Ivory Coast. Two successive decadal cycles are highlighted (1925-1936 and 1929-1939) instead of the single decadal cycle presumed in the literature. Moreover, the observed hydrometric data show a multidecadal 40-year period, which is confirmed when graphing the spatial coefficient of variation of runoff over decades (starting with 1901-1910). Spatial runoff data are produced on 48 grid cells (0.5 degree by 0.5 degree) through semi-distributed versions of both the SimulHyd model and the GR2M model - variants of a French hydrological model standing for Genie Rural with 2 parameters at a monthly time step. The two extreme decades in terms of runoff coefficient of variation are compared: 1951-1960 has the minimal coefficient of variation, while 1981-1990 shows the maximal value during the three months of high water level (August, September, and October). Mapping the relative variation between these two decadal situations allows the following hypothesis: the scale of variation between the two extreme situations could serve to fix boundary conditions for further simulations using data from climate scenarios.
Keywords: internal cycles, hydrometric data, Niger River, GR2M and SimulHyd framework, runoff coefficient of variation
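As a hedged illustration of the decadal statistic discussed above (coefficient of variation = standard deviation divided by mean of runoff within each decade), the snippet below computes it on a synthetic monthly series; the data are not the Koulikoro record.

```python
# Hedged sketch of the decadal runoff coefficient of variation (CV) on synthetic data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
dates = pd.date_range("1901-01-01", "2020-12-31", freq="MS")
runoff = pd.Series(100 + 30 * np.sin(2 * np.pi * dates.month / 12) + rng.normal(0, 15, len(dates)),
                   index=dates, name="runoff")

decade = (runoff.index.year - 1901) // 10                  # 0 -> 1901-1910, 1 -> 1911-1920, ...
cv_by_decade = runoff.groupby(decade).apply(lambda s: s.std() / s.mean())
cv_by_decade.index = [f"{1901 + 10*d}-{1910 + 10*d}" for d in cv_by_decade.index]
print(cv_by_decade.round(3))
```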
Procedia PDF Downloads 95
23981 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps
Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam
Abstract:
GIS (Geographic Information System) applications require geo-referenced data, which may be available as databases or in the form of digital or hard-copy agro-meteorological maps. These parameter maps are color-coded, with different regions corresponding to different parameter values, so converting such maps into a database is not very difficult. However, the text and other planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, because it is almost impossible to exactly restore what was underneath the text or icons; this points to the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of our proposed technique using non-textual simulated data and compared the text-removal results with a popular image editing tool on public domain data, with promising results.
Keywords: noise, image, GIS, digital map, inpainting
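As a hedged sketch of the spatial-locality idea only, the snippet below replaces each masked (overlaid) pixel with the most frequent color among its nearby unmasked neighbours. It illustrates the general principle, not the authors' probabilistic model, and the window radius and toy map are assumptions.

```python
import numpy as np
from collections import Counter

def inpaint_by_locality(img: np.ndarray, mask: np.ndarray, radius: int = 3) -> np.ndarray:
    """img: HxWx3 uint8 map; mask: HxW bool, True where text/icons overlay the map."""
    out = img.copy()
    h, w = mask.shape
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        window = img[y0:y1, x0:x1][~mask[y0:y1, x0:x1]]     # unmasked neighbours only
        if len(window):
            colors = Counter(map(tuple, window))            # empirical local color frequencies
            out[y, x] = colors.most_common(1)[0][0]         # most probable local color
    return out

# Toy example: a two-color "map" with a white text stroke to remove.
img = np.zeros((20, 20, 3), np.uint8)
img[:, :10] = (160, 82, 45)
img[:, 10:] = (0, 128, 0)
mask = np.zeros((20, 20), bool)
mask[9:11, 5:15] = True
img[mask] = 255
restored = inpaint_by_locality(img, mask)
```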
Procedia PDF Downloads 352
23980 Evaluation of Urban Parks Based on POI Data: Taking Futian District of Shenzhen as an Example
Authors: Juanling Lin
Abstract:
The construction of urban parks is an important part of eco-city construction, and the intervention of big data provides a more scientific and rational platform for the assessment of urban parks by identifying and correcting irrationalities in urban park planning at the macroscopic level, thereby promoting more rational planning. The study builds an urban park assessment system based on urban road network data and POI data, taking Futian District of Shenzhen as the research object, and uses GIS to assess the park system of Futian District in five aspects: park spatial distribution, accessibility, service capacity, demand, and the supply-demand relationship. The urban park assessment system can effectively reflect the current state of urban park construction and provides a useful exploration toward realizing rationality and fairness in urban park planning.
Keywords: urban parks, assessment system, POI, supply and demand
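As a hedged sketch of one possible accessibility indicator, the snippet below computes the straight-line distance from each residential POI to the nearest park with a KD-tree. The study's own indicators are road-network based and are not detailed in the abstract, so the coordinates and the 500 m threshold here are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
parks = rng.uniform(0, 10_000, size=(15, 2))        # park entrance coordinates (metres)
homes = rng.uniform(0, 10_000, size=(500, 2))       # residential POI coordinates

tree = cKDTree(parks)
dist, nearest_park = tree.query(homes)              # nearest-park distance per home

coverage_500m = np.mean(dist <= 500)                # share of homes within 500 m of a park
print(f"mean distance to nearest park: {dist.mean():.0f} m")
print(f"homes within 500 m of a park : {coverage_500m:.1%}")
```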
Procedia PDF Downloads 42
23979 Excitation of Guided Waves in Finite Width Plates Using a Numerical Approach
Authors: Wenbo Duan, Hossein Habibi, Vassilios Kappatos, Cem Selcuk, Tat-Hean Gan
Abstract:
Ultrasonic guided waves are often used to remove ice or fouling from structures such as ship hulls and wind turbine blades. To achieve maximum sound power output, it is important that multiple transducers are arranged in a particular way so that a desired mode can be excited. The objective of this paper is thus to provide a theoretical basis for generating a particular mode in a finite-width rectangular plate, which can be used for removing potential ice or fouling on the plate. The number of transducers and their locations with respect to a particular mode are investigated, and the link between dispersion curves and practical applications is explored. To achieve this, a semi-analytical finite element (SAFE) method is used to study the dispersion characteristics of all the modes in the ultrasonic frequency range. The detailed modal shapes are revealed, and from the modal analysis, the particular mode with the strongest yet continuous transverse and axial displacements on the surfaces of the plate is chosen for the purpose of removing potential ice or fouling. The modal analysis is followed by information on the number, location and amplitude of the transducers needed to excite this particular mode. Modal excitation is then implemented in a standard commercial finite element package, namely COMSOL Multiphysics. Wave motion is visualized in COMSOL, and the mode shapes generated in SAFE are found to be consistent with the mode shapes generated in COMSOL.
Keywords: dispersion analysis, finite width plate, guided wave, modal excitation
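For orientation, the dispersion analysis described above typically reduces to the standard SAFE eigenvalue problem shown below; this is a generic sketch, since the abstract does not give the specific element matrices or discretization used by the authors.

```latex
% Standard SAFE dispersion relation (a sketch). With displacements assumed as
% u(x,y,z,t) = U(y,z) e^{i(kx - \omega t)} along the waveguide axis x, the
% discretized cross-section yields a quadratic eigenvalue problem in the wavenumber k:
\begin{equation}
  \left[ K_1 + i k\, K_2 + k^2 K_3 - \omega^2 M \right] U = 0 ,
\end{equation}
% whose solutions k(\omega) trace the dispersion curves and whose eigenvectors U
% give the cross-sectional mode shapes used to select the mode to excite.
```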
Procedia PDF Downloads 474
23978 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Model
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the strength of the direct and indirect effects of variables. One or more structural regression equations are used to estimate a series of parameters in order to find the best fit to the data. Sometimes, exogenous variables do not show a significant strength in their direct and indirect effects when the assumptions of classical regression (ordinary least squares, OLS) are violated by the nature of the data. The main motive of this article is to investigate the efficacy of the copula-based regression approach over the classical regression approach and to calculate the direct and indirect effects of variables when the data violate the OLS assumptions and the variables are linked through an elliptical copula. We perform this study using a well-organized numerical scheme. Finally, a real data application is also presented to demonstrate the superiority of the copula approach.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
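As a hedged sketch of the direct/indirect decomposition that path analysis estimates, the snippet below fits the classical OLS baseline on a synthetic X → M → Y model (indirect effect = a·b, total = c′ + a·b). The copula-based estimator proposed by the article is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 1000
X = rng.normal(size=n)
M = 0.6 * X + rng.normal(scale=0.8, size=n)              # mediator equation: a = 0.6
Y = 0.4 * X + 0.5 * M + rng.normal(scale=0.8, size=n)    # outcome equation: c' = 0.4, b = 0.5

a = sm.OLS(M, sm.add_constant(X)).fit().params[1]
fit_y = sm.OLS(Y, sm.add_constant(np.column_stack([X, M]))).fit()
c_direct, b = fit_y.params[1], fit_y.params[2]

print(f"direct effect    c'  = {c_direct:.3f}")
print(f"indirect effect  a*b = {a * b:.3f}")
print(f"total effect         = {c_direct + a * b:.3f}")
```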
Procedia PDF Downloads 72
23977 Drying Shrinkage of Concrete: Scale Effect and Influence of Reinforcement
Authors: Qier Wu, Issam Takla, Thomas Rougelot, Nicolas Burlion
Abstract:
In the framework of the French underground disposal of intermediate-level radioactive waste, concrete is widely used as a construction material for containers and tunnels. Drying shrinkage is one of the most disadvantageous phenomena in concrete structures. Cracks generated by differential shrinkage can impair the mechanical behavior, increase the permeability of concrete and act as preferential paths for aggressive species, leading to an overall decrease in durability and serviceability. It is therefore of great interest to understand the drying shrinkage phenomenon in order to predict, and even control, the strains of concrete. The question is whether results obtained from laboratory samples are in accordance with measurements on a real structure. Another question concerns the influence of reinforcement on the drying shrinkage of concrete. As part of a global project with Andra (the French National Radioactive Waste Management Agency), the present study aims to experimentally investigate the scale effect as well as the influence of reinforcement on the development of drying shrinkage of two high performance concretes (based on CEM I and CEM V cements, according to European standards). Various sizes of samples are chosen, from ordinary laboratory specimens up to real-scale specimens: prismatic specimens with different volume-to-surface (V/S) ratios, thin slices (thickness of 2 mm), cylinders of different sizes (37 and 160 mm in diameter), hollow cylinders, cylindrical columns (height of 1000 mm) and square columns (320×320×1000 mm). The square columns have been manufactured with different reinforcement ratios and can be considered as mini-structures approximating the behavior of a real voussoir from the waste disposal facility. All the samples are kept, in a first stage, at 20°C and 50% relative humidity (the initial conditions in the tunnel) in a specific climatic chamber developed by the Laboratory of Mechanics of Lille. The mass evolution and the drying shrinkage are monitored regularly. The results obtained show that specimen size has a great impact on the water loss and drying shrinkage of concrete: specimens with a smaller V/S ratio and a smaller size exhibit greater drying shrinkage. The correlation between mass variation and drying shrinkage follows the same tendency for all specimens in spite of the size difference. However, the influence of the reinforcement ratio on drying shrinkage is not clear from the present results. The second stage of conservation (50°C and 30% relative humidity) could provide additional results on these influences.
Keywords: concrete, drying shrinkage, mass evolution, reinforcement, scale effect
Procedia PDF Downloads 183
23976 Effect of Mobile Drip and Linear Irrigation System on Sugar Beet Yield
Authors: Ismail Tas, Yusuf Ersoy Yildirim, Yavuz Fatih Fidantemiz, Aysegul Boyacioglu, Demet Uygan, Ozgur Ates, Erdinc Savasli, Oguz Onder, Murat Tugrul
Abstract:
The biggest inputs to agricultural production are irrigation water and energy. Although the amount varies with conditions, drip and sprinkler irrigation systems involve a significant energy expenditure compared to surface irrigation systems. However, this expenditure not only increases the user's control over the irrigation water but also provides water savings and an increase in water application efficiency. Thus, while irrigation water is used more effectively, production costs are also reduced. The Mobile Drip Irrigation System (MDIS) is a system built on new technologies, and it is thought to play an important role in increasing the rate at which plants utilize irrigation water and in reducing water losses, as well as in using irrigation water effectively. With the development of both linear-move and center-pivot machines, MDIS is currently considered the most effective method for irrigation. MDIS is potentially more advantageous than sprinkler irrigation systems in terms of reducing wind-induced water losses and reducing evaporation losses from the soil and plant surfaces. Another feature of MDIS is that the sprinkler heads on the machines (such as linear-move and center-pivot systems) can remain operational even when the drip lines are installed, which allows the user to apply both irrigation methods. In this study, the effect of MDIS and the linear sprinkler irrigation method on sugar beet yield at different irrigation water levels will be revealed.
Keywords: MDIS, linear sprinkler, sugar beet, irrigation efficiency
Procedia PDF Downloads 96
23975 Over Cracking in Furnace and Corrective Action by Computational Fluid Dynamics (CFD) Analysis
Authors: Mokhtari Karchegani Amir, Maboudi Samad, Azadi Reza, Dastanian Raoof
Abstract:
Marun's petrochemical cracking furnaces have a very comprehensive operating control system for combustion and related equipment, utilizing advanced instrument circuits. However, after several years of operation, numerous problems arose in the pyrolysis furnaces. A team of experts conducted an audit, revealing that the furnaces were over-designed, leading to excessive consumption of air and fuel. This issue was related to the burners' shutter settings, which had not been configured properly, and the operations department had responded by increasing the induced-draft fan speed and forcing the instrument switches to counteract the wind effect in the combustion chamber. The furnaces were analyzed using the Fluent and Gambit software. The findings indicated that this situation elevated the temperature of the convection section, causing uneven heat distribution inside the furnace. Consequently, this led to overheating in the convection section and excessive cracking within the coils in the radiation section. The increased convection temperature damaged convection-section components and resulted in equipment blockages downstream of the furnaces due to the increased production of coke and tar in the process. To address these issues, corrective actions were implemented: the excess air for the burners and combustion chambers was properly set, resulting in improved efficiency, reduced emissions of environmentally harmful gases, prevention of creep in the coils, decreased fuel consumption, and lower maintenance costs.
Keywords: furnace, coke, CFD analysis, over cracking
Procedia PDF Downloads 77
23974 Optimizing Quantum Machine Learning with Amplitude and Phase Encoding Techniques
Authors: Om Viroje
Abstract:
Quantum machine learning represents a frontier in computational technology, promising significant advancements in data processing capabilities. This study explores the significance of data encoding techniques, specifically amplitude and phase encoding, in this emerging field. By employing a comparative analysis methodology, the research evaluates how these encoding techniques affect the accuracy, efficiency, and noise resilience of quantum algorithms. Our findings reveal that amplitude encoding enhances algorithmic accuracy and noise tolerance, whereas phase encoding significantly boosts computational efficiency. These insights are crucial for developing robust quantum frameworks that can be effectively applied in real-world scenarios. In conclusion, optimizing encoding strategies is essential for advancing quantum machine learning, potentially transforming various industries through improved data processing and analysis.
Keywords: quantum machine learning, data encoding, amplitude encoding, phase encoding, noise resilience
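For concreteness, the minimal numpy sketch below shows the two encodings on a 4-element feature vector (so the resulting 2-qubit states are easy to inspect). It is a conceptual illustration independent of any particular QML framework and of the specific circuits used in the study.

```python
import numpy as np

x = np.array([0.2, 0.5, 0.1, 0.7])

# Amplitude encoding: the normalized data become the state amplitudes.
# n features -> log2(n) qubits; here 4 features -> 2 qubits.
amplitude_state = x / np.linalg.norm(x)
assert np.isclose(np.sum(np.abs(amplitude_state) ** 2), 1.0)

# Phase encoding: each feature is written into the phase of a uniform superposition,
# one feature per basis amplitude: |psi> = (1/sqrt(n)) * sum_j exp(i x_j) |j>.
n = len(x)
phase_state = np.exp(1j * x) / np.sqrt(n)
assert np.isclose(np.sum(np.abs(phase_state) ** 2), 1.0)

print("amplitude-encoded state:", np.round(amplitude_state, 3))
print("phase-encoded state    :", np.round(phase_state, 3))
```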
Procedia PDF Downloads 16
23973 Physicochemical Characterization of Coastal Aerosols over the Mediterranean: Comparison with Weather Research and Forecasting-Chem Simulations
Authors: Stephane Laussac, Jacques Piazzola, Gilles Tedeschi
Abstract:
Estimating the impact of atmospheric aerosols on climate evolution is an important scientific challenge. One major source of particles is the ocean, through the generation of sea-spray aerosols. In coastal areas, marine aerosols can affect air quality through their ability to interact chemically and physically with other aerosol species and gases. The integration of accurate sea-spray emission terms in modeling studies is therefore required. However, it has been found that sea-spray concentrations are not represented with the necessary accuracy in some situations, particularly at short fetch. In this study, the WRF-Chem model was implemented over a north-western Mediterranean coastal region. WRF-Chem is the Weather Research and Forecasting (WRF) model online-coupled with chemistry for the investigation of regional-scale air quality; it simulates the emission, transport, mixing, and chemical transformation of trace gases and aerosols simultaneously with the meteorology. One of the objectives was to test the ability of the WRF-Chem model to represent the fine details of the coastal geography and to provide accurate predictions of sea-spray evolution for different fetches, together with the anthropogenic aerosols. To assess the performance of the model, a comparison is proposed between the model predictions, using a local emission inventory, and the physicochemical analysis of aerosol concentrations measured for different wind directions on the island of Porquerolles, located 10 km south of the French Riviera.
Keywords: sea-spray aerosols, coastal areas, sea-spray concentrations, short fetch, WRF-Chem model
Procedia PDF Downloads 196
23972 Optimization of a Flexible Thermoelectric Generator for Energy Harvesting from Human Skin to Power Wearable Electronics
Authors: Dessalegn Abera Waktole, Boru Jia, Zhengxing Zuo, Wei Wang, Nianling Kuang
Abstract:
A flexible thermoelectric generator is one method for recycling waste heat. This research determines the optimum performance of a flexible thermoelectric generator with optimal geometric parameters and a detailed structural design. A numerical simulation and an experiment were carried out to develop an efficient, flexible thermoelectric generator for energy harvesting from human skin. Heteromorphic electrodes and a polyimide substrate with a copper printed circuit board were introduced into the structural design of the flexible thermoelectric generator. The heteromorphic electrode was used as a heat sink and as a component of the flexible thermoelectric generator to enhance the temperature difference across the thermoelectric legs. The N-type and P-type thermoelectric legs were made of bismuth selenium telluride (Bi1.7Te3.7Se0.3) and bismuth antimony telluride (Bi0.4Sb1.6Te3), respectively. The output power of the flexible thermoelectric generator was analyzed under different heat source temperatures and heat dissipation conditions. The simulation was conducted with the COMSOL Multiphysics 5.6 software and validated by experiment. A maximum power output of 232.064 μW was obtained for different wind speed conditions, an ambient temperature of 20℃, and a heat source temperature of 36℃ under various load resistance conditions ranging from 0.24 Ω to 0.91 Ω. According to this finding, heteromorphic electrodes have a significant impact on the performance of the device.
Keywords: flexible thermoelectric generator, optimization, performance, temperature gradient, waste heat recovery
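For orientation, the standard lumped thermoelectric-generator model below relates output power to the load resistance and leg temperature difference; it is a sketch only, not the authors' COMSOL model, and the symbols are generic assumptions.

```latex
% Standard lumped TEG model: Seebeck coefficient \alpha, internal resistance R_{int},
% load resistance R_L and leg temperature difference \Delta T.
\begin{equation}
  P_{out} = \frac{\left(\alpha\,\Delta T\right)^{2} R_L}{\left(R_{int} + R_L\right)^{2}} ,
  \qquad
  P_{out}^{\max} = \frac{\left(\alpha\,\Delta T\right)^{2}}{4 R_{int}}
  \;\; \text{at} \;\; R_L = R_{int} ,
\end{equation}
% which is why the load resistance is swept (here 0.24--0.91 \Omega) when
% reporting the maximum power point.
```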
Procedia PDF Downloads 167
23971 Reversible Information Hiding in Encrypted JPEG Bitstream by LSB Based on Inherent Algorithm
Authors: Vaibhav Barve
Abstract:
Reversible information hiding has drawn a lot of interest lately. Because it is reversible, the original digital data can be restored completely. It is a scheme in which secret information is stored in digital media such as images, video, or audio to avoid unauthorized access and for security reasons. Generally, a JPEG bitstream is used to store this key data: first, the JPEG bitstream is encrypted into a well-organized structure, and then the secret information is embedded into this encrypted region by slightly modifying the JPEG bitstream. The pixels suitable for data embedding are computed, and the key details are embedded accordingly. In our proposed framework, we use the RC4 algorithm to encrypt the JPEG bitstream. The encryption key is provided by the framework user and is also used at the time of decryption. We implement enhanced least-significant-bit (LSB) substitution steganography using a genetic algorithm. The number of bits that must be embedded in a given coefficient is adaptive, and by choosing appropriate parameters we can obtain high capacity while ensuring high security. A logistic map is used for shuffling the bits, and a genetic algorithm (GA) is used to find the right parameters for the logistic map. A data embedding key is used at the time of embedding. By using the correct image encryption key and data embedding key, the receiver can easily extract the embedded secret data and completely recover both the original image and the original secret information. When the embedding key is absent, the original image can still be recovered approximately, with sufficient quality, without access to the embedded data.
Keywords: data embedding, decryption, encryption, reversible data hiding, steganography
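As a hedged sketch of the logistic-map bit shuffling mentioned above, the snippet below sorts a chaotic sequence to obtain a key-dependent, invertible permutation of bit positions. The map parameters are illustrative assumptions (in the paper they are tuned by a GA), and the embedding itself is not shown.

```python
import numpy as np

def logistic_permutation(n_bits: int, r: float = 3.99, x0: float = 0.37) -> np.ndarray:
    x = np.empty(n_bits)
    x[0] = x0
    for i in range(1, n_bits):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])    # logistic map iteration
    return np.argsort(x)                           # chaotic ordering -> permutation

def shuffle_bits(bits: np.ndarray, perm: np.ndarray) -> np.ndarray:
    return bits[perm]

def unshuffle_bits(shuffled: np.ndarray, perm: np.ndarray) -> np.ndarray:
    out = np.empty_like(shuffled)
    out[perm] = shuffled                           # invert the permutation
    return out

secret = np.random.randint(0, 2, 64)               # 64 secret bits to embed
perm = logistic_permutation(len(secret))
scrambled = shuffle_bits(secret, perm)
assert np.array_equal(secret, unshuffle_bits(scrambled, perm))
```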
Procedia PDF Downloads 288
23970 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET
Authors: Tyler T. Procko, Steve Collins
Abstract:
New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now comparatively involve unnecessary steps which compromise system performance. This work posits that the established ORM (Object-Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and the back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: simplicity, speed and security. Simplicity is engendered by cutting out the "middleman" steps, effectively making API data access a whitebox, whereas traditional methods are blackbox. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code. Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
Keywords: API data access, database, JSON, .NET core, SQL server
Procedia PDF Downloads 66
23969 Blockchain for IoT Security and Privacy in Healthcare Sector
Authors: Umair Shafique, Hafiz Usman Zia, Fiaz Majeed, Samina Naz, Javeria Ahmed, Maleeha Zainab
Abstract:
The Internet of Things (IoT) has been a hot topic for the last couple of years. This innovative technology has shown promising progress in various areas, and the world has witnessed exponential growth in multiple application domains. Researchers are working to investigate its aptitudes and to get the best from it by harnessing its true potential. At the same time, however, IoT networks open up new vulnerabilities and physical threats to data integrity, privacy, and confidentiality. This is due to centralized control, a data-silo approach to handling information, and a lack of standardization in IoT networks. Blockchain is a new technology that involves creating secure distributed ledgers to store and communicate data. Some of its benefits include resiliency, integrity, anonymity, decentralization, and autonomous control. The potential for blockchain technology to provide the key to managing and controlling IoT has created a new wave of excitement around the idea of putting that data back into the hands of the end-users. In this manuscript, we propose a model that combines blockchain and IoT networks to address potential security and privacy issues in the healthcare domain. We then describe various application areas, challenges, and future directions in the healthcare sector where blockchain platforms merge with IoT networks.
Keywords: IoT, blockchain, cryptocurrency, healthcare, consensus, data
Procedia PDF Downloads 180
23968 Vision-Based Daily Routine Recognition for Healthcare with Transfer Learning
Authors: Bruce X. B. Yu, Yan Liu, Keith C. C. Chan
Abstract:
We propose to record the Activities of Daily Living (ADLs) of elderly people using a vision-based system so as to provide better assistive and personalization technologies. Current ADL-related research is based on data collected with the help of non-elderly subjects in laboratory environments, where the activities performed are predetermined for the sole purpose of data collection. To obtain more realistic datasets for the application, we recorded ADLs for the elderly in a real-world environment involving real elderly subjects. Motivated by the need to collect data for more effective research related to elderly care, we chose to collect data in the room of an elderly person. Specifically, we installed a Kinect, a vision-based sensor, on the ceiling to capture the activities that the elderly subject performs every morning. Based on the data, we identified 12 morning activities that the elderly person performs daily. To recognize these activities, we created the HARELCARE framework to investigate the effectiveness of existing Human Activity Recognition (HAR) algorithms and propose the use of a transfer learning algorithm for HAR. We compared performance in terms of accuracy, as well as training progress. Although the collected dataset is relatively small, the proposed algorithm has good potential to be applied to all daily routine activities for healthcare purposes, such as evidence-based diagnosis and treatment.
Keywords: daily activity recognition, healthcare, IoT sensors, transfer learning
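As a hedged sketch of the transfer-learning idea only, the snippet below reuses a network pre-trained on a large image dataset as a frozen feature extractor and trains a new head for the 12 morning activities. The backbone choice, weights tag (requires torchvision >= 0.13), and training details are illustrative assumptions, not the HARELCARE framework itself.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_ACTIVITIES = 12                                        # 12 daily morning activities

model = models.resnet18(weights="IMAGENET1K_V1")           # pre-trained feature extractor
for param in model.parameters():
    param.requires_grad = False                            # freeze the transferred layers
model.fc = nn.Linear(model.fc.in_features, NUM_ACTIVITIES) # new trainable classification head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One toy training step on a random batch (stand-in for Kinect-derived frames).
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_ACTIVITIES, (8,))
optimizer.zero_grad()
loss = criterion(model(frames), labels)
loss.backward()
optimizer.step()
print("toy batch loss:", loss.item())
```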
Procedia PDF Downloads 132
23967 Life Cycle Carbon Dioxide Emissions from the Construction Phase of the Highway Sector in China
Authors: Yuanyuan Liu, Yuanqing Wang, Di Li
Abstract:
Mitigating carbon dioxide (CO2) emissions from road construction activities is one potential pathway to deal with climate change, given the high use of materials, the machinery energy consumption, and the large quantity of vehicle and equipment fuel for transportation and on-site construction activities. Aiming to assess the environmental impact of road infrastructure construction activities and to identify hotspots of emission sources, this study developed a life-cycle CO2 emissions assessment framework covering the three stages of material production, to-site transportation and on-site construction, under the guidance of the LCA principles of ISO 14040. A streamlined inventory analysis of the sub-processes of each stage was then conducted, based on the budget files of highway projects in China. The calculation results were normalized into a functional unit expressed as tons per km per lane. A comparison between the emissions of each stage and sub-process was then made to identify the major contributors over the whole highway life cycle. In addition, the results were compared with results from other countries to understand how the CO2 emissions associated with Chinese road infrastructure rank worldwide. The results showed that the material production stage produces most of the CO2 emissions (more than 80%), with the production of cement and steel accounting for large quantities of carbon emissions. Life cycle CO2 emissions of the fuel and electric energy associated with to-site and on-site transportation vehicles and equipment are a minor component of total life cycle CO2 emissions from highway construction activities. Bridges and tunnels are the dominant carbon contributors compared to the road segments. The life cycle CO2 emissions of road segments in Chinese highway projects, about 1500 tons per km per lane, are slightly higher than estimates for highways in European countries and the USA. In particular, the life cycle CO2 emissions of road pavement in the majority of cities all over the world are about 500 tons per km per lane; however, there is an obvious difference between cities when the estimation of life cycle CO2 emissions of highway projects includes bridges and tunnels. The findings of the study offer decision makers a more comprehensive reference for understanding the contribution of road infrastructure to climate change, especially the contribution from road infrastructure construction activities in China. In addition, the identified hotspots of emission sources provide insights into how to reduce road carbon emissions for the development of sustainable transportation.
Keywords: carbon dioxide emissions, construction activities, highway, life cycle assessment
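As a hedged sketch of the material-production stage calculation (emissions = sum over materials of quantity × emission factor, normalized per km per lane), the snippet below uses placeholder quantities and factors; none of the values are the study's inventory data.

```python
material_quantities_t = {        # tonnes of material for a hypothetical 1 km, 4-lane segment
    "cement": 1200.0,
    "steel": 300.0,
    "asphalt": 900.0,
    "aggregate": 9000.0,
}
emission_factors_tco2_per_t = {  # tCO2 per tonne of material (illustrative values only)
    "cement": 0.85,
    "steel": 1.85,
    "asphalt": 0.05,
    "aggregate": 0.005,
}

segment_km, lanes = 1.0, 4
total_tco2 = sum(material_quantities_t[m] * emission_factors_tco2_per_t[m]
                 for m in material_quantities_t)
per_km_per_lane = total_tco2 / (segment_km * lanes)
print(f"material-production CO2: {total_tco2:.0f} t ({per_km_per_lane:.0f} t per km per lane)")
```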
Procedia PDF Downloads 269
23966 Design and Implementation of Security Middleware for Data Warehouse Signature Framework
Authors: Mayada Al Meghari
Abstract:
Recently, grid middleware has provided large-scale, integrated use of network resources, such as shared data and CPUs, to form a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using middleware in our DWS framework is to achieve high performance through parallel computing. The middleware is developed on the Alchemi.NET framework to increase security among the network nodes through an authentication and group-key distribution model. This model achieves key security and prevents intermediate attacks on the middleware. This paper presents the flow process structures of the middleware design. In addition, the paper describes the implementation of security for the DWS middleware, enhanced with the authentication and group-key distribution model. Finally, an analysis of other middleware approaches shows that the developed DWS middleware is the optimal solution, providing complete coverage of the security issues.
Keywords: middleware, parallel computing, data warehouse, security, group-key, high performance
Procedia PDF Downloads 119
23965 Sentiment Classification of Documents
Authors: Swarnadip Ghosh
Abstract:
Sentiment analysis is the process of detecting the contextual polarity of text; in other words, it determines whether a piece of writing is positive, negative or neutral. Sentiment analysis of documents holds great importance in today's world, when vast amounts of information are stored in databases and on the world wide web. An efficient algorithm to elicit such information would be beneficial for social, economic as well as medical purposes. In this project, we have developed an algorithm to classify a document as positive or negative. Using our algorithm, we obtained a feature set from the data and classified the documents based on this feature set. It is important to note that, in the classification, we have not used the independence assumption, which is relied upon by many procedures such as Naive Bayes; this makes the algorithm more general in scope. Moreover, because of the sparsity and high dimensionality of such data, we did not use an empirical distribution for estimation, but developed a method based on the degree of close clustering of the data points. We applied our algorithm to a movie review dataset obtained from IMDb and obtained satisfactory results.
Keywords: sentiment, Run's Test, cross validation, higher dimensional pmf estimation
Procedia PDF Downloads 402
23964 Corporate Governance and Bank Performance: A Study of Selected Deposit Money Banks in Nigeria
Authors: Ayodele Ajayi, John Ajayi
Abstract:
This paper investigates the effect of corporate governance with a view to determining the relationship between board size and bank performance. Data for the study were obtained from the audited financial statements of five sampled banks listed on the Nigerian Stock Exchange. A panel data technique was adopted, and the analysis was carried out using multiple regression and pooled ordinary least squares. Results from the study show that the larger the board size, the greater the profit, implying that corporate governance is positively correlated with bank performance.
Keywords: corporate governance, bank performance, board size, pooled data
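As a hedged sketch of the pooled OLS estimation described above, the snippet below runs it on a synthetic bank-year panel; the variable names (board_size, roa) and the data are assumptions, not the sampled banks' audited statements.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
banks, years = [f"bank_{i}" for i in range(1, 6)], list(range(2014, 2024))
panel = pd.DataFrame([(b, y) for b in banks for y in years], columns=["bank", "year"])
panel["board_size"] = rng.integers(6, 16, len(panel))
panel["roa"] = 0.2 * panel["board_size"] + rng.normal(0, 1.0, len(panel))  # performance proxy

X = sm.add_constant(panel[["board_size"]])
pooled = sm.OLS(panel["roa"], X).fit()      # pooled OLS ignores the bank/year structure
print(pooled.summary().tables[1])
```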
Procedia PDF Downloads 360
23963 Empowering a New Frontier in Heart Disease Detection: Unleashing Quantum Machine Learning
Authors: Sadia Nasrin Tisha, Mushfika Sharmin Rahman, Javier Orduz
Abstract:
Machine learning is applied in a variety of fields throughout the world, and the healthcare sector has benefited enormously from it. One of the most effective approaches for predicting human heart disease is to use machine learning to classify patient data and predict the outcome. However, with the rapid advancement of quantum technology, quantum computing has emerged as a potential game-changer for many applications. Quantum algorithms have the potential to execute substantially faster than their classical equivalents, which can lead to significant improvements in computational performance and efficiency. In this study, we applied quantum machine learning concepts to predict coronary heart disease from text data. We experimented three times, with three different features and three feature sets; the dataset consisted of 100 data points. We pursue a comparative analysis of the two approaches, highlighting the potential benefits of quantum machine learning for predicting heart disease.
Keywords: quantum machine learning, SVM, QSVM, matrix product state
Procedia PDF Downloads 94
23962 Blockchain's Feasibility in Military Data Networks
Authors: Brenden M. Shutt, Lubjana Beshaj, Paul L. Goethals, Ambrose Kam
Abstract:
Communication security is of particular interest for military data networks. A relatively novel approach to network security is blockchain, a cryptographically secured distributed ledger with a decentralized consensus mechanism for data transaction processing. Recent advances in blockchain technology have proposed new techniques for both data validation and trust management, as well as different frameworks for managing data flow. The purpose of this work is to test the feasibility of different blockchain architectures as applied to military command and control networks. Various architectures are tested through discrete-event simulation, and feasibility is determined based upon a blockchain design's ability to maintain long-term stable performance at industry standards of throughput, network latency, and security. This work proposes a consortium blockchain architecture with a computationally inexpensive consensus mechanism, one that leverages a Proof-of-Identity (PoI) concept and a reputation management mechanism.
Keywords: blockchain, consensus mechanism, discrete-event simulation, fog computing
Procedia PDF Downloads 138