Search results for: data source
27469 Study on Beta-Ray Detection System in Water Using a MCNP Simulation
Authors: Ki Hyun Park, Hye Min Park, Jeong Ho Kim, Chan Jong Park, Koan Sik Joo
Abstract:
In modern times, the use of radioactive substances is on the rise in areas like chemical weaponry, industrial usage, and power plants. Although there are various technologies available to detect and monitor radioactive substances in the air, technologies to detect underwater radioactive substances are scarce. In this study, a computer simulation of an underwater detection system measuring beta rays has been carried out with MCNP. CaF₂, YAP(Ce) and YAG(Ce) were used as scintillators in the simulation to detect beta rays. The sources used in the simulation were Sr-90 and Y-90, both of which emit only pure beta rays. The distance between the source and the detector was varied from 1 mm to 10 mm in 1 mm steps. The results indicated that Sr-90 could not be measured beyond 1 mm because of its low emission energy, while Y-90 could be measured at up to 10 mm underwater. In addition, the detector designed with CaF₂ had the highest efficiency among the three scintillators used in the simulation. Since the detectable range and the detection efficiency of a given design could be verified through MCNP simulation, such results are expected to reduce the time and cost of building an actual beta-ray detector and evaluating its performance, thereby contributing to research and development.
Keywords: Beta-ray, CaF₂, detector, MCNP simulation, scintillator
Procedia PDF Downloads 510
27468 A Framework for Blockchain Vulnerability Detection and Cybersecurity Education
Authors: Hongmei Chi
Abstract:
The Blockchain has become a necessity for many societal industries and for everyday life, including cryptocurrency technology, supply chains, health care, public safety, education, etc. Therefore, training future blockchain developers to recognize blockchain programming vulnerabilities, and training I.T. students in cybersecurity, is in high demand. In this work, we propose a framework including learning modules and hands-on labs to guide future I.T. professionals towards developing secure blockchain programming habits and mitigating source code vulnerabilities at the early stages of the software development lifecycle, following the concept of the Secure Software Development Life Cycle (SSDLC). In this research, our goal is to make blockchain programmers and I.T. students aware of the vulnerabilities of blockchains. In summary, we develop a framework that will (1) improve students' skills and awareness of blockchain source code vulnerabilities, detection tools, and mitigation techniques, (2) integrate concepts of blockchain vulnerabilities for IT students, and (3) improve future IT workers' ability to master the concepts of blockchain attacks.
Keywords: software vulnerability detection, hands-on lab, static analysis tools, vulnerabilities, blockchain, active learning
Procedia PDF Downloads 99
27467 Compact Dual-Band Bandpass Filter Based on Quarter Wavelength Stepped Impedance Resonators
Authors: Yu-Fu Chen, Zih-Jyun Dai, Chen-Te Chiu, Shiue-Chen Chiou, Yung-Wei Chen, Yu-Ming Lin, Kuan-Yu Chen, Hung-Wei Wu, Hsin-Ying Lee, Yan-Kuin Su, Shoou-Jinn Chang
Abstract:
This paper presents a compact dual-band bandpass filter that uses quarter wavelength stepped impedance resonators (SIRs) to achieve a compact circuit size and good dual-band performance simultaneously. The filter is designed at 2.4 / 3.5 GHz and constructed from two pairs of quarter wavelength SIRs and source-load lines. By properly tuning the impedance ratio, length ratio and via-hole radius of the SIRs, dual-passband performance can be easily obtained. To improve the passband selectivity, source-load lines are used to increase the coupling energy between the resonators. The filter features a simple configuration, an effective design method and a small circuit size. The measured results are in good agreement with the simulation results.
Keywords: dual-band, bandpass filter, stepped impedance resonators, SIR
Procedia PDF Downloads 517
27466 XANES Studies on the Oxidation States of Copper Ion in Silicate Glass
Authors: R. Buntem, K. Samkongngam
Abstract:
The silicate glass was prepared using rice husk as the source of silica. The base composition of the glass samples consists of SiO2 (from rice husk ash), Na2CO3, K2CO3, ZnO, H3BO3, CaO, Al2O3 or Al, and CuO. Aluminum is used in place of Al2O3 in order to reduce Cu2+ to Cu+. The red color of Cu2O in the glass matrix was observed when Al was added into the glass mixture. The expansion coefficients of the copper-doped glass are in the range of 1.2×10⁻⁵–1.4×10⁻⁵ °C⁻¹, which is common for silicate glass. The fingerprints of the bond vibrations were studied using IR spectroscopy, while the oxidation state and the coordination information of the copper ion in the glass matrix were investigated using X-ray absorption spectroscopy. From the data, Cu+ and Cu2+ exist in the glass matrix. The red particles of Cu2O can be formed in the glass matrix when enough aluminum is added.
Keywords: copper in glass, coordination information, silicate glass, XANES spectrum
Procedia PDF Downloads 263
27465 Assessment of Tidal Influence in Spatial and Temporal Variations of Water Quality in Masan Bay, Korea
Abstract:
Slack-tide sampling was carried out at seven stations at high and low tide over a tidal cycle, in summer (July, August, September) and fall (October) of 2016, to determine the differences in water quality according to the tides in Masan Bay. The data were analyzed by Pearson correlation and factor analysis. The mixing state of all the water quality components investigated is well explained by their correlation with salinity (SAL). Turbidity (TURB), dissolved silica (DSi), nitrite and nitrate nitrogen (NNN) and total nitrogen (TN), which find their way into the bay from the streams and have no internal source and sink reactions, showed a strong negative correlation with SAL at low tide, indicating conservative mixing. On the contrary, in summer and fall, dissolved oxygen (DO), hydrogen sulfide (H2S) and chemical oxygen demand with KMnO4 (CODMn) of the surface and bottom water, which were sensitive to internal source and sink reactions, showed no significant correlation with SAL at either high or low tide. The remaining water quality parameters showed a conservative or non-conservative mixing pattern depending on the mixing characteristics at high and low tide, determined by the functional relationship between changes of the flushing time and changes of the characteristics of the water quality components of the end-members in the bay. Factor analysis performed on the data sets of concentration differences between high and low tide helped in identifying their principal latent variables. The concentration differences varied spatially and temporally. Principal factor (PF) score plots for each monitoring situation showed that the variations were highly associated with the monitoring sites.
At sampling station 1 (ST1), temperature (TEMP), SAL, DSi, TURB, NNN and TN of the surface water in summer; TEMP, SAL, DSi, DO, TURB, NNN, TN, reactive soluble phosphorus (RSP) and total phosphorus (TP) of the bottom water in summer; TEMP, pH, SAL, DSi, DO, TURB, CODMn, particulate organic carbon (POC), ammonia nitrogen (AMN), NNN, TN and fecal coliform (FC) of the surface water in fall; and TEMP, pH, SAL, DSi, H2S, TURB, CODMn, AMN, NNN and TN of the bottom water in fall commonly showed up as the most significant parameters, with large concentration differences between high and low tide. At the other stations, the significant parameters differed according to the spatial and temporal variations of the mixing pattern in the bay. In fact, no estuary always maintains steady-state flow conditions. The mixing regime of an estuary may change at any time from linear to non-linear due to the change of flushing time resulting from the combination of hydrogeometric properties, inflow of freshwater and tidal action. Furthermore, the change of end-member conditions due to internal sinks and sources makes the occurrence of concentration differences inevitable. Therefore, when investigating the water quality of an estuary, it is necessary to adopt a sampling method that considers the tide in order to obtain average water quality data.
Keywords: conservative mixing, end-member, factor analysis, flushing time, high and low tide, latent variables, non-conservative mixing, slack-tide sampling, spatial and temporal variations, surface and bottom water
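The conservative-mixing check described above, correlating a component against salinity, can be sketched in a few lines. The sample values below are hypothetical toy numbers, not the Masan Bay measurements:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical low-tide samples: salinity (psu) and total nitrogen (mg/L).
# A conservatively mixed component dilutes as salinity rises, so its
# correlation with salinity should be strongly negative.
salinity = [5.0, 10.0, 15.0, 20.0, 25.0, 30.0]
total_n  = [4.8,  4.1,  3.2,  2.4,  1.6,  0.9]

r = pearson_r(salinity, total_n)
print(f"r(SAL, TN) = {r:.3f}")
if r < -0.8:
    print("strong negative correlation -> consistent with conservative mixing")
```

A component dominated by internal sources and sinks (e.g. DO in summer) would instead show no significant correlation with salinity under the same test.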
Procedia PDF Downloads 130
27464 An Alternative Credit Scoring System in China's Consumer Lending Market: A System Based on Digital Footprint Data
Authors: Minjuan Sun
Abstract:
Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, among which the growth rate of non-bank lending has surpassed that of bank lending thanks to developments in financial technology. On the other hand, China does not have a universal credit scoring and registration system that can guide lenders in credit evaluation and risk control; for example, an individual's bank credit records are not available for online lenders to see, and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess a borrower's creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumer loan market. Two different types of digital footprint data are matched with banks' loan default records. Each separately captures distinct dimensions of a person's characteristics, such as shopping patterns and certain aspects of personality or inferred demographics revealed by social media features like profile image and nickname. We find that both datasets can generate acceptable to excellent prediction results, and that the different types of data tend to complement each other for better performance.
Typically, the traditional types of data that banks normally use, like income, occupation, and credit history, update over longer cycles, so they cannot reflect more immediate changes, such as financial status changes caused by a business crisis; whereas digital footprints can update daily, weekly, or monthly, and are thus capable of providing a more comprehensive profile of the borrower's credit capabilities and risks. From the empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment because of their near-universal data coverage, and because they can by and large resolve the "thin-file" issue, since digital footprints come in much larger volume and at higher frequency.
Keywords: credit score, digital footprint, Fintech, machine learning
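A minimal sketch of the canonical credit default prediction step: a logistic-regression scorer fit by plain gradient descent. The feature names and toy data below are invented for illustration; the actual study matches real digital-footprint data against bank loan default records and compares several machine learning methods:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=3000):
    """Fit a logistic-regression default scorer by batch gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def score(w, b, xi):
    """Predicted default probability for one borrower's feature vector."""
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features per borrower: [late-night shopping ratio,
# scaled monthly order count]; label 1 = defaulted. Purely synthetic.
X = [[0.9, 0.1], [0.8, 0.2], [0.7, 0.1], [0.2, 0.8], [0.1, 0.9], [0.2, 0.7]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
print("P(default | heavy late-night use) =", round(score(w, b, [0.85, 0.15]), 2))
print("P(default | light use)            =", round(score(w, b, [0.15, 0.85]), 2))
```

In practice the comparative analysis would swap this scorer for tree ensembles or neural networks and evaluate them on held-out default records.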
Procedia PDF Downloads 162
27463 Simultaneous Saccharification and Co-Fermentation of Paddy Straw and Fruit Wastes into Ethanol Production
Authors: Kamla Malik
Abstract:
For ethanol production from paddy straw, pretreatment was first done using sodium hydroxide solution (2.0%) at 15 psi for 1 hr. The maximum lignin removal was achieved with a 0.5 mm mesh size of paddy straw. After pretreatment, the straw contained 72.4% cellulose, 15.9% hemicelluloses and 2.0% lignin. Paddy straw hydrolysate (PSH) with fruit wastes (5%), such as sweet lime, apple, sapota, grapes, kinnow, banana, papaya, mango, and watermelon, was subjected to simultaneous saccharification and co-fermentation (SSCF) for 72 hrs by a co-culture of Saccharomyces cerevisiae HAU-1 and Candida sp. with 0.3% urea as a cheap nitrogen source. Fermentation was carried out at 35°C, and the ethanol yield was determined at 24-hour intervals. The maximum ethanol was produced within 72 hrs of fermentation in PSH + sapota peels (3.9% v/v), followed by PSH + kinnow peels (3.6%) and PSH + papaya peel extract (3.1%). In the case of PSH + banana peel and mango peel extract, the ethanol produced was 2.8% and 2.2% (v/v), respectively. The results of this study suggest that fruit wastes containing fermentable sugar should not be discarded into our environment, but should be supplemented into paddy straw and converted to useful products like bio-ethanol that can serve as an alternative energy source.
Keywords: ethanol, fermentation, fruit wastes, paddy straw
Procedia PDF Downloads 390
27462 Survival Pattern of Under-Five Mortality in High Focus States in India
Authors: Rahul Kumar
Abstract:
Background: The Under-Five Mortality Rate (U5MR) of a nation is a widely accepted and long-standing indicator of the well-being of its children. It measures the probability of dying before the age of five (expressed per 1000 live births). The U5MR is an appropriate indicator of the cumulative exposure to the risk of death during the first five years of life, and an accepted global indicator of the health and socioeconomic status of a given population. It is also useful for assessing the impact of various intervention programmes aimed at improving child survival. Under-five mortality trends constitute a leading indicator of the level of child health and overall development in countries. Objectives: The first aim of our research is to study the level, trends, and pattern of under-five mortality using different sources of data. The second objective is to examine the survival pattern of under-five mortality by different background characteristics. Data Source and Methodology: SRS and NFHS data have been used for observing the level and trend of the under-five mortality rate. The Kaplan-Meier estimate has been used to understand the survival pattern of under-five mortality. Result: We find that almost all the states made some progress in reducing U5MR in recent decades. During 1992-93, the highest U5MR (per thousand live births) was observed in Assam (142), followed by UP (141), Odisha (131), MP (130), and Bihar (127.5), while the lowest U5MR (per thousand live births) was observed in Rajasthan (102). Currently, the highest U5MR (per thousand live births) is observed in UP (78.1), followed by MP (64.9) and Chhattisgarh (63.7), which are far above the national level (50). Among these states, Uttarakhand (46.7) had the lowest U5MR (per thousand live births), followed by Odisha (48.6). The U5MR (per thousand live births) of the combined high focus states is 63.7, which is far above the national level (50). We identified that the survival probability of under-five children born to adolescent mothers is lower than that of children born to mothers of other age groups, and that during the neonatal period male mortality usually exceeds female mortality, but this differential is reversed in the post-neonatal period.
As their age increases and approaches five years, we identified that the survival probability of both sexes decreases, but the female survival probability decreases more than the male. The survival probability of poorer children is the lowest. Children using an improved toilet facility have a higher survival probability throughout the five years than those using an unimproved one. The survival probability of children under five whose mothers received full ANC is greater than that of children under five whose mothers received no ANC. Conclusions: Improvement of maternal education is an urgent need to improve health-seeking behavior and thus the health of children. Awareness of reproductive health and environmental sanitation should be strengthened.
Keywords: under-five mortality, survival pattern, ANC, trend
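The Kaplan-Meier estimate used above can be sketched directly. The follow-up times below are invented toy data, not the SRS/NFHS records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times  : time to death or censoring (e.g. age in months, up to 60)
    events : 1 if the child died at that time, 0 if censored
    Returns a list of (time, survival probability) steps at death times.
    """
    data = sorted(zip(times, events))
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_at_t = sum(1 for tt, _ in data if tt >= t)  # number still at risk
        if deaths:
            surv *= 1.0 - deaths / n_at_t
            curve.append((t, surv))
        # skip all observations tied at time t
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Hypothetical follow-up of 8 children (ages in months; event 0 = censored).
times  = [2, 5, 5, 12, 24, 36, 48, 60]
events = [1, 1, 0, 1, 0, 1, 0, 0]
for t, s in kaplan_meier(times, events):
    print(f"S({t:2d} months) = {s:.3f}")
```

Comparing such curves across background characteristics (mother's age group, wealth, sanitation, ANC status) is what produces the survival differentials reported above.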
Procedia PDF Downloads 133
27461 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems
Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan
Abstract:
Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses in parallel with its on-demand requests. The important data (i.e. the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach integrated with data mining techniques reduces the application execution elapsed time by at least 22% across a variety of traces.
Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine
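A toy stand-in for the classify-and-migrate loop, using a simple access-frequency ranking in place of the paper's data mining models (RNN/SVM); the trace, window size, and tier capacity are all assumed values for illustration:

```python
from collections import Counter

def pick_migration_set(trace, window, k):
    """Rank blocks by access frequency in the most recent window and
    keep the top-k on the fast (SSD) tier. A real classifier would
    predict near-future accesses instead of counting past ones."""
    recent = trace[-window:]
    return [block for block, _ in Counter(recent).most_common(k)]

def serve(trace, window, k):
    """Replay a block-access trace and count fast-tier hits."""
    ssd = set()
    hits = 0
    for i, block in enumerate(trace):
        if block in ssd:
            hits += 1
        # re-plan the fast tier after every access (toy policy)
        ssd = set(pick_migration_set(trace[: i + 1], window, k))
    return hits

# Hypothetical trace: blocks 1 and 2 are hot, the rest are cold.
trace = [1, 2, 1, 3, 1, 2, 4, 1, 2, 1, 5, 2, 1, 2]
hits = serve(trace, window=8, k=2)
print(f"fast-tier hits: {hits}/{len(trace)}")
```

Even this naive policy captures most accesses once the hot set stabilizes, which is the effect the prototype's predictive migration exploits.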
Procedia PDF Downloads 308
27460 A Stochastic Model to Predict Earthquake Ground Motion Duration Recorded in Soft Soils Based on Nonlinear Regression
Authors: Issam Aouari, Abdelmalek Abdelhamid
Abstract:
For seismologists, the characterization of seismic demand should include both the amplitude and the duration of strong shaking. The duration of ground shaking is one of the key parameters in the earthquake-resistant design of structures. This paper proposes a nonlinear statistical model to estimate earthquake ground motion duration in soft soils using multiple seismicity indicators. Three definitions of ground motion duration proposed in the literature have been applied. Through a comparative study, we select the most significant definition to use for predicting the duration. A stochastic model is presented for the McCann and Shah method using nonlinear regression analysis based on a data set of moment magnitude, source-to-site distance and site conditions. The data set is taken from the PEER strong motion databank and contains shallow earthquakes from different regions of the world: America, Turkey, London, China, Italy, Chile, Mexico, etc. Main emphasis is placed on soft site conditions. The predictive relationship has been developed based on 600 records and three input indicators. The results have been compared with other published models. It has been found that the proposed model can predict earthquake ground motion duration in soft soils for different regions and site conditions.
Keywords: duration, earthquake, prediction, regression, soft soil
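A minimal sketch of the regression idea, reduced to a single magnitude predictor with invented magnitude-duration pairs (the actual model is nonlinear in three indicators, including source-to-site distance and site conditions):

```python
import math

def fit_log_linear(mags, durations):
    """Closed-form least-squares fit of log10(duration) = a + b * magnitude."""
    ys = [math.log10(d) for d in durations]
    n = len(mags)
    mx = sum(mags) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(mags, ys)) / \
        sum((x - mx) ** 2 for x in mags)
    a = my - b * mx
    return a, b

def predict(a, b, mag):
    """Predicted duration in seconds for a given moment magnitude."""
    return 10.0 ** (a + b * mag)

# Hypothetical (magnitude, duration-in-seconds) pairs, invented for
# illustration; they are not the PEER records used in the paper.
mags = [5.0, 5.5, 6.0, 6.5, 7.0, 7.5]
durs = [4.0, 7.0, 13.0, 22.0, 40.0, 70.0]
a, b = fit_log_linear(mags, durs)
print(f"log10(D) = {a:.2f} + {b:.2f} * M")
print(f"predicted duration at M6.8: {predict(a, b, 6.8):.1f} s")
```

Extending this to the full model means adding distance and site-condition terms and fitting them with an iterative nonlinear least-squares routine rather than the closed form above.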
Procedia PDF Downloads 153
27459 Discussion on Big Data and One of Its Early Training Application
Authors: Fulya Gokalp Yavuz, Mark Daniel Ward
Abstract:
This study focuses on a contemporary and inevitable topic of Data Science and an exemplary application of it for early career building: Big Data and the Living Learning Community (LLC). Academia and industry share a common understanding of the importance of Big Data; however, both are at risk of missing out on training in this interdisciplinary area. Some traditional teaching doctrines are far from effective for Data Science. Practitioners need intuition and real-life examples of how to apply new methods to data terabytes in size. We outline the scope of Data Science training and exemplify its early-stage application with the LLC, a National Science Foundation (NSF) funded project under the supervision of Prof. Ward since 2014. Essentially, we aim to give professors, researchers and practitioners some intuition for combining data science tools into comprehensive real-life examples, guided by mentees' feedback. By discussing mentoring methods and the computational challenges of Big Data, we intend to underline its potential with some further realizations.
Keywords: Big Data, computation, mentoring, training
Procedia PDF Downloads 362
27458 On Transferring of Transient Signals along Hollow Waveguide
Authors: E. Eroglu, S. Semsit, E. Sener, U.S. Sener
Abstract:
In electromagnetics, there are three canonical boundary value problems with given initial conditions for the electromagnetic field sought, namely the cavity problem, the waveguide problem, and the external problem. The cavity problem and the waveguide problem have been rigorously studied, and new results arose in original works over the past decades. Based on studies of an analytical time-domain method, the Evolutionary Approach to Electromagnetics (EAE), the electromagnetic field strength vectors produced by a time-dependent source function are sought. The fields are taken to lie in the L2 Hilbert space. The source function that performs the signal transfer, its energy and its surplus of energy have been demonstrated with full clarity. The depth of the method and the ease of its application motivated the gathering of the obtained results. The main discussion concerns a perfect electric conductor and a hollow waveguide. Although well-studied time-domain mode problems are mentioned, specifically the modes which have a hollow (i.e., medium-free) cross-section domain are considered.
Keywords: evolutionary approach to electromagnetics, time-domain waveguide mode, Neumann problem, Dirichlet boundary value problem, Klein-Gordon
Procedia PDF Downloads 329
27457 Evaluation of the Internal Quality for Pineapple Based on the Spectroscopy Approach and Neural Network
Authors: Nonlapun Meenil, Pisitpong Intarapong, Thitima Wongsheree, Pranchalee Samanpiboon
Abstract:
In Thailand, once pineapples are harvested, they must be classified into two classes based on their sweetness: sweet and unsweet. This paper has studied and developed the assessment of the internal quality of pineapples using a low-cost compact spectroscopy sensor, based on the spectroscopy approach and a Neural Network (NN). During the experiments, Batavia pineapples were utilized, generating 100 samples. The extracted pineapple juice of each sample was used to determine the Soluble Solid Content (SSC), labeling the samples into sweet and unsweet classes. In terms of experimental equipment, the sensor cover was specifically designed to install the sensor and light source to read the reflectance at a five mm depth from the pineapple flesh. Using the spectroscopy sensor, data on visible and near-infrared reflectance (Vis-NIR) were collected. The NN was used to classify the pineapple classes. Before the classification step, the preprocessing methods of class balancing, data shuffling, and standardization were applied. The 510 nm and 900 nm reflectance values of the middle parts of the pineapples were used as features for the NN. With a Sequential model and the ReLU activation function, 100% accuracy on the training set and 76.67% accuracy on the test set were achieved. Based on the above, the low-cost compact spectroscopy sensor achieved favorable results in classifying the sweetness of the two classes of pineapples.
Keywords: neural network, pineapple, soluble solid content, spectroscopy
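The preprocessing and inference pipeline can be sketched as follows. The network weights and reflectance values below are invented placeholders, not the trained Sequential model from the study:

```python
import math

def standardize(values):
    """Zero-mean, unit-variance scaling applied before feeding the network."""
    m = sum(values) / len(values)
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / len(values))
    return [(v - m) / sd for v in values]

def relu(x):
    return max(0.0, x)

def forward(features, w_hidden, b_hidden, w_out, b_out):
    """One hidden ReLU layer, sigmoid output: returns P(sweet)."""
    hidden = [relu(b + sum(w * f for w, f in zip(ws, features)))
              for ws, b in zip(w_hidden, b_hidden)]
    z = b_out + sum(w * h for w, h in zip(w_out, hidden))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights (training would supply real ones); the two inputs
# are the standardized 510 nm and 900 nm reflectance values.
w_hidden = [[1.2, -0.8], [-0.5, 1.0]]
b_hidden = [0.1, 0.0]
w_out = [1.5, -1.0]
b_out = -0.2
refl_510, refl_900 = 0.8, -0.3   # already standardized, toy values
p = forward([refl_510, refl_900], w_hidden, b_hidden, w_out, b_out)
print(f"P(sweet) = {p:.3f}")
```

The class-balancing and shuffling steps mentioned above happen before training; only the standardization step is reused at inference time, with the training set's mean and variance.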
Procedia PDF Downloads 75
27456 The Analysis of Thermal Conductivity in Porcine Meat Due to Electricity by Finite Element Method
Authors: Orose Rugchati, Sarawut Wattanawongpitak
Abstract:
This research analyzed the thermal conductivity and heat transfer in porcine meat due to the electric current flowing between parallel electrode plates. Hot-boned pork samples were prepared as 2 × 1 × 1 cm cubes. The finite element method with the ANSYS Workbench program was applied to simulate this heat transfer problem. In the thermal simulation, the input thermoelectric energy was calculated from the measured current flowing through the pork and the input voltage from the DC voltage source. Heat transfer in pork was compared for two voltage sources: a DC voltage of 30 volts and a DC pulsed voltage of 60 volts (pulse width 50 milliseconds and 50% duty cycle). The results show that the thermal conductivity tends to become steady at 40°C and 60°C, at around 1.39 W/m°C and 2.65 W/m°C for the 30-volt DC source and the 60-volt DC pulsed source, respectively. When the temperature increased to 50°C at 5 minutes, the color of the porcine meat at the exposure point began to fade. This technique could be used for predicting thermal conductivity from some of the meat's characteristics.
Keywords: thermal conductivity, porcine meat, electricity, finite element method
Procedia PDF Downloads 140
27455 Towards a Secure Storage in Cloud Computing
Authors: Mohamed Elkholy, Ahmed Elfatatry
Abstract:
Cloud computing has emerged as a flexible computing paradigm that has reshaped the Information Technology map. However, cloud computing brings a number of security challenges as a result of the physical distribution of computational resources and the limited control that users have over the physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification that takes place to his data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if the data are stored on untrusted storage. The mechanism has been evaluated against different types of attacks and proved its efficiency in protecting cloud data storage from malicious attacks.
Keywords: access control, data integrity, data confidentiality, Kerberos authentication, cloud security
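One common way to realize an owner-verifiable integrity check of this kind is a keyed hash (HMAC). This is a generic sketch, not the paper's exact mechanism, which additionally integrates with extended Kerberos authentication:

```python
import hmac
import hashlib

def tag_data(key: bytes, data: bytes) -> str:
    """Integrity tag the owner computes before uploading to cloud storage."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify_data(key: bytes, data: bytes, tag: str) -> bool:
    """Owner-side check on retrieval: any modification by the (untrusted)
    storage changes the recomputed tag, so tampering is detected."""
    return hmac.compare_digest(tag_data(key, data), tag)

key = b"owner-secret-key"      # kept by the owner, never given to the provider
original = b"customer records v1"
tag = tag_data(key, original)  # stored alongside (or apart from) the data

print(verify_data(key, original, tag))                # unmodified data
print(verify_data(key, b"customer records v2", tag))  # tampered data
```

Because the provider never holds the key, it cannot forge a valid tag for modified data; confidentiality would additionally require encrypting the payload before upload.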
Procedia PDF Downloads 335
27454 Assessment of Drinking Water Contamination from the Water Source to the Consumer in Palapye Region, Botswana
Authors: Tshegofatso Galekgathege
Abstract:
Poor water quality is of great concern to human health, as it can cause disease outbreaks. A standard practice today in developed countries is that people should be provided with safe, reliable drinking water, as safe drinking water is recognized as a basic human right and a cost-effective measure for reducing disease. Over 1.1 billion people worldwide lack access to a safe water supply, and as a result the majority are forced to use polluted surface water or groundwater. It is widely accepted that our water supply systems are susceptible to intentional or accidental contamination. Water quality degradation may occur anywhere along the path that water takes from the source to the consumer. Chlorine is believed to be an effective tool for disinfecting water, but its concentration may decrease with time due to consumption by chemical reactions. This shows that we are at risk of being infected by waterborne diseases if chlorine in water falls below the required level of 0.2-1 mg/liter that should be maintained, and some contaminants enter the water distribution system. It is believed that the lack of adequate sanitation also contributes to the contamination of water globally. This study therefore assesses drinking water contamination from the source to the consumer by identifying the points vulnerable to contamination in the study area. To identify these points, water was sampled monthly from boreholes, the water treatment plant, the water distribution system (WDS), service reservoirs and consumer taps in all twenty (20) villages of the Palapye region. The sampled water was then taken to the laboratory for testing and analysis of microbiological and chemical parameters. The water quality analyses were then compared with the Botswana drinking water quality standard (BOS 32:2009) to see if they comply.
The major sources of water contamination identified during site visits were livestock, which were found drinking stagnant water from leaking pipes in 90 percent of the villages. The soil structure around these areas was negatively affected by livestock movement, as was the vegetation. In conclusion, the microbiological parameters of water in the study area do not comply with drinking water standards; some microbiological parameters indicated that livestock affect not only land degradation but also water quality. Chlorine has been applied to the water for some years, but it is not effective enough, so preventative measures have to be developed to keep contaminants from reaching the water. Remember: prevention is better than cure.
Keywords: land degradation, leaking systems, livestock, water contamination
Procedia PDF Downloads 352
27453 Thermal Resistance of Special Garments Exposed to a Radiant Heat
Authors: Jana Pichova, Lubos Hes, Vladimir Bajzik
Abstract:
Protective clothing is designed to keep the wearer safe in hazardous conditions or to enable short working operations to be performed without injury or discomfort. Firefighters and related workers are exposed to abnormal heat, which can be of the conductive, convective or radiant type. Their garments are designed to resist these conditions and prevent burn injuries or death. However, the thermal comfort of firefighters exposed to a high heat source has not been studied yet. Thermal resistance is the parameter most representative of thermal comfort. In this study, a new method of testing the thermal resistance of special clothing exposed to a high-radiation heat source was designed. This method simulates a human body wearing a single- or multi-layered garment exposed to radiant heat. The setup of this method enables the measurement of radiative heat flow over time without the effect of convection. The new testing method is verified on a chosen group of textiles for firefighters.
Keywords: protective clothing, radiative heat, thermal comfort of firefighters, thermal resistance of special garments
Procedia PDF Downloads 379
27452 Large Eddy Simulation of Particle Clouds Using Open-Source CFD
Authors: Ruo-Qian Wang
Abstract:
Open-source CFD has become increasingly popular and promising. Recent progress in multiphase flow enables new CFD applications, providing an economic and flexible research tool for complex flow problems. We introduce our numerical study, which uses four-way coupled Euler-Lagrangian Large-Eddy Simulations to resolve particle cloud dynamics with OpenFOAM and CFDEM: the filtered Navier-Stokes equations are numerically solved for the fluid phase motion, the solid phase motion is addressed by Lagrangian tracking of every single particle, and total momentum is conserved by fluid-solid inter-phase coupling. A grid convergence test was performed, which shows that the current mesh resolution is appropriate. We then validated the code by comparing the numerical results with experiments in terms of particle cloud settlement and growth; the good agreement obtained shows the reliability of the present numerical schemes. The time and height at phase separation were defined and analyzed for a variety of initial release conditions, and empirical formulas were drawn to fit the results.
Keywords: four-way coupling, dredging, land reclamation, multiphase flows, oil spill
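As a back-of-envelope reference scale for particle cloud settlement (not part of the LES solver itself), the Stokes terminal velocity of a small grain can be computed directly; the grain size and density below are assumed values:

```python
def stokes_settling_velocity(d, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity of a small sphere in the Stokes regime:
        w_s = (rho_p - rho_f) * g * d^2 / (18 * mu)
    with d in m, densities in kg/m^3, mu in Pa*s.
    Only valid for particle Reynolds numbers well below 1."""
    return (rho_p - rho_f) * g * d ** 2 / (18.0 * mu)

# Hypothetical 50-micron sediment grain (quartz-like density) in water.
d = 50e-6
w_s = stokes_settling_velocity(d, rho_p=2650.0)
re_p = 1000.0 * w_s * d / 1.0e-3   # check the Stokes-regime assumption
print(f"w_s = {w_s * 1000:.2f} mm/s, Re_p = {re_p:.2f}")
```

In the four-way coupled simulations this single-particle scale is modified by hindered settling and inter-phase momentum exchange, which is precisely what the Lagrangian tracking resolves.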
Procedia PDF Downloads 429
27451 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network
Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan
Abstract:
The most aggressive form of brain tumor is called glioma, a tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model. In addition, we used the BRATS 2015 MRI dataset to evaluate our proposed model, and the SimpleITK open-source library to analyze the images. Moreover, we extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation. We extracted 2D patches instead of 3D ones because of the lower dimensional information present in 2D, which helps reduce computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for the complete, 0.76 for the core, and 0.77 for the enhanced tumor regions. These results are comparable with already implemented 2D CNN architectures.
Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG
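The Dice Similarity Coefficient used for evaluation can be sketched on toy binary masks:

```python
def dice_coefficient(pred, truth):
    """Dice Similarity Coefficient between two binary masks
    (flat lists of 0/1): DSC = 2|A ∩ B| / (|A| + |B|)."""
    inter = sum(p * t for p, t in zip(pred, truth))
    size = sum(pred) + sum(truth)
    return 2.0 * inter / size if size else 1.0

# Toy 4x4 masks flattened to 1D (1 = tumor voxel); the real evaluation
# runs over full BRATS segmentation volumes.
truth = [0, 1, 1, 0,
         0, 1, 1, 0,
         0, 0, 1, 0,
         0, 0, 0, 0]
pred  = [0, 1, 1, 0,
         0, 1, 0, 0,
         0, 0, 1, 1,
         0, 0, 0, 0]
print(f"DSC = {dice_coefficient(pred, truth):.2f}")
```

A DSC of 1.0 means perfect overlap with the ground truth; the scores of roughly 0.77 reported above mean the predicted and true tumor regions share about three quarters of their combined volume.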
Procedia PDF Downloads 182
27450 Spatial Analysis for Wind Risk Index Assessment
Authors: Ljiljana Seric, Vladimir Divic, Marin Bugaric
Abstract:
This paper presents a methodology for spatial analysis of GIS data used to assess the microlocation risk index for potential damage from high winds. The analysis is performed on freely available GIS data comprising information about wind load, terrain cover and topography of the area. The methodology utilizes the Eurocode norms for determining the wind load on buildings and constructions; its core is the adaptation of the wind load parameters to locations on a geographical spatial grid. The presented work is part of the Wind Risk Project, supported by the European Commission under the Civil Protection Financial Instrument of the European Union (ECHO). The partners involved in the Wind Risk Project performed a wind risk assessment and proposed action plans for three European countries: Slovenia, Croatia and Germany. The proposed method is implemented in the open-source GRASS GIS software and demonstrated for a case study of the wider area of Split, Croatia. The obtained Wind Risk Index is visualized and correlated with critical infrastructure such as buildings, roads and power lines. The results show a good correlation between high Wind Risk Index values and recent wind-related incidents.
Keywords: Eurocode norms, GIS, spatial analysis, wind distribution, wind risk
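A simplified sketch of the grid-based wind-load step is shown below. It follows the shape of the EN 1991-1-4 (Eurocode) formulas for terrain roughness and velocity pressure, but the gust term is omitted and the final index scaling is an illustrative assumption, not the project's calibrated Wind Risk Index.

```python
import numpy as np

RHO_AIR = 1.25  # kg/m^3, Eurocode recommended air density

def wind_risk_index(v_b, z0, z=10.0):
    """Per-cell index from basic wind speed v_b (m/s) and roughness length z0 (m).

    v_b and z0 are 2D rasters aligned to the same spatial grid; z is the
    reference height above ground. The normalization to 0..1 is a
    hypothetical choice for map display.
    """
    k_r = 0.19 * (z0 / 0.05) ** 0.07   # terrain factor (EN 1991-1-4 form)
    c_r = k_r * np.log(z / z0)         # roughness factor at height z
    v_m = c_r * v_b                    # mean wind velocity
    q_p = 0.5 * RHO_AIR * v_m ** 2     # velocity pressure, gust term omitted
    return q_p / q_p.max()             # normalize to a 0..1 risk index
```

Cells with rougher terrain cover (larger z0) or higher basic wind speed receive proportionally higher pressure, which is the behavior the per-location grid adaptation exploits.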
Procedia PDF Downloads 316
27449 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
National Statistical Institutes hold large volumes of data, generally in formats that condition how the information they contain can be published. Each household or business data collection project includes its own dissemination platform. The dissemination methods previously used therefore do not promote rapid access to information and, in particular, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all of these data on a single platform and to offer the option of linking them with external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistics
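The core of such an approach is mapping each statistical record to RDF triples so it can be linked with external sources. The sketch below emits plain N-Triples strings to stay dependency-free; the dataset namespace, the property names, and the example survey values are illustrative assumptions (a real implementation would likely use the W3C RDF Data Cube vocabulary and an RDF library such as rdflib).

```python
EX = "http://example.org/stats/"          # hypothetical dataset namespace
QB = "http://purl.org/linked-data/cube#"  # W3C RDF Data Cube vocabulary

def record_to_triples(obs_id, region, year, value):
    """Map one survey observation to four N-Triples lines."""
    s = f"<{EX}obs/{obs_id}>"
    return [
        f"{s} <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <{QB}Observation> .",
        f'{s} <{EX}region> "{region}" .',
        f'{s} <{EX}year> "{year}"^^<http://www.w3.org/2001/XMLSchema#gYear> .',
        f'{s} <{EX}employmentRate> "{value}"^^<http://www.w3.org/2001/XMLSchema#decimal> .',
    ]

# hypothetical observation from an employment survey
triples = record_to_triples("emp-2021-dakar", "Dakar", 2021, 42.7)
print("\n".join(triples))
```

Once published, the shared URIs are what allow observations from the employment survey, the poverty survey, and the census to be joined in a single SPARQL query.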
Procedia PDF Downloads 175
27448 Romanian Teachers' Perspectives of Different Leadership Styles
Authors: Ralpian Randolian
Abstract:
Eighty-five Romanian teachers and principals participated in this study examining their perspectives on different leadership styles. Demographic variables such as the source of degree (Romania, European institutes, US institutes, etc.), gender, region, level taught, years of experience, and specialty were recorded. The researcher developed a questionnaire covering four leadership styles. The data were analyzed using structural equation modeling (SEM) to identify which of the variables best predict the preferred leadership style. Results indicated that the democratic style was the most preferred, with the authoritarian style ranked second. Statistically significant differences were also found in relation to the study variables. The study closes by putting forward a number of suggestions and recommendations.
Keywords: teachers' perspectives, leadership styles, gender, structural equation modeling
Procedia PDF Downloads 489
27447 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges
Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh
Abstract:
For decades, the misuse of personal data has been a critical issue. Malaysia has responded by implementing the Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulation (GDPR). Among the suggested adjustments is the data user's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 requirements. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues the DPO will encounter and how the Personal Data Protection Department should respond to them. The results were produced using a qualitative method based on an examination of the current literature. This research reveals probable obstacles facing the DPO and argues that a definite, clear guideline should be in place to aid DPOs in executing their tasks. Appointing a DPO is a wise measure for ensuring that legal data security requirements are met.
Keywords: guideline, law, data protection officer, personal data
Procedia PDF Downloads 78
27446 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kumar Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visible image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. The source images can come from different modalities, such as a visible camera and an IR thermal imager: while visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. Here, a digital color camera captures the visible source image and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based on multi-scale transforms (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of MST levels and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the validity of the suggested method. Experiments show that the proposed approach produces good fusion results. While the high computational cost and complex processing steps of many popular image fusion algorithms yield accurate fused results, they also make those algorithms hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper offer good results with minimal time complexity.
Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform
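A minimal sketch of multi-scale-transform fusion is given below (in Python rather than the paper's MATLAB). A single low-pass/detail decomposition stands in for the full pyramid: base bands are averaged and detail bands fused by the max-absolute rule; the paper's region-based selection and consistency verification steps are omitted for brevity.

```python
import numpy as np

def box_blur(img, k=5):
    """Separable k x k mean filter with edge padding, 'same' output size."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    p = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, p)
    p = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, p)
    return p

def fuse(visible, infrared):
    """One-level MST fusion: average the base bands, max-abs the details."""
    base_v, base_i = box_blur(visible), box_blur(infrared)
    det_v, det_i = visible - base_v, infrared - base_i
    base = 0.5 * (base_v + base_i)
    det = np.where(np.abs(det_v) >= np.abs(det_i), det_v, det_i)
    return base + det
```

The max-abs rule is what carries a bright IR hotspot or a sharp visible edge into the fused image regardless of which sensor produced it; a deeper pyramid simply repeats the same choice per level.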
Procedia PDF Downloads 115
27445 A Guide to the Implementation of Ambisonics Super Stereo
Authors: Alessio Mastrorillo, Giuseppe Silvi, Francesco Scagliola
Abstract:
In this work, we introduce an Ambisonics decoder with an implementation of the C-format, also called Super Stereo. This format is an alternative to conventional stereo and binaural decoding: unlike those, it conveys audio information from the full horizontal plane while working with stereo speakers and headphones, and the two C-format channels can also return a reconstructed planar B-format. This work provides an open-source implementation of the format. We implement an all-pass filter for signal quadrature, as required by the decoding equations. This filter uses six Biquads in a cascade configuration, with values for control frequency and quality factor discovered experimentally; its phase response exhibits only a small error over the 20 Hz-14 kHz range. The decoder has been tested with audio sources at sample rates up to 192 kHz, returning pristine sound quality and a detailed stereo image. It has been included in the Envelop for Live suite and is available as an open-source repository. The decoder has applications in Virtual Reality and 360° audio productions, music composition, and online streaming.
Keywords: ambisonics, UHJ, quadrature filter, virtual reality, Gerzon, decoder, stereo, binaural, biquad
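The quadrature network can be sketched as a cascade of second-order all-pass Biquads. The coefficient formulas below follow the widely used RBJ audio-EQ-cookbook all-pass section; the six control frequencies and the Q value are placeholders, not the experimentally discovered values from this work.

```python
import math

def allpass_biquad(fc, q, fs=48000.0):
    """Normalized (b0, b1, b2, a1, a2) for an RBJ all-pass section."""
    w0 = 2 * math.pi * fc / fs
    alpha = math.sin(w0) / (2 * q)
    a0 = 1 + alpha
    return ((1 - alpha) / a0, -2 * math.cos(w0) / a0, (1 + alpha) / a0,
            -2 * math.cos(w0) / a0, (1 - alpha) / a0)

def process(samples, sections):
    """Run samples through the Biquad cascade (direct form I per section)."""
    for b0, b1, b2, a1, a2 in sections:
        x1 = x2 = y1 = y2 = 0.0
        out = []
        for x in samples:
            y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
            x2, x1, y2, y1 = x1, x, y1, y
            out.append(y)
        samples = out
    return samples

# hypothetical cascade: six sections spread over the audible band
CASCADE = [allpass_biquad(fc, 0.7) for fc in (50, 150, 500, 1500, 5000, 14000)]
```

Because each section is all-pass, the cascade leaves the magnitude spectrum untouched and only shapes phase, which is exactly what the quadrature (90°-shift) requirement of the decoding equations calls for.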
Procedia PDF Downloads 91
27444 Analysis of Secondary Peak in Hα Emission Profile during Gas Puffing in Aditya Tokamak
Authors: Harshita Raj, Joydeep Ghosh, Rakesh L. Tanna, Prabal K. Chattopadhyay, K. A. Jadeja, Sharvil Patel, Kaushal M. Patel, Narendra C. Patel, S. B. Bhatt, V. K. Panchal, Chhaya Chavda, C. N. Gupta, D. Raju, S. K. Jha, J. Raval, S. Joisa, S. Purohit, C. V. S. Rao, P. K. Atrey, Umesh Nagora, R. Manchanda, M. B. Chowdhuri, Nilam Ramaiya, S. Banerjee, Y. C. Saxena
Abstract:
Efficient gas fueling is a critical capability for maintaining the plasma density required to carry out fusion, and it demands a fair understanding of fuel recycling so that the fueling can be optimized. In the Aditya tokamak, multiple gas puffs are applied in a precise and controlled manner for hydrogen fueling during the flat top of the plasma discharge; this has been instrumental in achieving discharges with enhanced density as well as energy confinement time. Following each gas puff, we observe peaks in the temporal profiles of Hα emission, soft X-rays (SXR) and chord-averaged electron density in a number of discharges, indicating efficient gas fueling. Interestingly, the Hα temporal profile exhibits an additional peak after the peak corresponding to each gas puff. These additional Hα peaks appear between two gas puffs, indicating the presence of a secondary hydrogen source apart from the puffs themselves. A thorough investigation revealed that these secondary Hα peaks coincide with hard X-ray bursts, which come from the interaction of runaway electrons with the vessel limiters. This suggests that runaway electrons (REs) hitting the wall release the absorbed hydrogen and oxygen, making the RE-limiter interaction a secondary hydrogen source. These observations imply that runaway-electron-induced recycling should also be included in the recycling particle source in particle balance calculations for tokamaks. The observation of two Hα peaks associated with one gas puff, and their roles in enhancing and maintaining plasma density in the Aditya tokamak, will be discussed in this paper.
Keywords: fusion, gas fueling, recycling, Tokamak, Aditya
Procedia PDF Downloads 402
27443 Exploring the Role of Hydrogen to Achieve the Italian Decarbonization Targets using an OpenScience Energy System Optimization Model
Authors: Alessandro Balbo, Gianvito Colucci, Matteo Nicoli, Laura Savoldi
Abstract:
Hydrogen is expected to become an undisputed player in the ecological transition over the next decades. The decarbonization potential offered by this energy vector provides various opportunities for the so-called "hard-to-abate" sectors, including the industrial production of iron and steel, glass, refineries and heavy-duty transport. In this regard, Italy, within the framework of decarbonization plans for the whole European Union, has been considering a wider use of hydrogen as an alternative to fossil fuels in hard-to-abate sectors. This work aims to assess and compare different options for the development pathway of the future Italian energy system so as to meet the decarbonization targets established by the Paris Agreement and the European Green Deal, and to derive a techno-economic analysis of the required asset alternatives. To accomplish this objective, the Energy System Optimization Model TEMOA-Italy is used, based on the open-source platform TEMOA and developed at PoliTo as a tool for technology assessment and energy scenario analysis. The adopted assessment strategy includes two different scenarios compared against a business-as-usual one, which considers the application of current policies over a time horizon up to 2050. The studied scenarios are based on the up-to-date hydrogen-related targets and planned investments included in the National Hydrogen Strategy and in the Italian National Recovery and Resilience Plan, with the purpose of providing a critical assessment of what they propose. One scenario imposes decarbonization objectives for the years 2030, 2040 and 2050, without any other specific target. The second one, inspired by the national objectives for the development of the sector, promotes the deployment of the hydrogen value chain.
These scenarios provide feedback about the applications hydrogen could have in the Italian energy system, including transport, industry and synfuel production. Notably, even the decarbonization scenario in which hydrogen production is not imposed makes use of this energy vector, showing that its exploitation is necessary to meet the pledged targets by 2050. The distance of the planned policies from the optimal conditions for achieving the Italian objectives is clarified, revealing possible improvements at various steps of the decarbonization pathway, for which Carbon Capture and Utilization technologies appear to be a fundamental element. In line with the European Commission's open science guidelines, the transparency and robustness of the presented results are ensured by the adoption of an open-source, open-data model such as TEMOA-Italy.
Keywords: decarbonization, energy system optimization models, hydrogen, open-source modeling, TEMOA
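The logic of an energy system optimization model can be illustrated with a deliberately tiny example: choose the least-cost supply mix that meets demand under a CO2 cap. The technology set, costs, and emission factors below are invented for illustration, and the discrete brute-force search stands in for the linear program over thousands of variables that a real ESOM such as TEMOA solves.

```python
from itertools import product

TECHS = {                 # cost (EUR/MWh), emissions (tCO2/MWh) - hypothetical
    "gas":      (60.0, 0.40),
    "blue_h2":  (90.0, 0.10),
    "green_h2": (120.0, 0.00),
}

def cheapest_mix(demand, co2_cap, step=1.0):
    """Brute-force the least-cost generation mix meeting demand under the cap."""
    best = None
    levels = [i * step for i in range(int(demand / step) + 1)]
    for gas, blue in product(levels, levels):
        green = demand - gas - blue          # balance closed by zero-carbon H2
        if green < 0:
            continue
        mix = {"gas": gas, "blue_h2": blue, "green_h2": green}
        co2 = sum(TECHS[t][1] * q for t, q in mix.items())
        if co2 > co2_cap:
            continue
        cost = sum(TECHS[t][0] * q for t, q in mix.items())
        if best is None or cost < best[0]:
            best = (cost, mix)
    return best
```

Tightening the cap pushes the optimum away from gas and toward hydrogen even when no hydrogen target is imposed, which mirrors the scenario result described above.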
Procedia PDF Downloads 73
27442 Application of Remote Sensing and GIS in Assessing Land Cover Changes within Granite Quarries around Brits Area, South Africa
Authors: Refilwe Moeletsi
Abstract:
Dimension stone quarrying around the Brits and Belfast areas started in the early 1930s and has been growing rapidly since then. The environmental impacts associated with these quarries have not been documented, and hence this study aims at detecting any changes in the environment that might have been caused by these activities. Landsat images were used to assess land use/land cover changes around the Brits quarries from 1998 to 2015. A supervised classification using the maximum likelihood classifier was applied to classify each image into different land use/land cover types. Classification accuracy was assessed using Google Earth™ as a source of reference data. A post-classification change detection method was used to determine the changes. The results revealed a significant increase in granite quarries and a corresponding decrease in vegetation cover within the study region.
Keywords: remote sensing, GIS, change detection, granite quarries
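The post-classification change detection step can be sketched as a cross-tabulation of the two classified rasters into a from-to change matrix. The class codes below (0 = vegetation, 1 = quarry, 2 = other) and the tiny example rasters are illustrative, not the study's actual classification scheme.

```python
import numpy as np

def change_matrix(classified_t1, classified_t2, n_classes):
    """Pixel counts: rows are the class at time 1, columns the class at time 2."""
    pairs = classified_t1.ravel() * n_classes + classified_t2.ravel()
    counts = np.bincount(pairs, minlength=n_classes * n_classes)
    return counts.reshape(n_classes, n_classes)

t1 = np.array([[0, 0, 1], [0, 2, 1]])   # e.g. 1998 classification
t2 = np.array([[1, 0, 1], [1, 2, 1]])   # e.g. 2015: two vegetation pixels now quarry
cm = change_matrix(t1, t2, 3)
```

Off-diagonal entries of the matrix quantify conversions (here, vegetation to quarry), while the diagonal counts unchanged pixels; multiplying by the pixel area converts the counts to hectares.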
Procedia PDF Downloads 314
27441 Data Collection Based on the Questionnaire Survey In-Hospital Emergencies
Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala
Abstract:
The methods available for data collection are diverse: electronic media, focus group interviews and short-answer questionnaires [1]. Poor-quality data resulting, for example, from poorly designed questionnaires, the absence of good translators or interpreters, or the incorrect recording of data can lead to conclusions that are not supported by the data, or to a focus only on the average effect of a program or policy. Several solutions exist to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies that allow better anonymity in the responses [2]. In this context, we opted to collect good-quality data through a sizeable questionnaire-based survey on hospital emergencies, aimed at improving emergency services and alleviating the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent and practical data.
Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies
Procedia PDF Downloads 108
27440 Isolation and Identification of Biosurfactant Producing Microorganism for Bioaugmentation
Authors: Karthick Gopalan, Selvamohan Thankiah
Abstract:
Biosurfactants are lipid compounds produced by microbes; they are amphipathic molecules consisting of hydrophobic and hydrophilic domains. In the present investigation, ten bacterial strains were isolated from petroleum-oil-contaminated sites near a petrol bunk. The oil collapsing test and haemolytic activity were used as criteria for the primary isolation of biosurfactant-producing bacteria, and all ten strains gave positive results. Among the ten strains, two were observed to be good biosurfactant producers, utilizing diesel as the sole carbon source. Optimization of biosurfactant production by the bacteria isolated from the petroleum-oil-contaminated sites was carried out using different parameters: temperature (20 ºC, 25 ºC, 30 ºC, 37 ºC and 45 ºC), pH (5, 6, 7, 8 and 9) and nitrogen sources (ammonium chloride, ammonium carbonate and sodium nitrate). The biosurfactants produced were extracted, dried and quantified. The most favorable conditions for biosurfactant production by the isolated species were 30 ºC (0.543 g/L) at pH 7 (0.537 g/L) with ammonium nitrate (0.431 g/L) as the nitrogen source.
Keywords: isolation and identification, biosurfactant, microorganism, bioaugmentation
Procedia PDF Downloads 350