Search results for: data security
23299 Imaging of Underground Targets with an Improved Back-Projection Algorithm
Authors: Alireza Akbari, Gelareh Babaee Khou
Abstract:
Ground Penetrating Radar (GPR) is an important nondestructive remote sensing tool that has been used in both military and civilian fields. Recently, GPR imaging has attracted much attention for the detection of shallow subsurface targets such as landmines and unexploded ordnance, and for through-wall imaging in security applications. For a monostatic arrangement, a single point target appears in the space-time GPR image as a hyperbolic curve because of the different trip times of the EM wave as the radar moves along a synthetic aperture and collects the reflectivity of subsurface targets. Because of the tails of this hyperbola, the resolution along the synthetic aperture direction is undesirably low. However, highly accurate information about the size, electromagnetic (EM) reflectivity, and depth of buried objects is essential in most GPR applications. It is therefore desirable to transform the hyperbolic signature in the space-time GPR image into a focused pattern showing the object's true location and size together with its EM scattering. The common goal of a typical GPR image is to display the spatial location and reflectivity of an underground object, so the main challenge of GPR imaging is to devise an image reconstruction algorithm that provides high resolution and good suppression of strong artifacts and noise. In this paper, the standard back-projection (BP) algorithm, adapted to GPR imaging applications, is first used for image reconstruction. The standard BP algorithm is limited in the presence of strong noise and produces many artifacts, which adversely affect subsequent tasks such as target detection. Thus, an improved BP based on cross-correlation between the received signals is proposed to decrease noise and suppress artifacts. To further improve the quality of the proposed BP imaging algorithm, a weight factor is designed for each point in the imaging region. Compared to the standard BP scheme, the improved algorithm produces images of higher quality and resolution. The proposed improved BP algorithm was applied to simulated and real GPR data, and the results show that it offers superior artifact suppression and produces images of high quality and resolution. To quantitatively describe the imaging results with respect to artifact suppression, a focusing parameter was evaluated.
Keywords: algorithm, back-projection, GPR, remote sensing
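For readers who want the mechanics, below is a minimal Python sketch of monostatic back-projection with a coherence-style cross-correlation weight per image point. It illustrates the general idea only; the wave speed, geometry, and the exact per-point weight used by the authors are assumptions here.

```python
import numpy as np

def back_projection(bscan, xs, ts, grid_x, grid_z, v=1.5e8):
    """Standard BP: for each image point, sum every A-scan's sample at the
    two-way travel time, then weight the sum by a coherence factor.
    bscan: (n_positions, n_samples) space-time image; xs: antenna positions (m);
    ts: fast-time axis (s); v: assumed wave speed in the soil (m/s)."""
    dt = ts[1] - ts[0]
    image = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            r = np.hypot(xs - x, z)                      # one-way distances
            idx = np.round(2 * r / v / dt).astype(int)   # two-way sample index
            valid = idx < bscan.shape[1]
            traces = bscan[np.arange(len(xs))[valid], idx[valid]]
            # Coherence weight: in-phase samples from a real target give a
            # weight near 1; incoherent noise/artifact samples give near 0.
            w = (traces.sum() ** 2) / (len(traces) * (traces ** 2).sum() + 1e-12)
            image[iz, ix] = w * traces.sum()
    return image
```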
Procedia PDF Downloads 452
23298 Phonetics and Phonological Investigation of Geminates and Gemination in Some Indic Languages
Authors: Hifzur Ansary
Abstract:
The aim and scope of the present research are to examine the form of geminates and the process of gemination, with special reference to Indic languages. This work presents the results of a cross-linguistic investigation of word-medial geminate consonants. The study is both theoretical and experimental; that is, it is based not only on impressionistic data from Indic languages but also on an instrumental analysis of the data. The primary data have been collected from native speakers. The secondary data have been collected from printed materials such as journals, grammar books, and other published articles. The observations made in this study have been checked with a number of educated native speakers of Bangla and Telugu. The study focuses exhaustively on geminates and gemination in Bangla (Indo-Aryan language family) and Telugu (Dravidian language family). It thus attempts to posit the valid geminates in Bangla and Telugu and provides an account of gemination in these languages. It also compares singleton and geminated consonants and describes the distribution of geminate and non-geminate phonemes in Bangla and Telugu. The present research also investigates vowel lengthening in Bangla with respect to gemination, and explains how gemination processes present in Indian languages are transferred to Indian English.
Keywords: geminate consonant, singleton-geminate contrast, different types of assimilation, gemination derives from borrowed words
Procedia PDF Downloads 288
23297 Supply Chain Risk Management: A Meta-Study of Empirical Research
Authors: Shoufeng Cao, Kim Bryceson, Damian Hine
Abstract:
The existing supply chain risk management (SCRM) research is currently chaotic and somewhat disorganized, and the topic has been addressed conceptually more often than empirically. This paper, using both qualitative and quantitative data, employs a modified meta-study method to investigate the SCRM empirical research published in quality journals over a period of 12 years (2004-2015). The purpose is to outline the extant research trends and the employed research methodologies (i.e., research method, data collection, and data analysis) across the sub-field in order to guide future research. The synthesized findings indicate that empirical study of risk ripple effects along an entire supply chain, industry-specific supply chain risk management, and global/export supply chain risk management has not yet received the attention it deserves in the SCRM field. It is also suggested that future empirical research should employ multiple and/or mixed methods and multi-source data collection techniques to reduce common method bias and single-source bias, thus improving research validity and reliability. In conclusion, this paper helps to stimulate more quality empirical research in the SCRM field by identifying promising research directions and providing some methodology guidelines.
Keywords: empirical research, meta-study, methodology guideline, research direction, supply chain risk management
Procedia PDF Downloads 317
23296 Design of Bacterial Pathogens Identification System Based on Scattering of Laser Beam Light and Classification of Binned Plots
Authors: Mubashir Hussain, Mu Lv, Xiaohan Dong, Zhiyang Li, Bin Liu, Nongyue He
Abstract:
Detection and classification of microbes have a vast range of applications in biomedical engineering, especially in the detection, characterization, and quantification of bacterial contaminants. Different techniques for pathogen identification are emerging in the field. The latest technology uses light scattering and is capable of identifying different pathogens without any need for biochemical processing. In the Bacterial Pathogens Identification System (BPIS), a laser beam passes through the sample and light scatters off it; an assembly of photodetectors surrounds the sample at different angles to detect the scattered light. The algorithm of the system consists of two parts: (a) library files and (b) a comparator. Library files contain data on known species of bacterial microbes in the form of binned plots, while the comparator compares data from an unknown sample with the library files. From the collected data of an unknown bacterial species, the highest voltage values are stored in the form of peaks and arranged in 3D histograms to find their frequency of occurrence. The resulting data are compared with the library files of known bacterial species; if the sample data match any library file, the sample is identified as that microbe. An experiment was performed to identify three different bacteria: Enterococcus faecalis, Pseudomonas aeruginosa, and Escherichia coli. Applying the algorithm with library files of the given samples, the results were promising. This system is potentially applicable to several biomedical areas, especially those related to cell morphology.
Keywords: microbial identification, laser scattering, peak identification, binned plots classification
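A minimal sketch of the binned-plot matching idea follows: detector peak voltages are histogrammed into a fixed-size signature, and the closest library signature wins. The bin count, voltage range, distance metric, and library values are all illustrative assumptions, not the system's actual parameters.

```python
import numpy as np

def binned_signature(peak_voltages, bins=32, v_range=(0.0, 5.0)):
    """Histogram per-detector peak voltages into a fixed-size signature."""
    hist, _ = np.histogram(peak_voltages, bins=bins, range=v_range, density=True)
    return hist

def identify(sample_peaks, library):
    """Return the library species whose binned plot is closest to the sample's."""
    sig = binned_signature(sample_peaks)
    scores = {name: np.linalg.norm(sig - ref) for name, ref in library.items()}
    return min(scores, key=scores.get)

# Hypothetical library built from known cultures (species names are real,
# the voltage distributions are made up for illustration).
rng = np.random.default_rng(0)
library = {
    "E. faecalis": binned_signature(rng.normal(2.1, 0.30, 500)),
    "P. aeruginosa": binned_signature(rng.normal(2.8, 0.40, 500)),
    "E. coli": binned_signature(rng.normal(1.6, 0.25, 500)),
}
print(identify(rng.normal(1.6, 0.25, 200), library))   # -> "E. coli"
```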
Procedia PDF Downloads 150
23295 Social Factors and Suicide Risk in Modern Russia
Authors: Maria Cherepanova, Svetlana Maximova
Abstract:
Background and aims: Suicide is among the ten most common causes of death in the working-age population worldwide. According to WHO forecasts, by 2025 suicide will be the third leading cause of death, after cardiovascular diseases and cancer. In 2019, the global suicide rate was 10.5 per 100,000 people. In Russia, the average figure was 11.6; however, in some depressed regions of Russia, such as Buryatia and Altai, it reaches 35.3. The aim of this study was to develop models, based on regional factors of social well-being deprivation, of the suicidal risk of various age groups of the Russian population. We also investigated suicide risk prevention in modern Russia, analyzed its efficacy, and developed recommendations for its improvement. Methods: We analyzed data from sociological surveys in six regions of Russia. In total, we interviewed 4,200 people; the age of the respondents ranged from 16 to 70 years. The results were subjected to factorial and regression analyses. Results: The results of our study indicate that young people are especially socially vulnerable, which results in ineffective patterns of self-preservation behavior and increases the risk of suicide. This is due to the lack of formed anti-suicidal barriers; the low importance of vital values; the difficulty or impossibility of meeting basic needs; low satisfaction with family and professional life; and a decrease in personal unconditional significance. The suicidal risk of the middle-aged population is due to a decline in social well-being in the main aspects of life, which determines low satisfaction, decreased ontological security, and the prevalence of auto-aggressive deviations. The suicidal risk of the elderly population is due to increased social exclusion, which narrows their social space and limits the richness of life. Conclusions: The existing system for lowering suicide risk in modern Russia is predominantly oriented toward medical treatment, which provides intervention only for people who have already attempted suicide, significantly limiting its preventive effectiveness and the social control of this deviation. The national strategy for suicide risk reduction in modern Russian society should combine medical and social activities designed to minimize situations that may lead to suicide. The strategy should include a systematic and significant improvement of the social well-being of the population and aim at overcoming the basic aspects of social disadvantage, such as poverty and unemployment, as well as implementing innovative mental health improvement and developing life-saving behavior that will help counter suicide in Russia.
Keywords: social factors, suicide, prevention, Russia
Procedia PDF Downloads 167
23294 An Analysis into Global Suicide Trends and Their Relation to Current Events Through a Socio-Cultural Lens
Authors: Lyndsey Kim
Abstract:
We utilized country-level data on suicide rates from 1985 through 2015 provided by the WHO to explore global trends as well as country-specific trends. First, we find that up until 1995, there was an increase in suicide rates globally, followed by a steep decline in deaths. This observation is largely driven by the data from Europe, where suicides are prominent but steadily declining. Second, men are more likely to commit suicide than women across the world over the years. Third, the older generation is more likely to commit suicide than youth and adults. Finally, we turn to Durkheim’s theory and use it as a lens to understand trends in suicide across time and countries and attempt to identify social and economic events that might explain patterns that we observe. For example, we discovered a drastically different pattern in suicide rates in the US, with a steep increase in suicides in the early 2000s. We hypothesize this might be driven by both the 9/11 attacks and the recession of 2008.
Keywords: suicide trends, current events, data analysis, world health organization, durkheim theory
Procedia PDF Downloads 93
23293 Trading off Accuracy for Speed in Powerdrill
Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica
Abstract:
In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for exploration of log data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, so they should be easily applicable to other systems. For the first optimization, we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of using sampling on accuracy and propose a simple heuristic for annotating individual result values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries, this effectively brings down the 95th latency percentile from 30 to 4 seconds.
Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries
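The sampling-plus-annotation idea can be sketched in a few lines: scale up counts from a Bernoulli sample and flag a value as accurate only when enough sampled rows support it. The sampling rate and support threshold below are illustrative stand-ins, not PowerDrill's actual heuristic.

```python
import random

def estimate_group_counts(rows, key, rate=0.01, min_support=30, seed=42):
    """Scan a Bernoulli sample, scale the per-value counts back up, and flag
    each estimate as 'accurate' only if enough sampled rows support it."""
    random.seed(seed)
    sample = [r for r in rows if random.random() < rate]
    counts = {}
    for r in sample:
        counts[r[key]] = counts.get(r[key], 0) + 1
    return {v: (c / rate, c >= min_support) for v, c in counts.items()}

rows = [{"country": "US"}] * 500_000 + [{"country": "IE"}] * 2_000
for value, (est, ok) in estimate_group_counts(rows, "country").items():
    print(value, round(est), "accurate" if ok else "approximate")
```

Rare values get few sampled rows, so their scaled estimates are marked approximate, which is exactly the signal a user needs when reading intermediate results.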
Procedia PDF Downloads 259
23292 Data Quality on Regular Childhood Immunization Programme at Degehabur District: Somali Region, Ethiopia
Authors: Eyob Seife
Abstract:
Immunization is a life-saving intervention that prevents needless suffering through sickness, disability, and death. Emphasis on data quality and use will become even stronger with the development of the Immunization Agenda 2030 (IA2030). Quality data are a key factor in generating reliable health information that enables monitoring progress, financial planning, vaccine forecasting, and decision-making for continuous improvement of the national immunization program. However, ensuring data of sufficient quality and promoting an information-use culture at the point of collection remain critical and challenging, especially in hard-to-reach and pastoralist areas. Degehabur district was selected based on the hypothesis that 'there is no difference between reported and recounted immunization data.' Data quality depends on different factors, among them organizational, behavioral, technical, and contextual ones. A cross-sectional quantitative study was conducted in September 2022 in the Degehabur district using the World Health Organization (WHO) recommended data quality self-assessment (DQS) tools. Immunization tally sheets, registers, and reporting documents were reviewed at 5 health facilities (2 health centers and 3 health posts) of primary health care units for one fiscal year (12 months) to determine the accuracy ratio. The data were collected by trained DQS assessors to explore the quality of monitoring systems at health posts, health centers, and the district health office. A quality index (QI) was assessed, and accuracy ratios were formulated for the first and third doses of pentavalent vaccine, fully immunized children (FI), and the first dose of measles-containing vaccine (MCV). At the facility level, both over-reporting and under-reporting were observed at health posts when computing the accuracy ratio of tally sheets to the health post reports held at health centers for almost all verified antigens: pentavalent 1 was 88.3%, 60.4%, and 125.6% for health posts A, B, and C, respectively. For the first dose of measles-containing vaccine (MCV), the accuracy ratio was 126.6%, 42.6%, and 140.9% for health posts A, B, and C, respectively. The accuracy ratio for fully immunized children was 0% for health posts A and B and 100% for health post C. Relatively better accuracy ratios were seen at health centers, where the first pentavalent dose was 97.4% and 103.3% for health centers A and B, and the first dose of measles-containing vaccine (MCV) was 89.2% and 100.9% for health centers A and B, respectively. The quality index (QI) of all facilities ranged between a maximum of 33.33% and a minimum of 0%. Most of the verified immunization data accuracy ratios were relatively better at the health center level. However, the quality of the monitoring system is poor at all levels, as is data accuracy at all health posts. Attention should therefore be given to improving staff capacity and the quality of the monitoring system components, namely recording, reporting, archiving, data analysis, and using information for decision-making at all levels, especially in pastoralist areas, in addition to improving data quality at the root and health post levels.
Keywords: accuracy ratio, Degehabur District, regular childhood immunization program, quality of monitoring system, Somali Region-Ethiopia
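The accuracy ratio behind these percentages is a simple verification factor; a sketch with made-up figures (not the study's raw counts) follows.

```python
def accuracy_ratio(recounted, reported):
    """DQS-style verification factor: doses recounted from tally sheets
    divided by doses reported upstream, as a percentage.
    <100% suggests over-reporting; >100% suggests under-reporting."""
    return 100.0 * recounted / reported

# Illustrative figures for one antigen at one health post (hypothetical).
print(f"Penta1 accuracy: {accuracy_ratio(recounted=212, reported=240):.1f}%")
```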
Procedia PDF Downloads 107
23291 A Minimum Spanning Tree-Based Method for Initializing the K-Means Clustering Algorithm
Authors: J. Yang, Y. Ma, X. Zhang, S. Li, Y. Zhang
Abstract:
The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the algorithm often converges to local minima because it is sensitive to the initial cluster centers. In this paper, an algorithm for selecting initial cluster centers on the basis of the minimum spanning tree (MST) is presented. The sets of vertices in the MST with the same degree are each regarded as a whole and used to find the skeleton data points. Furthermore, a distance measure between the skeleton data points that considers both degree and Euclidean distance is presented. Finally, the MST-based initialization method for the k-means algorithm is presented, and the corresponding time complexity is analyzed as well. The presented algorithm is tested on five data sets from the UCI Machine Learning Repository. The experimental results illustrate the effectiveness of the presented algorithm compared to three existing initialization methods.
Keywords: degree, initial cluster center, k-means, minimum spanning tree
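A compact sketch of the MST-seeding idea is shown below: vertex degrees in the MST identify skeleton points, and k well-separated skeleton points seed k-means. The paper's combined degree-plus-Euclidean distance measure is simplified here to plain farthest-point selection, so treat this as a loose reading rather than the authors' exact procedure.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import KMeans

def mst_initial_centers(X, k):
    """Pick k initial centers: build the MST, rank vertices by MST degree
    (high-degree hubs as skeleton candidates), then greedily choose centers
    that are far apart."""
    D = squareform(pdist(X))
    mst = minimum_spanning_tree(D).toarray()
    degree = (mst > 0).sum(0) + (mst > 0).sum(1)        # undirected degree
    skeleton = np.argsort(degree)[::-1][:min(5 * k, len(X))]
    centers = [skeleton[0]]
    while len(centers) < k:                              # farthest-point pick
        rest = [i for i in skeleton if i not in centers]
        centers.append(max(rest, key=lambda i: min(D[i, c] for c in centers)))
    return X[centers]

X = np.random.default_rng(1).normal(size=(200, 2))
km = KMeans(n_clusters=3, init=mst_initial_centers(X, 3), n_init=1).fit(X)
print(km.inertia_)
```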
Procedia PDF Downloads 411
23290 Novel Adaptive Radial Basis Function Neural Networks Based Approach for Short-Term Load Forecasting of Jordanian Power Grid
Authors: Eyad Almaita
Abstract:
In this paper, a novel adaptive Radial Basis Function Neural Network (RBFNN) algorithm is used to forecast the hour-by-hour electrical load demand in Jordan. A small and effective RBFNN model is used to forecast the hourly total load demand based on a small number of features: the load in the previous day, the load at the same hour in the previous week, the temperature in the same hour, the hour number, the day number, and the day type. The proposed adaptive RBFNN model can enhance the reliability of the conventional RBFNN after the network is embedded in the system. This is achieved by introducing an adaptive algorithm that allows the weights of the RBFNN to change after the training process is completed, which eliminates the need to retrain the RBFNN model. The data used in this paper are real data measured by the National Electric Power Company (Jordan). The data for the period Jan. 2012 - April 2013 are used to train the RBFNN models, and the data for the period May 2013 - Sep. 2013 are used to validate the models' effectiveness.
Keywords: load forecasting, adaptive neural network, radial basis function, short-term, electricity consumption
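Below is a minimal sketch of such a network: a Gaussian RBF layer, output weights fitted by least squares, and an online LMS-style weight update standing in for the adaptive step. The centers, width, learning rate, and synthetic data are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

class AdaptiveRBFNN:
    """Gaussian RBF network with a linear output layer. Output weights are
    fitted by least squares, then nudged online as new hourly loads arrive
    -- a sketch of the adaptive idea, not the authors' exact scheme."""
    def __init__(self, centers, width, lr=1e-3):
        self.c, self.s, self.lr = centers, width, lr
        self.w = None

    def _phi(self, X):
        d = np.linalg.norm(X[:, None, :] - self.c[None, :, :], axis=2)
        return np.exp(-(d / self.s) ** 2)

    def fit(self, X, y):
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)

    def predict(self, X):
        return self._phi(X) @ self.w

    def adapt(self, x, y_true):
        phi = self._phi(x[None, :])[0]
        self.w += self.lr * (y_true - phi @ self.w) * phi  # online correction

# Features per the abstract: [load(d-1), load(same hour, d-7), temp, hour,
# day, day_type]; synthetic values here just to exercise the code.
X = np.random.rand(500, 6)
y = X @ np.array([0.5, 0.3, 0.1, 0.05, 0.03, 0.02])
net = AdaptiveRBFNN(centers=X[::50], width=1.0)
net.fit(X[:400], y[:400])
net.adapt(X[400], y[400])       # weights updated without retraining
```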
Procedia PDF Downloads 344
23289 Speed Optimization Model for Reducing Fuel Consumption Based on Shipping Log Data
Authors: Ayudhia P. Gusti, Semin
Abstract:
It is known that the total operating cost of a vessel is dominated by the cost of fuel. The question that arises is how to reduce a ship's fuel consumption so that operational fuel costs are minimized. At the heart of this kind of problem, sailing speed determination is an important factor to be considered by a shipping company. Optimal speed determination has a significant influence on the route and berth schedule of ships, which also affects vessel operating costs. The purpose of this paper is to clarify some important issues in ship speed optimization. Sailing speed, displacement, sailing time, and specific fuel consumption were obtained from shipping log data and further analyzed to model the speed optimization. The presented speed optimization model is expected to affect fuel consumption and to reduce its cost.
Keywords: maritime transportation, reducing fuel, shipping log data, speed optimization
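As a toy illustration of why speed dominates fuel cost, the sketch below assumes the classic cubic law (fuel rate proportional to speed cubed). That law is a textbook approximation, not a relationship extracted from this paper's log data, and the coefficient is hypothetical; the optimizer simply picks the slowest speed that still meets a deadline.

```python
import numpy as np

def voyage_fuel(speed_kn, distance_nm, k=0.0035):
    """Fuel (tons) for a leg under the cubic law: rate = k * v^3 tons/hour.
    Since hours = distance / v, total fuel scales as v^2 * distance."""
    hours = distance_nm / speed_kn
    return k * speed_kn ** 3 * hours

speeds = np.linspace(10, 22, 120)
feasible = speeds[1000 / speeds <= 72]      # cover 1000 nm within 72 h
best = feasible[np.argmin(voyage_fuel(feasible, 1000))]
print(f"optimal speed ≈ {best:.1f} kn, fuel ≈ {voyage_fuel(best, 1000):.1f} t")
```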
Procedia PDF Downloads 568
23288 An Assessment of the Temperature Change Scenarios Using RS and GIS Techniques: A Case Study of Sindh
Authors: Jan Muhammad, Saad Malik, Fadia W. Al-Azawi, Ali Imran
Abstract:
In the era of climate variability, rising temperature is the most significant aspect. In this study, PRECIS model data and observed data are used to assess the temperature change scenarios of Sindh province during the first half of the present century. Observed data from various meteorological stations of Sindh are the primary source for temperature change detection. The current scenario (1961-1990) and the future one (2010-2050) are simulated by the PRECIS regional climate model at a spatial resolution of 25 km x 25 km. A regional climate model (RCM) can yield reasonably suitable projections for climate scenarios. The main objective of the study is to map the simulated temperatures obtained from the PRECIS climate model and compare them with observed temperatures. The analysis is done for all the districts of Sindh in order to obtain a more precise picture of temperature change scenarios. According to the results, the temperature is likely to increase by 1.5 - 2.1°C by 2050, compared to the baseline temperature of 1961-1990. The model gives more accurate values in the northern districts of Sindh than in the coastal belt. All districts of Sindh province exhibit an increasing trend in the mean temperature scenarios, and each decade appears warmer than the previous one. An understanding of the change in temperatures is vital for various sectors such as weather forecasting, water, agriculture, and health.
Keywords: PRECIS Model, real observed data, Arc GIS, interpolation techniques
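The model-versus-observed comparison reduces to two subtractions per station: a bias (model baseline minus observation) that checks model skill, and a projected change (future run minus baseline run). The station names below are real Sindh stations, but every number is illustrative.

```python
import numpy as np

# Hypothetical station means (°C): observed 1961-1990 vs PRECIS baseline run.
stations = ["Karachi", "Hyderabad", "Jacobabad"]
observed = np.array([26.1, 27.3, 28.9])
modelled = np.array([25.4, 27.9, 29.6])
future   = np.array([27.2, 29.6, 31.4])    # 2010-2050 scenario means

bias = modelled - observed                  # model skill check per station
change = future - modelled                  # projected warming per station
for s, b, c in zip(stations, bias, change):
    print(f"{s}: bias {b:+.1f} °C, projected change {c:+.1f} °C")
```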
Procedia PDF Downloads 249
23287 Difference Expansion Based Reversible Data Hiding Scheme Using Edge Directions
Authors: Toshanlal Meenpal, Ankita Meenpal
Abstract:
Difference expansion is a very important technique in the reversible data hiding field. Thanks to the reversibility feature, both the secret message and the cover image can be completely recovered without any distortion after the data extraction process. In general, in any difference expansion scheme, embedding is performed by an integer transform on the difference image obtained by grouping two neighboring pixel values. This paper proposes an improved reversible difference expansion embedding scheme that mainly considers the edge direction when modifying the difference of two neighboring pixel values. In general, a larger difference tends to produce a more degraded stego image than a smaller one. The experimental results show that the proposed scheme achieves an image quality gain of 0.5 to 3.7 dB on average, while payload-wise it achieves almost the same capacity as the previous method.
Keywords: information hiding, edge direction, difference expansion, integer transform
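The base integer transform that difference expansion schemes build on is Tian's classic pixel-pair expansion; a minimal sketch of its embed/extract pair is shown below, without the edge-direction selection or the overflow handling a full scheme adds.

```python
def de_embed(x, y, bit):
    """Classic difference expansion on a pixel pair: expand h = x - y and
    append the secret bit in the least significant position."""
    l, h = (x + y) // 2, x - y
    h2 = 2 * h + bit
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(x2, y2):
    """Recover the bit and the original pair exactly (reversibility)."""
    l, h2 = (x2 + y2) // 2, x2 - y2
    bit, h = h2 & 1, h2 // 2
    return (l + (h + 1) // 2, l - h // 2), bit

stego = de_embed(105, 103, 1)
print(stego, de_extract(*stego))   # (107, 102) ((105, 103), 1)
```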
Procedia PDF Downloads 484
23286 A Numerical Model for Simulation of Blood Flow in Vascular Networks
Authors: Houman Tamaddon, Mehrdad Behnia, Masud Behnia
Abstract:
An accurate study of blood flow requires an accurate vascular pattern and the geometrical properties of the organ of interest. Due to the complexity of vascular networks and poor accessibility in vivo, it is challenging to reconstruct the entire vasculature of any organ experimentally. The objective of this study is to introduce an innovative approach for the reconstruction of a full vascular tree from available morphometric data. Our method consists of implementing morphometric data on those parts of the vascular tree that are smaller than the resolution of medical imaging methods. This technique reconstructs the entire arterial tree down to the capillaries. Vessels greater than 2 mm are obtained from direct volume and surface analysis using contrast-enhanced computed tomography (CT). Vessels smaller than 2 mm are reconstructed from available morphometric and distensibility data and rearranged by applying Murray's law. Implementing morphometric data to reconstruct the branching pattern while simultaneously applying Murray's law to every vessel bifurcation leads to an accurate vascular tree reconstruction. The reconstruction algorithm generates the full arterial tree topography down to the first capillary bifurcation. The geometry of each order of the vascular tree is generated separately to minimize construction and simulation time. The node-to-node connectivity, along with the diameter and length of every vessel segment, is established, and order numbers, according to the diameter-defined Strahler system, are assigned. During the simulation, we used the averaged flow rate for each order to predict the pressure drop, and once the pressure drop is predicted, the flow rate is corrected to match the computed pressure drop for each vessel. The final results for 3 cardiac cycles are presented and compared to clinical data.
Keywords: blood flow, morphometric data, vascular tree, Strahler ordering system
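Murray's law constrains the radii at every bifurcation, r_p³ = r_1³ + r_2³. The sketch below generates daughter radii and counts how many symmetric generations take a hypothetical 2 mm root vessel down to capillary scale; the asymmetry ratio and sizes are illustrative, not the study's morphometric inputs.

```python
def murray_daughters(r_parent, ratio=1.0):
    """Split a parent vessel into two daughters obeying Murray's law
    r_p^3 = r_1^3 + r_2^3, with ratio = (r_1/r_2)^3 controlling asymmetry."""
    r2 = r_parent / (1.0 + ratio) ** (1 / 3)
    r1 = ratio ** (1 / 3) * r2
    return r1, r2

# Grow a symmetric tree from a 2 mm root radius down to ~8 µm capillaries.
r, generations = 2.0, 0
while r > 0.008:
    r, _ = murray_daughters(r)
    generations += 1
print(f"{generations} generations to reach the capillary scale")
```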
Procedia PDF Downloads 272
23285 Investigation of the Litho-Structure of Ilesa Using High Resolution Aeromagnetic Data
Authors: Oladejo Olagoke Peter, Adagunodo T. A., Ogunkoya C. O.
Abstract:
This research investigated the arrangement of some geological features beneath Ilesa employing aeromagnetic data. The obtained data were subjected to various filtering and processing techniques, namely the total horizontal derivative (THD), depth continuation, and analytical signal amplitude, using Geosoft Oasis Montaj 6.4.2 software. The reduced-to-the-equator total magnetic intensity (TRE-TMI) results reveal significant magnetic anomalies, with high magnitudes (55.1 to 155 nT) predominantly in the northwest half of the area. Intermediate magnetic susceptibility, ranging between 6.0 and 55.1 nT, dominates the eastern part, separated by depressions and uplifts. The southern part of the area exhibits a magnetic field of low intensity, ranging from -76.6 to 6.0 nT. The lineaments exhibit varying lengths, ranging from 2.5 to 16.0 km. Analysis of the rose diagram and the analytical signal amplitude indicates structural styles mainly of E-W and NE-SW orientations, particularly evident in the western, SW, and NE regions, with an amplitude of 0.0318 nT/m. The identified faults in the area demonstrate orientations of NNW-SSE, NNE-SSW, and WNW-ESE, situated at depths ranging from 500 to 750 m. Considering the divergent magnetic susceptibilities, the structural style and orientation of the lineaments, and the identified faults and their depths, these lithological features could serve as a valuable foundation for assessing ground motion, particularly in the presence of sufficient seismic energy.
Keywords: lineament, aeromagnetic, anomaly, fault, magnetic
Procedia PDF Downloads 76
23284 Consumer Welfare in the Platform Economy
Authors: Prama Mukhopadhyay
Abstract:
From transport to food, today's platform economy and digital markets have taken over almost every sphere of consumers' lives. Sellers and buyers are connected through platforms acting as intermediaries. This has made consumers' lives easier in terms of time, price, choice, and other factors. That said, there are several concerns regarding platforms. There are competition law concerns, such as unfair pricing and deep discounting by platforms, which affect consumer welfare. Apart from that, the biggest problem is a lack of transparency with respect to the business models: how a platform operates, how prices are calculated, and so on. In most cases, consumers are unaware of how their personal data are being used, and of how algorithms use their personal data to determine the price of a product or even to show relevant products based on their previous searches. Using personal or non-personal data without the consumer's consent is a major legal concern. In addition, another major issue lies with the question of liability: if a dispute arises, who will be responsible, the seller or the platform? For example, if someone orders food through a food delivery app and the food is bad, who is liable, the restaurant or the food delivery platform? In this paper, the researcher examines the legal concerns related to the platform economy from the consumer protection and consumer welfare perspectives. The paper analyses cases from different jurisdictions and the approaches taken by the judiciaries. The author compares the existing legislation of the EU, the US, and several Asian countries and tries to highlight the best practices.
Keywords: competition, consumer, data, platform
Procedia PDF Downloads 144
23283 Competing Risks Modeling Using within Node Homogeneity Classification Tree
Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya
Abstract:
To design a tree that maximizes within-node homogeneity, a homogeneity measure appropriate for event history data with multiple risks is needed. We consider the use of deviance and modified Cox-Snell residuals as impurity measures in Classification and Regression Trees (CART) and compare our results with those of Fiona (2008), in which the homogeneity measure was based on the martingale residual. A data structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The univariate competing risks results revealed that using deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance technique. Bone marrow transplant data and a double-blinded randomized clinical trial, conducted to compare two treatments for patients with prostate cancer, were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from empirical studies of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (deviance = 16.6498) performs better than both the martingale residual (deviance = 160.3592) and the deviance residual (deviance = 556.8822) for both the event of interest and competing risks. The results from the prostate cancer data likewise reveal the superior performance of the proposed model over the existing one for both causes; interestingly, the Cox-Snell residual (MSE = 0.01783563) outperforms both the martingale residual (MSE = 0.1853148) and the deviance residual (MSE = 0.8043366). Moreover, these results validate those obtained from the Monte Carlo studies.
Keywords: within-node homogeneity, Martingale residual, modified Cox-Snell residual, classification and regression tree
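For reference, these residuals are related by standard formulas: the Cox-Snell residual is the fitted cumulative hazard at the observed time (equivalently, the event indicator minus the martingale residual), and the deviance residual symmetrizes the martingale residual. A small sketch with illustrative inputs, not tied to the paper's datasets:

```python
import numpy as np

def deviance_residuals(martingale, event):
    """Deviance residuals from martingale residuals m_i = delta_i - H_i(t_i):
    d_i = sign(m_i) * sqrt(-2 * [m_i + delta_i * log(delta_i - m_i)]).
    Note: the Cox-Snell residual is simply delta_i - m_i = H_i(t_i)."""
    m = np.asarray(martingale, dtype=float)
    d = np.asarray(event, dtype=float)
    inner = m + np.where(d > 0, d * np.log(np.clip(d - m, 1e-12, None)), 0.0)
    return np.sign(m) * np.sqrt(-2.0 * inner)

m = np.array([0.8, -0.4, 0.1])              # illustrative martingale residuals
print(deviance_residuals(m, event=[1, 0, 1]))
```

Either residual can then serve as a continuous response when growing a regression tree whose splits minimize within-node variance.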
Procedia PDF Downloads 272
23282 A Smart Sensor Network Approach Using Affordable River Water Level Sensors
Authors: Dian Zhang, Brendan Heery, Maria O’Neill, Ciprian Briciu-Burghina, Noel E. O’Connor, Fiona Regan
Abstract:
Recent developments in sensors, wireless data communication, and cloud computing have brought the sensor web to a whole new generation. The introduction of the 'Internet of Things (IoT)' concept has brought sensor research to a new level, involving the development of long-lasting, low-cost, environmentally friendly, smart sensors; new wireless data communication technologies; big data analytics algorithms; and cloud-based solutions tailored to large-scale smart sensor networks. The next generation of smart sensor networks consists of several layers: the physical layer, where all the smart sensors reside and data pre-processing occurs, either on the sensor itself or on a field gateway; the data transmission layer, where data and instruction exchanges happen; and the data processing layer, where meaningful information is extracted and organized from the pre-processed data stream. There are many definitions of a smart sensor; to summarize them all, a smart sensor must be intelligent and adaptable. In future large-scale sensor networks, the collected data will be far too large for traditional applications to send, store, or process, so the sensor unit must be intelligent enough to pre-process the collected data locally on board (this may instead occur on a field gateway, depending on the sensor network structure). In this case study, three smart sensing methods, corresponding to simple thresholding, a statistical model, and the machine learning-based MoPBAS method, are introduced, and their strengths and weaknesses are discussed as an introduction to the smart sensing concept. Data fusion, the integration of data and knowledge from multiple sources, is a key component of the next-generation smart sensor network. For example, in a water level monitoring system, a weather forecast can be extracted from external sources, and if heavy rainfall is expected, the server can send instructions to the sensor nodes to, for instance, increase the sampling rate or switch on sleep mode, and vice versa. In this paper, we describe the deployment of 11 affordable water level sensors in the Dublin catchment. The objective is to use the deployed river level sensor network in the Dodder catchment in Dublin, Ireland as a case study to give a vision of the next generation of smart sensor networks for flood monitoring, assisting agencies in making decisions about deploying resources in the case of a severe flood event. Some of the deployed sensors are located alongside traditional water level sensors for validation purposes. Using the 11 deployed river level sensors as a case study, a vision of the next generation of smart sensor networks is proposed, and each of its key components is discussed, which will hopefully inspire researchers working in the sensor research domain.
Keywords: smart sensing, internet of things, water level sensor, flooding
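A minimal sketch of on-node smart sensing follows, combining the simple-thresholding and statistical-model ideas as a rolling z-score gate on new readings; the threshold and data are illustrative, and the MoPBAS variant is not reproduced here.

```python
import numpy as np

def smart_sense(window, new_reading, z_thresh=3.0):
    """On-node pre-processing: report a reading only when it breaks a rolling
    statistical model (z-score), instead of streaming every sample."""
    mu, sigma = np.mean(window), np.std(window) + 1e-9
    z = abs(new_reading - mu) / sigma
    return z > z_thresh            # True -> wake the radio and report an event

levels = list(np.random.default_rng(2).normal(1.20, 0.02, 48))  # river level (m)
print(smart_sense(levels, 1.21), smart_sense(levels, 1.65))     # False True
```

Only the second reading would be transmitted, which is precisely the bandwidth and battery saving that on-board intelligence buys.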
Procedia PDF Downloads 381
23281 Validation of Escherichia coli O157:H7 Inactivation on Apple-Carrot Juice Treated with Manothermosonication by Kinetic Models
Authors: Ozan Kahraman, Hao Feng
Abstract:
Several models, such as the Weibull, modified Gompertz, biphasic linear, and log-logistic models, have been proposed to describe non-linear inactivation kinetics and used to fit non-linear inactivation data of several microorganisms inactivated by heat, high pressure processing, or pulsed electric fields. First-order kinetic parameters (D-values and z-values) have often been used to characterize microbial inactivation by non-thermal processing methods such as ultrasound, and most ultrasonic inactivation studies have employed them to describe the reduction in microbial survival counts. This study was conducted to analyze E. coli O157:H7 inactivation data using five microbial survival models (first-order, Weibull, modified Gompertz, biphasic linear, and log-logistic). The residual sum of squares and the total sum of squares criteria were used to evaluate the models. The statistical indices of the kinetic models were used to fit inactivation data for E. coli O157:H7 treated by MTS at three temperatures (40, 50, and 60 °C) and three pressures (100, 200, and 300 kPa). Based on the statistical indices and visual observations, the Weibull and biphasic models fitted the MTS data best, as shown by high R² values. The other non-linear kinetic models, including the modified Gompertz, first-order, and log-logistic models, did not provide a better fit to the MTS data than the Weibull and biphasic models. The data in this study did not follow first-order kinetics, possibly because the cells sensitive to ultrasound treatment were inactivated first, resulting in a fast initial inactivation period, while those resistant to ultrasound were killed more slowly. The Weibull and biphasic models were found to be more flexible for describing the survival curves of E. coli O157:H7 treated by MTS in apple-carrot juice.
Keywords: Weibull, biphasic, MTS, kinetic models, E. coli O157:H7
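The Weibull survival model is commonly written log10(N/N0) = -(t/δ)^p, where p < 1 captures exactly the tailing behavior described above. A fitting sketch with made-up survivor data (not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """Mafart-style Weibull model: log10(N/N0) = -(t/delta)^p."""
    return -(t / delta) ** p

t = np.array([0, 1, 2, 4, 6, 8], dtype=float)            # minutes (illustrative)
logS = np.array([0.0, -1.1, -1.9, -3.0, -3.8, -4.3])      # log10 reductions
(delta, p), _ = curve_fit(weibull_log_survival, t, logS,
                          p0=(1.0, 1.0), bounds=(1e-6, np.inf))
rss = np.sum((logS - weibull_log_survival(t, delta, p)) ** 2)
print(f"delta = {delta:.2f} min, p = {p:.2f}, RSS = {rss:.3f}")  # p < 1: tailing
```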
Procedia PDF Downloads 366
23280 The Use of Learning Management Systems during Emerging the Tacit Knowledge
Authors: Ercan Eker, Muhammer Karaman, Akif Aslan, Hakan Tanrikuluoglu
Abstract:
A deficiency in institutional memory and knowledge management can result in information security breaches, loss of prestige and trustworthiness and, worst of all, the loss of know-how and institutional knowledge. Traditional learning management within organizations is generally handled through personal effort, and that kind of effort mostly depends on personal desire, motivation, and institutional belonging. Even if an organization has highly motivated employees at a certain time, the institutional knowledge and memory life cycle will generally remain limited to the time those employees spend in the organization. Having a learning management system can sustain the institutional memory, knowledge, and know-how in the organization. Learning management systems are needed all the more in public organizations, where job rotation is frequent and managers are appointed periodically. However, a learning management system should not be seen as an organization's website; it is a more comprehensive, interactive, and user-friendly knowledge management tool for organizations. In this study, the importance of using learning management systems in the process of emerging tacit knowledge is underlined.
Keywords: knowledge management, learning management systems, tacit knowledge, institutional memory
Procedia PDF Downloads 380
23279 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
The benchmarking of tools for the dynamic analysis of vulnerabilities in web applications is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or non-academic websites, always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach perform their analysis within the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to present results that truly answer, at least for the tested tools, all those open questions. All the results have been obtained by changing the common benchmarking model used in previous works.
Keywords: cybersecurity, IDS, security, web scanners, web vulnerabilities
Procedia PDF Downloads 319
23278 Research on Energy-Related Occupant Behavior of Residential Air Conditioning Based on Zigbee Intelligent Electronic Equipment
Authors: Dawei Xia, Benyan Jiang, Yong Li
Abstract:
Split-type air conditioners are widely used for indoor temperature regulation in urban residential buildings in China in summer. Energy-related occupant behavior has a great impact on building energy consumption, and obtaining such behavior data for air conditioners is the research basis for energy consumption prediction and simulation. Relying on developments in sensing and control technology, this paper uses Zigbee intelligent electronic equipment to monitor the energy-related occupant behavior of 20 households for 3 months in summer. Analysis of the data shows that people of different ages in the region differ significantly in the timing, duration, frequency, and energy consumption of air conditioner use, and a data model of three basic energy-related occupant behavior patterns is formed to support accurate energy simulation.
Keywords: occupant behavior, Zigbee, split air conditioner, energy simulation
Procedia PDF Downloads 196
23277 Estimating City-Level Rooftop Rainwater Harvesting Potential with a Focus on Sustainability
Authors: Priya Madhuri P., Kamini J., Jayanthi S. C.
Abstract:
Rooftop rainwater harvesting is a crucial practice for addressing water scarcity, pollution, and flooding. This study estimates the rooftop rainwater harvesting potential (RRWHP) for Suryapet, India, using building footprint data and average rainfall data. The study uses rainfall grids from the India Meteorological Department and very high resolution satellite data to capture building footprints and calculate the RRWHP for a five-year period (2015-2020). Buildings with an area of more than 20 square meters are considered, and a conservative catchment efficiency of 60% is assumed. The study covered 31,770 buildings with an effective rooftop area of around 1.56 sq. km. The city experiences annual rainfall values ranging from 791 mm to 987 mm, with August being the wettest month. The projected annual rooftop rainwater harvesting potential is 1.3 billion litres.
Keywords: buildings, rooftop rainwater harvesting, sustainable water management, urban
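The calculation itself is area × rainfall × runoff coefficient. The sketch below reproduces the order of magnitude of the study's figure under the assumption that the quoted 1.56 sq. km is the effective (already 60%-reduced) area and that average rainfall is roughly 850 mm; both are our reconstructions, not values confirmed by the paper.

```python
def rrwh_potential_litres(roof_area_m2, annual_rainfall_mm, runoff_coeff=0.6):
    """Annual harvest = roof area (m^2) x rainfall (m) x runoff coefficient,
    converted to litres (1 m^3 = 1000 L)."""
    return roof_area_m2 * (annual_rainfall_mm / 1000.0) * runoff_coeff * 1000.0

# Effective area already carries the 60% factor, so the coefficient is 1.0 here.
total = rrwh_potential_litres(1.56e6, 850, runoff_coeff=1.0)
print(f"≈ {total / 1e9:.2f} billion litres/year")   # ≈ 1.33
```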
Procedia PDF Downloads 38
23276 Concept for Knowledge out of Sri Lankan Non-State Sector: Performances of Higher Educational Institutes and Successes of Its Sector
Authors: S. Jeyarajan
Abstract:
A concept of knowledge is discovered from a study conducted on successful competition among Sri Lankan non-state higher educational institutes. The concept is discovered from knowledge management practices collected from Emerald Insight and similarly reputed literature, and from the non-state higher education sector itself. A test was conducted to reveal the existence of these collected practices in Sri Lankan non-state higher education institutes and the reasons behind them. The unavailability of such studies, and uncertainty about the number of participants for data collection in the Sri Lankan context, contributed to the selection of a qualitative research method, which used attributes of the Delphi method to manage that uncertainty. Data were collected under a dramaturgical method, which contributes to the efficient use of the Delphi method. Grounded theory was selected as the data analysis technique and was conducted in intermixed discourses to manage the different perspectives of data collected systematically through perspective and modified snowball sampling techniques. Consequently, agreement between the results of the grounded theories and the findings of a foreign study was discovered in the analysis, even though the present study was conducted as qualitative research while the foreign study was quantitative. As such, the present study widens the discovery of the foreign study. Further, having discovered the reasons behind the existence of the practices, the present results show a concept of knowledge from the Sri Lankan non-state sector for managing higher educational institutes successfully.
Keywords: adherence of snowball sampling into perspective sampling, Delphi method in qualitative method, grounded theory development in intermix discourses of analysis, knowledge management for success of higher educational institutes
Procedia PDF Downloads 173
23275 Physics-Informed Convolutional Neural Networks for Reservoir Simulation
Authors: Jiangxia Han, Liang Xue, Keda Chen
Abstract:
Despite the significant progress over the last decades in reservoir simulation using numerical discretization, meshing remains complex, and the high degree of freedom of the space-time flow field makes the solution process very time-consuming. We therefore present Physics-Informed Convolutional Neural Networks (PICNN) as a hybrid scientific-theory-and-data method for reservoir modeling. Besides labeled data, the model is driven by the scientific theory of the underlying problem, such as the governing equations, boundary conditions, and initial conditions. PICNN integrates the governing equations and boundary conditions into the network architecture in the form of a customized convolution kernel. The loss function is composed of data matching, initial conditions, and other measurable prior knowledge. By customizing the convolution kernel and minimizing the loss function, the neural network parameters not only fit the data but also honor the governing equation. PICNN provides a methodology to model and history-match flow and transport problems in porous media. Numerical results demonstrate that the proposed PICNN can provide an accurate physical solution from a limited dataset. We show how this method can be applied in the context of a forward simulation for continuous problems. Furthermore, several complex scenarios are tested, including the presence of data noise, different work schedules, and different well patterns.
Keywords: convolutional neural networks, deep learning, flow and transport in porous media, physics-informed neural networks, reservoir simulation
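To make the 'governing equation as a convolution kernel' idea concrete, here is a minimal sketch: a fixed 5-point Laplacian stencil plays the role of the customized kernel, and the loss combines a data term at well locations with the PDE residual. The equation choice (steady single-phase pressure), grid, and weighting are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.signal import convolve2d

# The 5-point stencil encodes the steady pressure (Laplace) equation as a
# convolution; a PICNN keeps such a kernel fixed while learning the field.
laplace_kernel = np.array([[0,  1, 0],
                           [1, -4, 1],
                           [0,  1, 0]], dtype=float)

def picnn_loss(p_pred, obs_idx, obs_val, lam=1.0):
    """loss = data mismatch at wells + PDE residual enforced by convolution."""
    data_loss = np.mean((p_pred[obs_idx] - obs_val) ** 2)
    residual = convolve2d(p_pred, laplace_kernel, mode="valid")
    return data_loss + lam * np.mean(residual ** 2)

p = np.random.default_rng(3).random((32, 32))     # candidate pressure field
wells = ((np.array([5, 20]), np.array([8, 25])), np.array([0.9, 0.3]))
print(picnn_loss(p, *wells))
```

A training loop would minimize this loss over the network's parameters; only the loss construction is sketched here.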
Procedia PDF Downloads 143
23274 Improved Throttled Load Balancing Approach for Cloud Environment
Authors: Sushant Singh, Anurag Jain, Seema Sabharwal
Abstract:
Cloud computing is advancing at a rapid pace and has already been adopted by a huge set of users. Its ease of use and anywhere-access potential have made it more attractive than other technologies. This has reduced deployment costs on the user side and has also allowed big companies to sell their infrastructure to recover its installation cost. Cloud computing has its roots in grid computing, and along with the inherited characteristics of its predecessor technologies, it has also adopted their loopholes. Some of these loopholes have been identified and corrected recently, but some are yet to be rectified; two major areas where scope for improvement still exists are security and performance. The proposed work is devoted to performance enhancement for users of the existing cloud system by improving the basic throttled mapping approach between tasks and resources. The improved procedure has been tested using the CloudAnalyst simulator. The results are compared with the original approach, and it has been found that the proposed work is one step ahead of existing techniques.
Keywords: cloud analyst, cloud computing, load balancing, throttled
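For context, the baseline throttled algorithm keeps an availability index table of VMs and allocates each request to the first available VM. The sketch below implements that baseline with one common improvement direction (resuming the scan after the last allocated VM instead of always starting from index 0); this is a plausible reading, not the paper's exact modification.

```python
class ThrottledBalancer:
    """Throttled load balancing: an index table marks each VM available or
    busy; a request goes to the next available VM, else it must be queued."""
    def __init__(self, n_vms):
        self.busy = [False] * n_vms
        self.last = -1                 # resume scanning after the last hit

    def allocate(self):
        n = len(self.busy)
        for step in range(1, n + 1):
            vm = (self.last + step) % n
            if not self.busy[vm]:
                self.busy[vm] = True
                self.last = vm
                return vm
        return None                    # all busy -> caller queues the request

    def release(self, vm):
        self.busy[vm] = False

lb = ThrottledBalancer(3)
print([lb.allocate() for _ in range(4)])   # [0, 1, 2, None]
```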
Procedia PDF Downloads 249
23273 A Study on the New Weapon Requirements Analytics Using Simulations and Big Data
Authors: Won Il Jung, Gene Lee, Luis Rabelo
Abstract:
Since many weapon systems are becoming more complex and diverse, various problems occur in terms of acquisition cost, time, and performance limitations. In fact, real-world experiments to obtain the Required Operational Characteristics (ROC) for a new weapon acquisition are costly, dangerous, and time-consuming, even though they enhance the fidelity of the experimental results. Also, most research to date has relied on a large number of assumptions, so a bias is present in the experimental results. We therefore propose a new methodology that solves these problems without such a variety of assumptions. The ROC of the new weapon system is developed through this methodology, which analyzes big data generated by simulating various scenarios based on virtual and constructive models involving six degrees of freedom (6DoF). The new methodology enables us to identify unbiased ROC for new weapons by reducing assumptions, and it provides support for acquiring optimal weapon systems.
Keywords: big data, required operational characteristics (ROC), virtual and constructive models, weapon acquisition
Procedia PDF Downloads 289
23272 The Influence of Job Recognition and Job Motivation on Organizational Commitment in Public Sector: The Mediation Role of Employee Engagement
Authors: Muhammad Tayyab, Saba Saira
Abstract:
It is an established fact that organizations across the globe consider employees their assets and try to advance their well-being. However, local firms in developing countries are mostly profit-oriented and show little concern for their employees' engagement or commitment. Like those in other developing countries, local organizations in Pakistan are also less concerned about the well-being of their employees; public sector organizations especially lack concern regarding the engagement, satisfaction, or commitment of employees. Therefore, this study investigated the impact of job recognition and job motivation on organizational commitment, with employee engagement in a mediating role. The data were collected from land record officers of the Board of Revenue, Punjab, Pakistan. A structured questionnaire was used to collect data, both by physically visiting land record officers and via the internet. A total of 318 land record officers' responses were retained for data analysis. The data were analyzed through confirmatory factor analysis and structural equation modeling. The findings revealed that job recognition and job motivation have direct as well as indirect positive and significant impacts on organizational commitment. The limitations, practical implications, and directions for future research are also discussed.
Keywords: job motivation, job recognition, employee engagement, employee commitment, public sector, land record officers
Procedia PDF Downloads 132
23271 Securing Internet of Things Devices in Healthcare industry: An Investigation into Efficient and Effective Authorization Procedures
Authors: Maruf Farhan, Abdul Salih, Sikandar Ali Tahir
Abstract:
Protecting the confidentiality of patient information is paramount, considering the widespread use of Internet of Things (IoT) devices in medical settings. The subjects of this study are decentralized identifiers (DIDs) and verifiable credentials (VCs) in conjunction with an OAuth-based authorization framework, as they are key to protecting IoT healthcare devices. DIDs enable autonomous authentication and trust formation between IoT devices and other entities. To authorize users and enforce access controls based on verified claims, VCs offer a secure and adaptable solution. Through the proposed method, medical facilities can improve the privacy and security of their IoT devices while streamlining access control administration. A smart pill dispenser in a hospital setting is used to illustrate the advantages of this approach. The findings demonstrate the value of DIDs, VCs, and OAuth-based delegation in protecting IoT devices. The research findings enable improved processes for authorizing and controlling access to IoT devices and help ensure patient confidentiality in the healthcare sector.
Keywords: IoT, DID, authorization, verifiable credentials
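As a toy illustration of claims-based authorization for such a device, the snippet below checks a credential's issuer, expiry, and scope before allowing a dispenser action. The field names and DIDs are hypothetical, and a real verifier would first validate the VC's cryptographic proof against the issuer's DID document before any of these checks.

```python
from datetime import datetime, timezone

TRUSTED_ISSUERS = {"did:example:hospital-authority"}   # hypothetical issuer DID

def authorize(credential, required_scope):
    """Toy gate for an IoT endpoint: trusted issuer, unexpired, scope present."""
    if credential["issuer"] not in TRUSTED_ISSUERS:
        return False
    expires = datetime.fromisoformat(credential["expirationDate"])
    if expires < datetime.now(timezone.utc):
        return False
    return required_scope in credential["credentialSubject"]["scopes"]

vc = {"issuer": "did:example:hospital-authority",
      "expirationDate": "2030-01-01T00:00:00+00:00",
      "credentialSubject": {"id": "did:example:nurse-42",
                            "scopes": ["dispenser:refill"]}}
print(authorize(vc, "dispenser:refill"))   # True
```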
Procedia PDF Downloads 76
23270 Optimizing The Residential Design Process Using Automated Technologies
Authors: Martin Georgiev, Milena Nanova, Damyan Damov
Abstract:
Architects, engineers, and developers need to analyse and implement a wide spectrum of data in different formats if they want to produce viable residential developments. Usually, this data comes from a number of different sources and is not well structured. The main objective of this research project is to provide parametric tools, working with real geodesic data, that can generate residential solutions. Various codes, regulations, and design constraints are described by variables and prioritized. In this way, we establish a common workflow for architects, geodesists, and other professionals involved in the building and investment process. This collaborative medium ensures that the generated design variants conform to the various requirements, contributing to a more streamlined and informed decision-making process. The quantification of distinctive characteristics inherent to typical residential structures allows a systematic evaluation of the generated variants, focusing on factors crucial to designers, such as daylight simulation, circulation analysis, space utilization, and view orientation. Integrating real geodesic data offers a holistic view of the built environment, enhancing the accuracy and relevance of the design solutions. The use of generative algorithms and parametric models offers high productivity and flexibility across design variants and can be implemented in more conventional CAD and BIM workflows, letting experts from different specialties join their efforts in a shared digital workspace. In conclusion, our research demonstrates that a generative parametric approach based on real geodesic data and collaborative decision-making could be introduced in the early phases of the design process. This gives designers powerful tools to explore diverse design possibilities, significantly improving the quality of the building investment over its entire lifecycle.
Keywords: architectural design, residential buildings, urban development, geodesic data, generative design, parametric models, workflow optimization
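One way to read the 'variables described and prioritized' evaluation is as a weighted multi-criteria score over the generated variants; the sketch below shows that pattern with hypothetical weights and normalized scores, not the project's actual criteria values.

```python
# Prioritized criteria (weights sum to 1) and normalized scores (0-1) per
# generated variant: daylight, circulation, space use, views, as in the text.
weights = {"daylight": 0.35, "circulation": 0.25, "space_use": 0.25, "views": 0.15}

variants = {
    "A": {"daylight": 0.82, "circulation": 0.60, "space_use": 0.74, "views": 0.50},
    "B": {"daylight": 0.67, "circulation": 0.81, "space_use": 0.69, "views": 0.72},
}

def score(variant):
    """Weighted sum over the prioritized criteria."""
    return sum(weights[k] * variant[k] for k in weights)

best = max(variants, key=lambda name: score(variants[name]))
print(best, round(score(variants[best]), 3))
```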
Procedia PDF Downloads 52