Search results for: Privacy and Data Protection Law
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26340

21660 Energy Consumption and Economic Growth: Testimony of Selected Sub-Saharan Africa Countries

Authors: Alfred Quarcoo

Abstract:

The main purpose of this paper is to examine the causal relationship between energy consumption and economic growth in Sub-Saharan Africa using panel data techniques. Annual data on energy consumption and economic growth (proxied by real gross domestic product per capita) spanning 1990 to 2016, drawn from the World Bank development indicators database, were used. The results of the Augmented Dickey–Fuller unit root test show that the series for all countries are not stationary at levels. The log of economic growth in Benin and Congo became stationary after first differencing, the log of energy consumption became stationary for all countries after first differencing, and the log of economic growth in Kenya and Zimbabwe was found to be stationary only after second differencing of the panel series. The Johansen cointegration test demonstrates that the log of energy consumption and the log of economic growth are not cointegrated for Kenya and Zimbabwe, and no long-run relationship between the variables was established in any country. The Granger causality test indicates a unidirectional causality running from energy use to economic growth in Kenya and no causal linkage between energy consumption and economic growth in Benin, Congo, and Zimbabwe.
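
As a concrete illustration of the testing pipeline described above, the sketch below runs an Augmented Dickey–Fuller unit root test and a pairwise Granger causality test with statsmodels; the simulated series and column names are placeholders, not the authors' World Bank data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# Simulated stand-ins for log(GDP per capita) and log(energy use), 1990-2016.
rng = np.random.default_rng(0)
energy = np.cumsum(rng.normal(0.02, 0.05, 27))            # I(1)-type series
gdp = 0.6 * np.r_[0.0, energy[:-1]] + rng.normal(0, 0.02, 27)
df = pd.DataFrame({"lgdp": gdp, "lenergy": energy})

# ADF test: the null hypothesis is a unit root (non-stationarity).
stat, pval, *_ = adfuller(df["lenergy"])
print(f"ADF on levels:            stat={stat:.2f}, p={pval:.3f}")
stat, pval, *_ = adfuller(np.diff(df["lenergy"]))
print(f"ADF on first differences: stat={stat:.2f}, p={pval:.3f}")

# Granger test: does the second column help predict the first?
res = grangercausalitytests(df[["lgdp", "lenergy"]], maxlag=2)
f, p, *_ = res[1][0]["ssr_ftest"]
print(f"Granger F-test at lag 1: F={f:.2f}, p={p:.3f}")
```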

Keywords: Cointegration, Granger Causality, Sub-Saharan Africa, World Bank Development Indicators

Procedia PDF Downloads 37
21659 Time Travel Testing: A Mechanism for Improving Renewal Experience

Authors: Aritra Majumdar

Abstract:

While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also showcases how successful an organization is in holding on to its customers. It is well documented that the lion's share of profit comes from existing customers; hence, seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach for both business and technology teams to enhance the customer experience when customers look to extend their partnership with the organization for a defined period of time. This whitepaper will focus on the key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it will call out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract, a high-level snapshot of these pillars is provided. Time Travel Planning: The first step in setting up a time travel testing roadmap is appropriate planning. Planning includes identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, deciding the frequency of time travel testing, preparing to handle renewal issues in production after time travel testing is done, and, most importantly, planning for test automation during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover the required customer segments and narrowing it down to multiple offer sequences based on defined parameters are keys to successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section will describe the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board, and the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section will discuss the focus areas of enterprise automation and how automation testing can be leveraged to improve overall quality without compromising the project schedule. Along with the above-mentioned items, the whitepaper will elaborate on the best practices to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum up, this paper is based on the author's first-hand experience with time travel testing. While actual customer names and program-related details will not be disclosed, the paper will highlight the key learnings that will help other teams implement time travel testing successfully.
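
The core mechanic of time travel testing, advancing the system clock so renewal logic fires as it would on a future date, can be sketched as below. This is a minimal illustration using Python's freezegun library; the renewal function and dates are hypothetical, not taken from the whitepaper.

```python
from datetime import date
from typing import Optional

from freezegun import freeze_time

def renewal_offer(expiry: date, today: Optional[date] = None) -> str:
    """Toy renewal rule: offer a discount in the 30 days before expiry."""
    today = today or date.today()
    days_left = (expiry - today).days
    if days_left < 0:
        return "lapsed"
    return "discount-offer" if days_left <= 30 else "no-action"

# Time travel forward so the renewal window fires without waiting for it.
with freeze_time("2030-06-10"):
    assert renewal_offer(expiry=date(2030, 6, 30)) == "discount-offer"
with freeze_time("2030-07-05"):
    assert renewal_offer(expiry=date(2030, 6, 30)) == "lapsed"
print("renewal journey behaves correctly at both time-traveled dates")
```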

Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas

Procedia PDF Downloads 141
21658 Electronic Data Interchange (EDI) in the Supply Chain: Impact on Customer Satisfaction

Authors: Hicham Amine, Abdelouahab Mesnaoui

Abstract:

Electronic data interchange (EDI) is the computer-to-computer exchange of structured business information. This information typically takes the form of standardized electronic business documents, such as invoices, purchase orders, and bills of lading. The purpose of this study is to identify the impact EDI might have on the supply chain, and particularly on customer satisfaction, keeping in mind the constraints the organization might face. This study included 139 subject matter experts (SMEs) who participated by responding to a distributed survey. Overall, 85% responded that they were strongly in favor of the implementation, while 10% were neutral and 5% were against it. In the quality assurance department, 75% of the clients agreed to move ahead with the change, whereas 10% stayed neutral and 15% were against it. In the legal department, 80% of the answers were in favor of the implementation, 10% of the participants stayed neutral, and the remaining 10% were against it. The respondents were 40% male and 60% female (sex ratio F/M = 1.5). The survey also distinguished 3 categories of technical background: 80% of respondents had a technical background, 15% had a nontechnical background, and 5% had an average technical background. This study examines the impact of EDI on customer satisfaction, which is the primary hypothesis, and justifies the importance of an implementation that enhances customer satisfaction.
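
To make the notion of "structured business documents" concrete, the sketch below parses a simplified X12-style purchase order segment string; the segment content is invented for illustration and is not from the study.

```python
# Minimal illustration of EDI structure: segments separated by "~",
# elements within a segment separated by "*" (simplified X12 conventions).
RAW = "ST*850*0001~BEG*00*NE*PO12345**20240101~PO1*1*10*EA*9.95**VP*SKU-77~SE*4*0001~"

def parse_edi(raw: str) -> list[list[str]]:
    """Split a raw EDI interchange into segments and elements."""
    return [seg.split("*") for seg in raw.strip("~").split("~")]

for segment in parse_edi(RAW):
    tag, *elements = segment
    print(f"{tag:>4}: {elements}")
# A real translator would validate against the X12 850 schema before
# handing the order to downstream supply chain systems.
```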

Keywords: electronic data interchange, supply chain, subject matter experts, customer satisfaction

Procedia PDF Downloads 324
21657 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Although there may be no theoretical weakness in a cryptographic algorithm, side channel analysis can recover secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential Power Analysis is one of the most popular of these analyses; it computes statistical correlations between hypothesized secret keys and measured power consumption. It usually requires processing huge volumes of trace data and takes a long time; for some devices with countermeasures, it may take several weeks. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
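
A rough sketch of the kind of computation being parallelized is given below: a correlation-based power analysis loop that scores each key-byte guess by correlating a Hamming weight power model against simulated traces, with the guesses farmed out across processes. The power model and trace generation are simplified placeholders, not the authors' setup.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

rng = np.random.default_rng(1)
N, SAMPLES, TRUE_KEY = 2000, 100, 0x3C
plaintexts = rng.integers(0, 256, N)

def hw(x):
    """Hamming weight of each byte in an array."""
    return np.unpackbits(x.astype(np.uint8)[:, None], axis=1).sum(axis=1)

# Simulated traces: leakage of HW(p XOR key) at sample 40, plus noise.
traces = rng.normal(0, 1, (N, SAMPLES))
traces[:, 40] += 0.5 * hw(plaintexts ^ TRUE_KEY)

def score_guess(guess):
    """Max |correlation| between the power model and every trace sample."""
    model = hw(plaintexts ^ guess).astype(float)
    m = model - model.mean()
    t = traces - traces.mean(axis=0)
    corr = (m @ t) / (np.linalg.norm(m) * np.linalg.norm(t, axis=0))
    return guess, np.abs(corr).max()

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:        # parallelize over key guesses
        scores = dict(pool.map(score_guess, range(256)))
    print("recovered key byte:", hex(max(scores, key=scores.get)))
```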

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 406
21656 Monitoring of Hydrological Parameters in the Alexandra Jukskei Catchment in South Africa

Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera

Abstract:

It has been noted that technical systems for managing groundwater resources are not readily accessible. The lack of such systems hinders the monitoring and evaluation processes needed for decision-making regarding the Jukskei River of the Crocodile River (West) Basin in Johannesburg, South Africa. Several challenges have been identified in the Jukskei Catchment concerning groundwater management, including the following: gaps in data records; a need for training and equipping of monitoring staff and for formal accreditation of monitoring capacities and equipment; a lack of access to regulation instruments (e.g., meters); a need, given typical population densities in various regions of South Africa, to construct several groundwater level monitoring stations in a particular segment; the need to convert available raw groundwater level data into consumable products, for example, short reports on delicate areas (e.g., dolomite compartments, wetlands, aquifers, and sole-source aquifers); and, given increasing civil unrest, vandalism and theft of groundwater monitoring infrastructure. GIS was employed at the catchment level to map the relationship between the identified groundwater parameters in the catchment area and an identified borehole. GIS-based groundwater monitoring maps were designed and pretested on one borehole in the Jukskei catchment. These data will be used to establish changes in the borehole relative to changes in the catchment area according to the identified parameters.
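
A minimal sketch of the GIS step, plotting monitored boreholes over a catchment boundary with geopandas, is shown below; the file names, coordinate columns, and CRS are hypothetical placeholders rather than the study's actual data.

```python
import geopandas as gpd
import pandas as pd

# Hypothetical inputs: a catchment boundary polygon and a borehole log table.
catchment = gpd.read_file("jukskei_catchment.shp")
logs = pd.read_csv("borehole_levels.csv")          # columns: lon, lat, level_m
boreholes = gpd.GeoDataFrame(
    logs, geometry=gpd.points_from_xy(logs.lon, logs.lat), crs="EPSG:4326"
).to_crs(catchment.crs)

# Keep only boreholes inside the catchment, then map groundwater levels.
inside = gpd.sjoin(boreholes, catchment, predicate="within")
ax = catchment.boundary.plot(color="black")
inside.plot(ax=ax, column="level_m", legend=True, markersize=30)
ax.figure.savefig("jukskei_groundwater_levels.png")
```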

Keywords: GIS, monitoring, Jukskei, catchment

Procedia PDF Downloads 82
21655 Transportation Mode Classification Using GPS Coordinates and Recurrent Neural Networks

Authors: Taylor Kolody, Farkhund Iqbal, Rabia Batool, Benjamin Fung, Mohammed Hussaeni, Saiqa Aleem

Abstract:

The rising threat of climate change has led to increased public awareness of and concern about our collective and individual environmental impact. A key component of this impact is our use of cars and other polluting forms of transportation, but it is often difficult for an individual to know how severe this impact is. While there are applications that offer this feedback, they require manual entry of the transportation mode used for a given trip, which can be burdensome. To alleviate this shortcoming, data from the 2016 TRIPlab datasets have been used to train a variety of machine learning models to automatically recognize the mode of transportation. An accuracy of 89.6% is achieved using a single deep neural network model with a Gated Recurrent Unit (GRU) architecture applied directly to trip data points over 4 primary classes, namely walking, public transit, car, and bike. These results are comparable in accuracy to results achieved by others using ensemble methods and require far less computation when classifying new trips. The lack of trip context data, e.g., bus routes, bike paths, etc., and the need for only a single set of weights make this an appropriate methodology for applications hoping to reach a broad demographic and provide responsive feedback.
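
A minimal sketch of such a single GRU classifier over GPS point sequences is given below, using Keras; the feature choice (per-point speed and displacement), sequence length, and hyperparameters are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf

T, F, CLASSES = 200, 2, 4   # points per trip; features: speed, displacement
model = tf.keras.Sequential([
    tf.keras.Input(shape=(T, F)),
    tf.keras.layers.Masking(mask_value=0.0),     # ignore zero-padded points
    tf.keras.layers.GRU(64),                     # single recurrent layer
    tf.keras.layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in for preprocessed trips (walk / transit / car / bike labels).
x = np.random.rand(512, T, F).astype("float32")
y = np.random.randint(0, CLASSES, 512)
model.fit(x, y, epochs=2, batch_size=32, validation_split=0.2)
```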

Keywords: classification, gated recurrent unit, recurrent neural network, transportation

Procedia PDF Downloads 118
21654 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design

Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin

Abstract:

In an era of a rapidly growing notebook market, consumer electronics manufacturers face a highly dynamic and competitive environment. In particular, product appearance is the first thing a user relies on to distinguish a product from those of other brands; a notebook product should therefore differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation compares various product concepts to find the design that meets user needs and, in addition, helps the designer further understand the product appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and reactions to product appearance. This study proposes a data mining framework to capture user information and the important relationships between product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rule generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan that recruited 168 subjects from different backgrounds to experience the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and the product concepts related to the UX, which helps to launch products to the right customers and increase market share. The results demonstrate the practical feasibility of the proposed framework.
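
Since the framework's rule generation rests on rough set theory, the sketch below computes rough-set lower and upper approximations of a "likes the design" concept from a toy condition-attribute table; the attributes and data are invented for illustration.

```python
from collections import defaultdict

# Toy decision table: (gender, age_group) -> liked the appearance?
objects = {
    1: (("m", "young"), True),  2: (("m", "young"), True),
    3: (("f", "young"), True),  4: (("f", "young"), False),
    5: (("m", "older"), False), 6: (("f", "older"), False),
}
target = {o for o, (_, liked) in objects.items() if liked}

# Indiscernibility classes: objects identical on all condition attributes.
classes = defaultdict(set)
for obj, (attrs, _) in objects.items():
    classes[attrs].add(obj)

lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))
print("lower approximation (certain rules):", lower)   # {1, 2}
print("upper approximation (possible rules):", upper)  # {1, 2, 3, 4}
# Certain rule here: (m, young) -> likes; (f, young) is a boundary case.
```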

Keywords: consumer decision making, product design, rough set theory, user experience

Procedia PDF Downloads 295
21653 Audit of TPS Photon Beam Dataset for Small Field Output Factors Using OSLDs against RPC Standard Dataset

Authors: Asad Yousuf

Abstract:

Purpose: The aim of the present study was to audit a treatment planning system beam dataset for small field output factors against the standard dataset produced by the Radiological Physics Center (RPC) from a multicenter study. Such data are crucial for the validity of special techniques, i.e., IMRT or stereotactic radiosurgery. Materials/Method: In this study, multiple small field output factor datasets were measured and calculated for 6 to 18 MV x-ray beams using the RPC-recommended methods. These beam datasets were measured at 10 cm depth for 10 × 10 cm² to 2 × 2 cm² field sizes, defined by collimator jaws at 100 cm. The measurements were made with Landauer nanoDot OSLDs, whose volume is small enough to gather a full ionization reading even for a 1 × 1 cm² field size. At our institute, the beam data, including output factors, had been commissioned at 5 cm depth with an SAD setup. For comparison with the RPC data, the output factors were converted to an SSD setup using tissue phantom ratios; the SSD setup also ensures coverage of the ion chamber in the 2 × 2 cm² field size. The measured output factors were also compared with those calculated by the Eclipse™ treatment planning software. Result: The measured and calculated output factors agree with the RPC dataset within 1% and 4%, respectively. The larger discrepancies in the TPS reflect the increased challenge of converting measured data into a commissioned beam model for very small fields. Conclusion: OSLDs are a simple, durable, and accurate tool to verify doses delivered using small photon beam fields down to a 1 × 1 cm² field size. The study emphasizes that the treatment planning system should always be evaluated for small field output factors to ensure accurate dose delivery in the clinical setting.
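
For reference, a relative output factor is simply the detector reading for a given field normalized to the 10 × 10 cm² reference field; the sketch below computes this from hypothetical OSLD readings (the numbers are placeholders, not the study's measurements).

```python
# Relative output factor: reading(field) / reading(10x10 reference field),
# with all readings taken at the same depth and source-surface distance.
readings_nC = {        # hypothetical mean OSLD readings per field size (cm)
    (10, 10): 152.0,
    (4, 4): 140.9,
    (2, 2): 131.7,
    (1, 1): 115.3,
}

reference = readings_nC[(10, 10)]
for field, reading in sorted(readings_nC.items(), reverse=True):
    of = reading / reference
    print(f"{field[0]:>2} x {field[1]:<2} cm2  output factor = {of:.3f}")
# Output factors computed this way can then be compared field-by-field
# against the RPC standard dataset and the TPS-calculated values.
```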

Keywords: small field dosimetry, optically stimulated luminescence, audit treatment, radiological physics center

Procedia PDF Downloads 310
21652 Nonlinear Multivariable Analysis of CO2 Emissions in China

Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu

Abstract:

This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflows are the proxy variable for financial development. The more recent historical data for the period 2004–2011 are used, because very old data may not be suitable for analyzing rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption–emissions and GDP–emissions rank first and second, respectively. This reveals that energy consumption and economic growth are strongly correlated with emissions: higher economic growth requires more energy consumption and increases environmental pollution, and likewise, more efficient energy use requires a higher level of economic development. Therefore, policies that improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI–emissions linkage ranks third. This indicates that China does not apply weak environmental regulations to attract inward FDI; furthermore, China's government should strengthen environmental policy when attracting inward FDI. The population–emissions linkage effect ranks fourth, implying that population size does not directly affect CO2 emissions, even though China has the world's largest population and Chinese people are very economical in their use of energy-related products. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim at curtailing the waste of energy and reducing both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. GRA is also well suited to building a dynamic analysis model from a small amount of data.
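
The grey relational grade underlying this ranking can be computed as in the sketch below: series are normalized, absolute deviations from the reference series (emissions) are taken, and the grey relational coefficient ξ = (Δmin + ρΔmax) / (Δ + ρΔmax) is averaged over time. The numbers are synthetic placeholders, not the paper's data.

```python
import numpy as np

RHO = 0.5  # distinguishing coefficient, conventionally 0.5

def grey_relational_grade(reference, series):
    """Mean grey relational coefficient of `series` w.r.t. `reference`."""
    norm = lambda x: (x - x.min()) / (x.max() - x.min())  # min-max scaling
    delta = np.abs(norm(reference) - norm(series))
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + RHO * dmax) / (delta + RHO * dmax)
    return xi.mean()

# Synthetic annual series standing in for 2004-2011 observations.
rng = np.random.default_rng(2)
co2 = np.linspace(50, 80, 8) + rng.normal(0, 1, 8)
factors = {"energy": co2 * 1.1 + rng.normal(0, 1, 8),
           "gdp": co2 * 0.9 + rng.normal(0, 2, 8),
           "fdi": rng.normal(60, 8, 8),
           "population": np.linspace(1.29, 1.33, 8)}

grades = {k: grey_relational_grade(co2, v) for k, v in factors.items()}
for name, g in sorted(grades.items(), key=lambda kv: -kv[1]):
    print(f"{name:>10}: grade = {g:.3f}")
```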

Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis

Procedia PDF Downloads 389
21651 Statistical Model of Water Quality in Estero El Macho, Machala-El Oro

Authors: Rafael Zhindon Almeida

Abstract:

Surface water quality is an important concern for the evaluation and prediction of water quality conditions. The objective of this study is to develop a statistical model that can accurately predict the water quality of the El Macho estuary in the city of Machala, El Oro province. The methodology is of a basic type, involving a thorough search of theoretical foundations to improve the understanding of statistical modeling for water quality analysis. The research design is correlational, using a multivariate statistical model involving multiple linear regression and principal component analysis. The results indicate that water quality parameters such as fecal coliforms, biochemical oxygen demand, chemical oxygen demand, iron, and dissolved oxygen exceed the allowable limits; the water of the El Macho estuary falls below the required water quality criteria. The multiple linear regression model, based on chemical oxygen demand and total dissolved solids, explains 99.9% of the variance of the dependent variable. In addition, principal component analysis shows that the model has an explanatory power of 86.242%. The study thus developed a statistical model to evaluate the water quality of the El Macho estuary. The estuary did not meet the water quality criteria, with several parameters exceeding the allowable limits. The multiple linear regression model and principal component analysis provide valuable information on the relationships among the various water quality parameters. The findings emphasize the need for immediate action to improve the water quality of the El Macho estuary to ensure the preservation and protection of this valuable natural resource.
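
The modeling pipeline, a multiple linear regression on two predictors plus a PCA of the parameter set, can be reproduced in outline as below with scikit-learn; the column names and random data are placeholders for the study's monitoring records.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

# Stand-in monitoring table; real columns would come from field samples.
rng = np.random.default_rng(3)
df = pd.DataFrame({"cod": rng.uniform(20, 120, 60),
                   "tds": rng.uniform(200, 900, 60),
                   "bod": rng.uniform(5, 40, 60),
                   "iron": rng.uniform(0.1, 2.0, 60)})
df["quality_index"] = 0.6 * df.cod + 0.05 * df.tds + rng.normal(0, 1, 60)

# Multiple linear regression: quality index ~ COD + TDS
reg = LinearRegression().fit(df[["cod", "tds"]], df["quality_index"])
print("R^2 =", reg.score(df[["cod", "tds"]], df["quality_index"]))

# PCA on the standardized parameters: explained variance per component.
z = StandardScaler().fit_transform(df[["cod", "tds", "bod", "iron"]])
pca = PCA().fit(z)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
```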

Keywords: statistical modeling, water quality, multiple linear regression, principal components, statistical models

Procedia PDF Downloads 75
21650 A Data-Driven Compartmental Model for Dengue Forecasting and Covariate Inference

Authors: Yichao Liu, Peter Fransson, Julian Heidecke, Jonas Wallin, Joacim Rockloev

Abstract:

Dengue, a mosquito-borne viral disease, poses a significant public health challenge in endemic tropical and subtropical countries, including Sri Lanka. To reveal insights into the complexity of the dynamics of this disease and study its drivers, a comprehensive model capable of robust forecasting and insightful inference of drivers, while capturing the co-circulation of several virus strains, is essential. However, existing studies mostly focus on only one aspect at a time and do not integrate and carry insights across these siloed approaches. While mechanistic models are developed to capture immunity dynamics, they are often oversimplified and lack integration of the diverse drivers of disease transmission. On the other hand, purely data-driven methods lack the constraints imposed by immuno-epidemiological processes, making them prone to overfitting and inference bias. This research presents a hybrid model that combines machine learning techniques with mechanistic modelling to overcome the limitations of existing approaches. Leveraging eight years of newly reported dengue case data, along with socioeconomic factors such as human mobility, weekly climate data from 2011 to 2018, genetic data detecting the introduction and presence of new strains, and estimates of seropositivity for different districts in Sri Lanka, we derive a data-driven vector (SEI) to human (SEIR) model across 16 regions in Sri Lanka at the weekly time scale. Ablation studies determined the lag effects of time-varying climate factors, allowing delays of up to 12 weeks. The model demonstrates superior predictive performance over a pure machine learning approach for lead times of 5 and 10 weeks on data withheld from model fitting. It further reveals several interesting interpretable findings on drivers, while adjusting for the dynamics and influences of immunity and the introduction of a new strain. The study uncovers strong influences of socioeconomic variables: population density, mobility, household income, and rural versus urban population. It also reveals substantial sensitivity to the diurnal temperature range and precipitation, while mean temperature and humidity appear less important in the study location; additionally, the model indicated sensitivity to the vegetation index, both its maximum and its average. Predictions on testing data reveal high model accuracy. Overall, this study advances the knowledge of dengue transmission in Sri Lanka and demonstrates the importance of hybrid modelling techniques that use biologically informed model structures with flexible, data-driven estimates of model parameters. The findings show the potential both for inference of drivers in situations of complex disease dynamics and for robust forecasting models.
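
The mechanistic backbone, a vector SEI model coupled to a human SEIR model, can be sketched as the ODE system below, integrated with SciPy; the parameter values are illustrative round numbers, and the data-driven, time-varying parameterization of the actual study is omitted.

```python
import numpy as np
from scipy.integrate import odeint

def sei_seir(y, t, a, b, c, mu_v, sigma_v, sigma_h, gamma, Nh, Nv):
    Sv, Ev, Iv, Sh, Eh, Ih, Rh = y
    infect_v = a * b * Sv * Ih / Nh    # mosquitoes biting infectious humans
    infect_h = a * c * Iv * Sh / Nh    # infectious mosquitoes biting humans
    return [mu_v * Nv - infect_v - mu_v * Sv,          # vector S
            infect_v - (sigma_v + mu_v) * Ev,          # vector E
            sigma_v * Ev - mu_v * Iv,                  # vector I
            -infect_h,                                 # human S
            infect_h - sigma_h * Eh,                   # human E
            sigma_h * Eh - gamma * Ih,                 # human I
            gamma * Ih]                                # human R

Nh, Nv = 1e5, 3e5
y0 = [Nv - 10, 0, 10, Nh - 1, 0, 1, 0]
t = np.arange(0, 52 * 7)  # one year, daily steps
params = (0.5, 0.4, 0.4, 1 / 14, 1 / 10, 1 / 6, 1 / 7, Nh, Nv)
sol = odeint(sei_seir, y0, t, args=params)
print("peak human infectious count:", int(sol[:, 5].max()))
```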

Keywords: compartmental model, climate, dengue, machine learning, social-economic

Procedia PDF Downloads 60
21649 Estimation of Longitudinal Dispersion Coefficient Using Tracer Data

Authors: K. Ebrahimi, Sh. Shahid, M. Mohammadi Ghaleni, M. H. Omid

Abstract:

The longitudinal dispersion coefficient is a crucial parameter for 1-D water quality analysis of riverine flows. So far, different empirical equations for estimating the coefficient have been developed based on various case studies. The main objective of this paper is to develop an empirical equation for estimating the coefficient for a riverine flow. For this purpose, a set of salt tracer experiments was conducted at three sections located downstream of a lengthy canal. Tracer data were measured at three mixing lengths along the canal: 45, 75, and 100 m. According to the results, the coefficients obtained from the newly developed empirical equation show an encouraging level of agreement with the theoretical values.
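
A common way to extract the coefficient from such tracer data is the method of temporal moments, where the dispersion coefficient follows from the growth of the concentration-curve variance between two sections, D = U²(σ²₂ − σ²₁) / (2(t̄₂ − t̄₁)). The sketch below applies this to synthetic curves, so the numbers are illustrative only.

```python
import numpy as np
from scipy.integrate import trapezoid

def temporal_moments(t, c):
    """Centroid time and temporal variance of a concentration-time curve."""
    m0 = trapezoid(c, t)
    tbar = trapezoid(t * c, t) / m0
    var = trapezoid((t - tbar) ** 2 * c, t) / m0
    return tbar, var

# Synthetic Gaussian tracer clouds at two downstream sections.
t = np.linspace(0, 600, 2000)                      # s
c1 = np.exp(-((t - 150) ** 2) / (2 * 25 ** 2))     # section 1
c2 = np.exp(-((t - 400) ** 2) / (2 * 60 ** 2))     # section 2

U = 0.4  # mean flow velocity between sections, m/s (assumed)
(t1, v1), (t2, v2) = temporal_moments(t, c1), temporal_moments(t, c2)
D = U ** 2 * (v2 - v1) / (2 * (t2 - t1))
print(f"longitudinal dispersion coefficient D = {D:.3f} m^2/s")
```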

Keywords: coefficients, dispersion, river, tracer, water quality

Procedia PDF Downloads 375
21648 Game-Based Learning in a Higher Education Course: A Case Study with Minecraft Education Edition

Authors: Salvador Antelmo Casanova Valencia

Abstract:

This study documents the use of the Minecraft Education Edition application to explore immersive game-based learning environments. We analyze the contributions of fourth-year university students pursuing a degree in Administrative Computing at the Universidad Michoacana de San Nicolás de Hidalgo. Descriptive data and statistical inference are reported within a quasi-experimental design using the Wilcoxon test, with the instruments providing data validation. Game-based learning in immersive environments necessarily implies greater student participation and commitment, resulting in greater motivation and significant improvements in learning, and promoting cooperation and autonomous learning.
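
For the quasi-experimental comparison, the Wilcoxon signed-rank test on paired pre/post scores can be run as sketched below with SciPy; the scores shown are invented stand-ins for the study's instruments.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired scores for one student group, before and after
# the Minecraft Education Edition intervention (same 12 students).
pre = np.array([55, 60, 48, 70, 62, 58, 65, 50, 72, 61, 57, 66])
post = np.array([68, 64, 60, 75, 70, 66, 70, 58, 80, 63, 65, 74])

# Non-parametric paired test: H0 is "no median difference pre vs post".
stat, p = wilcoxon(pre, post)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
if p < 0.05:
    print("significant improvement after the game-based intervention")
```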

Keywords: game-based learning, gamification, higher education, Minecraft

Procedia PDF Downloads 149
21647 Image Quality and Dose Optimisation in Digital and Computed X-Ray Radiography Using a Lumbar Spine Phantom

Authors: Elhussaien Elshiekh

Abstract:

A study was performed to manage and compare radiation doses and image quality during lumbar spine PA and lumbar spine LAT x-ray radiography using computed radiography (CR) and digital radiography (DR). Standard exposure factors such as kV, mAs, and FFD used for imaging the lumbar spine anthropomorphic phantom were obtained from the average exposure factors used with CR in five radiology centres. The lumbar spine phantom was imaged using the CR and DR systems. Entrance surface air kerma (ESAK) was calculated from the x-ray tube output and the patient exposure factors. Images were evaluated using a visual grading system based on the European Guidelines on Quality Criteria for diagnostic radiographic images. The ESAK corresponding to each image was measured at the surface of the phantom. Six experienced specialists evaluated hard copies of all the images; the image score (IS) was calculated for each image as the average score of the six evaluators. The IS value was also used to determine whether an image was diagnostically acceptable. The optimum recommended exposure factors found here for lumbar spine PA and lumbar spine LAT are, respectively, 80 kVp and 25 mAs at 100 cm FFD and 75 kVp and 15 mAs at 100 cm FFD for the CR system, and 80 kVp and 15 mAs at 100 cm FFD and 75 kVp and 10 mAs at 100 cm FFD for the DR system. For lumbar spine PA, the lowest ESAK values required to obtain a diagnostically acceptable image were 0.80 mGy for the DR and 1.20 mGy for the CR system. Similarly, for the lumbar spine LAT projection, the lowest ESAK values were 0.62 mGy for the DR and 0.76 mGy for the CR system. At standard kVp and mAs values, image quality did not vary significantly between the CR and DR systems, but at higher kVp and mAs values, the DR images were of better quality than the CR images. In addition, the lower limit of entrance skin dose consistent with diagnostically acceptable DR images was 40% lower than that for CR images.
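
ESAK is typically estimated from the tube output with an inverse-square correction and a backscatter factor, roughly ESAK = output(μGy/mAs at 1 m) × mAs × (100 / FSD_cm)² × BSF. The sketch below implements this simple estimate with placeholder values, not the study's calibration data.

```python
def esak_mGy(output_uGy_per_mAs_at_1m, mAs, fsd_cm, bsf=1.35):
    """Entrance surface air kerma estimate (mGy).

    output_uGy_per_mAs_at_1m: measured tube output at 1 m for the chosen kVp
    fsd_cm: focus-to-skin distance; bsf: backscatter factor (assumed value)
    """
    inverse_square = (100.0 / fsd_cm) ** 2
    return output_uGy_per_mAs_at_1m * mAs * inverse_square * bsf / 1000.0

# Illustrative numbers for a lumbar spine PA exposure (80 kVp, 25 mAs),
# with the phantom surface assumed 80 cm from the focus at 100 cm FFD.
print(f"ESAK ~ {esak_mGy(45.0, 25, 80):.2f} mGy")
```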

Keywords: image quality, dosimetry, radiation protection, optimization, digital radiography, computed radiography

Procedia PDF Downloads 37
21646 Determining the Direction of Causality between Creating Innovation and Technology Market

Authors: Liubov Evstigneeva

Abstract:

In this paper, an attempt is made to establish causal nexuses between innovation and international trade in Russia. The topicality of this issue stems from the necessity of choosing policy instruments for economic modernization and the transition to innovative development. A vector autoregression (VAR) model and the Granger test are applied to Russian monthly data from 2005 until the second quarter of 2015. Both lagged import and export at the national level cause innovation; the latter starts to stimulate foreign trade only at a more remote lag. Compared to the aggregate data, the results by patent category are more diverse. Importing technologies from foreign countries stimulates patent activity, while innovations created in Russia are a Granger cause only for imports to the Commonwealth of Independent States.
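
A compact sketch of this workflow, fitting a VAR on monthly series and running a Granger causality test with statsmodels, is shown below; the simulated series are placeholders for the Russian trade and patent data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated monthly stand-ins for innovation (patents), imports, exports.
rng = np.random.default_rng(4)
n = 126  # Jan 2005 - Jun 2015
imports = rng.normal(0, 1, n).cumsum()
patents = 0.4 * np.r_[np.zeros(3), imports[:-3]] + rng.normal(0, 0.5, n)
exports = rng.normal(0, 1, n).cumsum()
df = pd.DataFrame({"patents": np.diff(patents, prepend=0),
                   "imports": np.diff(imports, prepend=0),
                   "exports": np.diff(exports, prepend=0)})

results = VAR(df).fit(maxlags=6, ic="aic")  # lag order chosen by AIC
test = results.test_causality("patents", ["imports"], kind="f")
print(test.summary())  # H0: imports do not Granger-cause patents
```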

Keywords: export, import, innovation, patents

Procedia PDF Downloads 308
21645 DNA Prime/MVTT Boost Enhances Broadly Protective Immune Response against Mosaic HIV-1 Gag

Authors: Wan Liu, Haibo Wang, Cathy Huang, Zhiwu Tan, Zhiwei Chen

Abstract:

The tremendous diversity of HIV-1 has been a major challenge for effective AIDS vaccine development. The mosaic approach offers potential for vaccine designs aiming at global protection: a mosaic HIV-1 Gag antigen provides antigenic breadth for vaccine-elicited immune responses against a wider spectrum of viral strains. However, the enhancement of immune responses by vaccines depends on the strategy used, and heterologous prime/boost regimens have been shown to elicit high levels of immune responses. Here, we investigated whether priming with plasmid DNA delivered by electroporation, followed by boosting with the live replication-competent modified vaccinia virus TianTan vector (MVTT), combined with the mosaic antigenic sequence, could elicit a greater and broader antigen-specific response against HIV-1 Gag in mice. Compared to DNA or MVTT alone, or to the MVTT/MVTT group, the DNA/MVTT regimen resulted in coincidentally high frequencies of broadly reactive, Gag-specific, polyfunctional, long-lived, and cytotoxic CD8+ T cells and an increased anti-Gag antibody titer. Meanwhile, the vaccination upregulated PD-1+ and Tim-3+ CD8+ T cells, myeloid-derived suppressor cells, and Treg cells, balancing the stronger immune response induced. Importantly, the prime/boost vaccination helped control EcoHIV and mesothelioma AB1-gag challenge. The stronger protective Gag-specific immunity induced by the mosaic DNA/MVTT vaccine corroborates the promise of the mosaic approach and the potential of these two acceptably safe vectors to enhance anti-HIV immunity and cancer prevention.

Keywords: DNA/MVTT vaccine, EcoHIV, mosaic antigen, mesothelioma AB1-gag

Procedia PDF Downloads 230
21644 Using Inverted 4-D Seismic and Well Data to Characterise Reservoirs from Central Swamp Oil Field, Niger Delta

Authors: Emmanuel O. Ezim, Idowu A. Olayinka, Michael Oladunjoye, Izuchukwu I. Obiadi

Abstract:

Monitoring of reservoir properties prior to well placement and production is a requirement for optimisation and efficient oil and gas production. This is usually done using well log analyses and 3-D seismic, which are often prone to errors. However, 4-D (time-lapse) seismic, which incorporates numerous 3-D seismic surveys of the same field acquired with the same parameters and portrays transient changes in the reservoir due to production effects over time, can be utilised because it offers better resolution. There is, however, a dearth of information on the applicability of this approach in the Niger Delta. This study was therefore designed to apply 4-D seismic, well log, and geologic data to the monitoring of reservoirs in the EK field of the Niger Delta, aiming to locate bypassed accumulations and ensure effective reservoir management. The field (EK) covers an area of about 1200 km² belonging to the early (18 Ma) Miocene. Data covering two 4-D vintages acquired over a fifteen-year interval were obtained from oil companies operating in the field and analysed to determine the seismic structures, horizons, well-to-seismic tie (WST), and wavelets. Well logs and production history data from fifteen selected wells were also collected from the oil companies. Formation evaluation, petrophysical analysis, and inversion, alongside geological data, were undertaken using Petrel, Shell-nDi, Techlog, and Jason software. Well-to-seismic ties, formation evaluation, and saturation monitoring using petrophysical and geological data and software were used to find bypassed hydrocarbon prospects. The seismic vintages were interpreted, and the amounts of change in the reservoir were defined by the differences in acoustic impedance (AI) inversions of the base and monitor seismic surveys. AI rock properties were estimated from all the seismic amplitudes using controlled sparse-spike inversion, and the estimated rock properties were used to produce AI maps. The structural analysis showed the dominance of NW-SE trending rollover collapsed-crest anticlines in EK, with hydrocarbons trapped northwards. There were good ties in wells EK 27 and EK 39, and the analysed wavelets revealed consistent amplitude and phase for the WST; hence, a good match between the inverted impedance and the well data. Evidence of large pay thickness, ranging from 2875 ms (11420 ft TVDSS) to about 2965 ms, was found around the EK 39 well, with good yield properties. The comparison between the AI of the base and the current monitor, together with the generated AI maps, revealed zones of untapped hydrocarbons and assisted in determining fluid movements. The inverted sections through EK 27 and EK 39 (within 3101 m - 3695 m) indicated depletion in the reservoirs. The present non-uniform gas-oil contact and oil-water contact movements extend from 3554 to 3575 m. The 4-D seismic approach led to better reservoir characterisation, well development, and the location of deeper and bypassed hydrocarbon reservoirs.
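
The core 4-D quantity here is the change in acoustic impedance (AI = density × velocity) between base and monitor surveys; the sketch below forms a time-lapse difference map from two synthetic impedance grids, purely as an illustration of the arithmetic.

```python
import numpy as np

# Synthetic 2-D acoustic impedance grids (kg/m^3 * m/s) for two vintages.
rng = np.random.default_rng(5)
shape = (200, 300)                        # inline x crossline samples
ai_base = rng.normal(6.5e6, 2e5, shape)
ai_monitor = ai_base.copy()
ai_monitor[80:120, 140:200] += 3e5        # impedance hardening from depletion

# Time-lapse difference: production effects show up as non-zero anomalies.
dai = ai_monitor - ai_base
rel = dai / ai_base * 100                 # percent change map
print(f"max |dAI| = {np.abs(dai).max():.2e}, "
      f"mean change in anomaly = {rel[80:120, 140:200].mean():.2f}%")
# Thresholding `rel` delineates swept zones; quiet areas with pay on logs
# are candidates for bypassed hydrocarbons.
```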

Keywords: reservoir monitoring, 4-D seismic, well placements, petrophysical analysis, Niger delta basin

Procedia PDF Downloads 104
21643 Attributes That Influence Respondents When Choosing a Mate in Internet Dating Sites: An Innovative Matching Algorithm

Authors: Moti Zwilling, Srečko Natek

Abstract:

This paper presents an innovative predictive analytics approach to finding the best match between two consumers who are trying to find a partner on internet dating sites. The methodology is based on an analysis of consumer preferences and involves data mining and machine learning search techniques. The study is composed of two parts. The first part examines, by means of descriptive statistics, the correlations between a set of parameters observed between men and women who intend to meet each other through social media, usually the internet. In this part, several hypotheses were examined and statistical analyses were performed. Results show a strong correlation between the matched attributes of men and women with regard to how they present themselves in a social medium such as Facebook. One interesting finding is the strong desire, among most of the respondents, to develop a serious relationship. In the second part, the authors used common data mining algorithms to search for and classify the attributes that most affect the response rate of the other side. Results show that personal presentation and educational background are the attributes that most effectively produce a positive reaction to one's profile from a potential mate.
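
A sketch of the classification step, a decision tree predicting whether a profile gets a positive response from attributes such as presentation quality and education, is given below with scikit-learn; the features and data are invented to illustrate the approach, not taken from the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical profile features: [presentation_score, education_years,
# wants_serious_relationship (0/1), photo_count]
rng = np.random.default_rng(6)
X = np.column_stack([rng.uniform(0, 10, 400), rng.integers(8, 22, 400),
                     rng.integers(0, 2, 400), rng.integers(1, 15, 400)])
# Positive response driven mainly by presentation and education, plus noise.
y = ((0.5 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 1.5, 400)) > 8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("holdout accuracy:", round(tree.score(X_te, y_te), 3))
print(export_text(tree, feature_names=["presentation", "education",
                                       "serious", "photos"]))
```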

Keywords: dating sites, social networks, machine learning, decision trees, data mining

Procedia PDF Downloads 282
21642 Analysis of Cardiovascular Diseases Using Artificial Neural Network

Authors: Jyotismita Talukdar

Abstract:

In this paper, a study has been made of the possibility and accuracy of early prediction of several heart diseases using an Artificial Neural Network (ANN). The study was made in both a noise-free environment and a noisy environment. The data collected for this analysis come from five hospitals: around 1,500 heart patients' records were collected and studied. The data were analysed, and the results were compared with the doctors' diagnoses. It was found that, in the noise-free environment, the accuracy varies from 74% to 92%, and in the noisy environment (2 dB), the accuracy varies from 62% to 82%. In the present study, the four basic attributes considered are blood pressure (BP), fasting blood sugar (FBS), Thalach (THAL), and cholesterol (CHOL). The highest accuracy (93%) was achieved in the case of PPI (post permanent pacemaker implantation), around 79% in the case of CAD (coronary artery disease), 87% in DCM (dilated cardiomyopathy), 89% in the case of RHD & MS (rheumatic heart disease with mitral stenosis), 75% in the case of RBBB + LAFB (right bundle branch block + left anterior fascicular block), and 72% for CHB (complete heart block). The lowest accuracies were obtained in the case of ICMP (ischemic cardiomyopathy), about 38%, and AF (atrial fibrillation), about 60 to 62%.
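
A minimal version of such a four-attribute ANN classifier can be set up as below with scikit-learn's MLPClassifier; the layer sizes and synthetic records are assumptions for illustration, not the paper's network or patient data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic records: [BP, FBS, Thalach, Cholesterol] -> disease / no disease
rng = np.random.default_rng(7)
X = np.column_stack([rng.normal(130, 20, 1500), rng.normal(110, 30, 1500),
                     rng.normal(150, 25, 1500), rng.normal(220, 45, 1500)])
y = ((X[:, 0] > 140) & (X[:, 3] > 240)).astype(int)  # toy labeling rule

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500,
                                  random_state=0))
clf.fit(X[:1200], y[:1200])
print("held-out accuracy:", round(clf.score(X[1200:], y[1200:]), 3))

# Noise robustness check: add Gaussian noise to the test attributes.
X_noisy = X[1200:] + rng.normal(0, X[1200:].std(axis=0) * 0.2, X[1200:].shape)
print("accuracy on noisy inputs:", round(clf.score(X_noisy, y[1200:]), 3))
```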

Keywords: coronary heart disease, chronic stable angina, sick sinus syndrome, cardiovascular disease, cholesterol, Thalach

Procedia PDF Downloads 160
21641 Damage Detection in a Cantilever Beam under Different Excitation and Temperature Conditions

Authors: A. Kyprianou, A. Tjirkallis

Abstract:

Condition monitoring of structures in service is very important, as it provides information about the risk of damage development. One of the essential constituents of structural condition monitoring is the damage detection methodology. In the context of condition monitoring of in-service structures, a damage detection methodology analyses data obtained from the structure while it is in operation. Usually, this means that the data could be affected by operational and environmental conditions in a way that could mask the effects of a possible damage on the data; depending on the damage detection methodology, this could lead to either false alarms or missed existing damage. In this article, a damage detection methodology based on the spatio-temporal continuous wavelet transform (SPT-CWT) analysis of a sequence of experimental time responses of a cantilever beam is proposed. The cantilever is subjected to white and pink noise excitation to simulate different operating conditions. In addition, in order to simulate changing environmental conditions, the cantilever is subjected to heating by a heat gun. The response of the cantilever beam is measured by a high-speed camera. Edges are extracted from the series of images of the beam response captured by the camera, and subsequent processing of the edges gives a series of time responses at 439 points on the beam. This sequence is then analyzed using the SPT-CWT to identify damage. The proposed algorithm was able to clearly identify damage under any condition when the structure was excited by a white noise force. In the case of white noise excitation, the analysis could also reveal the position of the heat gun when it was used to heat the structure, and it could distinguish the different operating conditions, i.e., between responses due to white noise excitation and responses due to pink noise excitation. During pink noise excitation, whereas damage and changing temperature were identified, it was not possible to clearly separate the effect of damage from that of temperature. The methodology proposed in this article enables the separation of the damage effect from the effects of temperature and excitation in data obtained from measurements of a cantilever beam, and it does not require information about the a priori state of the structure.
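
The spatial side of the analysis can be sketched as below: a continuous wavelet transform applied to a beam deflection profile makes a small local slope discontinuity stand out in the fine-scale coefficients. This uses PyWavelets on a synthetic profile and is only a one-dimensional illustration, not the full spatio-temporal method.

```python
import numpy as np
import pywt

# Synthetic deflection profile along 439 measurement points of the beam,
# with a slope discontinuity at point 300 mimicking local damage.
x = np.linspace(0, 1, 439)
profile = x ** 2                               # smooth cantilever-like shape
profile[300:] += 0.05 * (x[300:] - x[300])     # kink introduced by damage

scales = np.arange(1, 32)
coeffs, _ = pywt.cwt(profile, scales, "mexh")  # Mexican hat wavelet

# Fine-scale coefficients peak near the discontinuity (edges trimmed to
# avoid boundary effects of the transform).
fine = np.abs(coeffs[0])
print("damage located near point:", fine[20:-20].argmax() + 20)
```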

Keywords: spatiotemporal continuous wavelet transform, damage detection, data normalization, varying temperature

Procedia PDF Downloads 264
21638 By-Line Analysis of the Determinants of Insurance Premiums: Evidence from the Tunisian Market

Authors: Nadia Sghaier

Abstract:

In this paper, we aim to identify the determinants of life and non-life insurance premiums across different lines for the Tunisian insurance market over a recent period, from 1997 to 2019. The empirical analysis is conducted using linear cointegration techniques in a panel data framework, which allow for both long- and short-run relationships. The results show evidence of a long-run relationship between premiums, losses, and financial variables (stock market indices and the interest rate). Furthermore, we find that the short-run effects of the explanatory variables differ across lines. This finding has important implications for insurance pricing and regulation.

Keywords: insurance premiums, lines, Tunisian insurance market, cointegration approach in panel data

Procedia PDF Downloads 179
21639 Development of a Methodology for Surgery Planning and Control: A Management Approach to Handle the Conflict of High Utilization and Low Overtime

Authors: Timo Miebach, Kirsten Hoeper, Carolin Felix

Abstract:

In times of competitive pressure and demographic change, hospitals have to reconsider their strategies as companies. Given that operations are one of the main sources of income and, at the same time, one of the primary cost drivers, a process-oriented approach and an efficient use of resources seem to be the right way to attain a consistent market position. Efficient operating room occupancy planning is thus an important variable for the success and continued existence of these institutions. A high utilization of resources is essential: a very high, but nevertheless sensible, capacity-oriented utilization of working systems can be realized by avoiding downtimes and through thoughtful occupancy planning. This engineering approach should help hospitals reach their break-even point. The first aim is to establish a strategy point that can be used to generate a planned throughput time. The second aim is to facilitate accurate surgery planning and control through the generation of time modules. More than 100,000 data records of the Hannover Medical School were analyzed. The records contain information about the type of operation conducted, the duration of the individual process steps, and other organization-specific data such as the operating room. Based on this database, a generally valid model was developed to define a strategy point that takes into account the conflict between capacity utilization and low overtime. Furthermore, time modules were generated, which allow simplified and flexible surgery planning and control for the operations manager. The time modules make it possible to reduce the high average idle time of the operating rooms and to minimize the spread of idle times.
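
One way to derive such time modules from historical records is to aggregate per-procedure durations into planning blocks, as sketched below with pandas; the record layout and the 15-minute rounding grid are assumptions for illustration, not the model developed in the paper.

```python
import numpy as np
import pandas as pd

# Stand-in for the surgical record extract: one row per completed operation.
rng = np.random.default_rng(8)
records = pd.DataFrame({
    "procedure": rng.choice(["hip", "knee", "appendectomy"], 3000),
    "duration_min": rng.lognormal(mean=4.3, sigma=0.35, size=3000),
})

def to_time_module(minutes, grid=15):
    """Round a planned duration up to the next grid slot (e.g., 15 min)."""
    return int(np.ceil(minutes / grid) * grid)

# Planned throughput time per procedure: a robust quantile of history,
# so most cases fit their block without producing overtime.
planned = records.groupby("procedure")["duration_min"].quantile(0.8)
modules = planned.apply(to_time_module)
print(modules)  # time module (min) per procedure type
```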

Keywords: capacity, operating room, surgery planning and control, utilization

Procedia PDF Downloads 238
21638 The Effect of Knowledge Management in Lean Organization

Authors: Mehrnoosh Askarizadeh

Abstract:

An ever-changing, globalized world, with new economic and global competitors competing for the same customers and resources, is increasing the pressure on organizations' competitiveness. In addition, organizations face further challenges due to an ever-growing amount of data and the ever-bigger challenge of analyzing that data and keeping it secure. Successful companies are characterized by exploiting their intellectual capital in an efficient manner; thus, the most valuable asset an organization has today is its employees' knowledge. To exploit it, there is a tool that supports easier handling and optimizes the use of knowledge: knowledge management. The theoretical framework, together with a careful review and analysis of interviews and observations, resulted in six essential areas: structure, management, compensation, communication, trust, and motivation. The analysis showed that the scientific articles and literature take different perspectives, use different definitions, and build on different theories, but in essence they all arrive at the same result and conclusion, albeit from different viewpoints: regardless of whether the focus is on management style, rewards, or communication, they all focus on the individual. The conclusion is that organizational culture affects knowledge management and the dissemination of information because of its direct impact on the individual. The largest and most important underlying factor in why we choose to participate in improvement work or share knowledge is our motivation: motivation is the reason for, and the reason behind, our actions.

Keywords: lean, lean production, knowledge management, information management, motivation

Procedia PDF Downloads 506
21637 Assessing Natura 2000 Network Effectiveness in Landscape Conservation: A Case Study in Castile and León, Spain (1990-2018)

Authors: Paula García-Llamas, Polonia Díez González, Angela Taboada

Abstract:

In an era marked by unprecedented anthropogenic alterations to landscapes and biodiversity, the consequential loss of fauna, flora, and habitats poses a grave concern, and it is imperative to evaluate our capacity to manage and mitigate such changes effectively. This study scrutinizes the efficacy of the Natura 2000 Network (NN2000) in landscape conservation within the autonomous community of Castile and León (Spain) from 1990 to 2018. Leveraging land use change maps from the European Corine Land Cover database across four subperiods (1990-2000, 2000-2006, 2006-2012, and 2012-2018), we quantified alterations occurring both within NN2000 protected sites and within a 5 km buffer zone. Additionally, we spatially assessed land use/land cover patterns of change, considering fluxes among the various habitat types defined within NN2000. Our findings reveal that the areas protected under NN2000 were particularly susceptible to change, with the most significant transformations observed during the 1990-2000 period. Predominant change processes include secondary succession and scrubland formation due to the cessation of land use, deforestation, and agricultural intensification. While NN2000 demonstrates efficacy in curtailing urbanization and industrialization within buffer zones, its management measures have proven insufficient to safeguard landscapes against the dynamic changes witnessed between 1990 and 2018, especially those related to rural abandonment.
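
Change quantification of this kind typically reduces to a land cover transition (cross-tabulation) matrix between two classified rasters; the sketch below builds one with pandas from two synthetic class grids standing in for the 1990 and 2018 Corine maps.

```python
import numpy as np
import pandas as pd

classes = ["urban", "agriculture", "forest", "scrubland"]
rng = np.random.default_rng(9)

# Synthetic classified rasters for two dates (same extent and cell size).
lc1990 = rng.integers(0, 4, (100, 100))
lc2018 = lc1990.copy()
change = rng.random((100, 100)) < 0.15          # ~15% of cells change
lc2018[change & (lc1990 == 1)] = 3              # abandonment: agri -> scrub
lc2018[change & (lc1990 == 2)] = 3              # deforestation -> scrub

# Transition matrix: rows = 1990 class, columns = 2018 class, in cells.
matrix = pd.crosstab(pd.Series(lc1990.ravel(), name="1990"),
                     pd.Series(lc2018.ravel(), name="2018"))
matrix.index = matrix.columns = classes
print(matrix)
```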

Keywords: Corine land cover, land cover changes, site of community importance, special protection area

Procedia PDF Downloads 36
21636 Environmental Sanitation Parameters Recording in Refugee-Migrants Camps in Greece, 2017

Authors: Crysovaladou Kefaloudi, Kassiani Mellou, Eirini Saranti-Papasaranti, Athanasios Koustenis, Chrysoula Botsi, Agapios Terzidis

Abstract:

The recent migration crisis led to a vast movement of migrants and refugees to Greece, which created an urgent need for hosting settlements. Taking into account the protection of public health from possible pathogens related to water and food supply, as well as waste and sewage accumulation, a 'Living Conditions Recording Form' was created in the context of the 'PHILOS' European Program, funded by the Asylum, Migration and Integration Fund (AMIF) of the EU's DG Migration and Home Affairs, in order to assess a number of environmental sanitation parameters in refugee–migrant camps on the mainland. The assessment will be completed by the end of July. From March to June 2017, mobile unit teams composed of health inspectors of sub-action 2 of 'PHILOS' assessed living conditions in twenty-two of thirty-one camps, and Stata was used for the statistical analysis of the obtained information. Variables were grouped into the following categories: 1) camp administration, 2) hosted population number, 3) accommodation, 4) heating installations, 5) personal hygiene, 6) sewage collection and disposal, 7) water supply, 8) waste collection and management, 9) pest control, 10) fire safety, and 11) food handling and safety. Preliminary analysis of the results showed that camp administration was performed in 90% of the camps by a public authority with the coordination of various NGOs. The median number of hosted persons was 222, ranging from 62 to 3,200, and the median number of hosted persons per accommodation unit was 4 in 19 camps. Heating facilities were provided in 86.1% of camps. In 18.2% of the camps, one personal hygiene facility was available per 6 people, ranging in the rest of the camps from 1 per 3 to 1 per 20 hosted refugees and migrants. Waste and sewage collection was performed adequately, according to the population's demand, in all recorded camps. In 90% of camps, water was supplied through the central water supply system, and in 85% of camps the quantity and quality of the water supplied inside the camp were regularly monitored for microbial and chemical indices. Pest control was implemented in 86.4% of the camps, as were fire safety measures. Food was supplied by catering companies in 50% of the camps, and the quality and quantity of the food were monitored on a regular basis. In 77% of camps, food was prepared by the hosted population, with proper storage conditions available. Furthermore, in all camps the hosted population was provided with personal hygiene items, and health and sanitary educational programs were implemented in 77.3% of camps. In conclusion, in the majority of the camps, the environmental sanitation parameters were satisfactory; however, waste and sewage accumulation, as well as inadequate pest control measures, were recorded in some camps. The obtained data have led to a number of recommendations for the improvement of sanitary conditions, disseminated to all relevant stakeholders. Special emphasis was given to the implementation of hygiene measures during food handling by migrants and refugees, as well as to waste and sewage accumulation, taking the population's cultural background into account.

Keywords: environmental sanitation parameters, food borne diseases risk assessment, refugee – migrants camps, water borne diseases risk assessment

Procedia PDF Downloads 210
21635 A Study on Method for Identifying Capacity Factor Declination of Wind Turbines

Authors: Dongheon Shin, Kyungnam Ko, Jongchul Huh

Abstract:

An investigation of wind turbine degradation was carried out using nacelle wind data. The three Vestas V80-2MW wind turbines of the Sungsan wind farm on Jeju Island, South Korea, were selected for this work. The SCADA data of the wind farm for five years were analyzed to draw the power curve of each turbine. A Rayleigh wind speed distribution was assumed in order to calculate the normalized capacity factor from the drawn power curve of the three wind turbines for each year. The result showed that the power output of the three wind turbines declined every year, and the normalized capacity factor decreased by 0.12% per year on average.
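
The normalized capacity factor can be computed by weighting the power curve with a Rayleigh wind speed density, CF = ∫ P(v) f(v) dv / P_rated; the sketch below does this numerically for an illustrative V80-like power curve and an assumed 7 m/s annual mean wind speed (both placeholders, not the SCADA-derived curves of the study).

```python
import numpy as np
from scipy.integrate import trapezoid

# Illustrative power curve points (wind speed m/s -> power kW), V80-like.
v_pts = np.array([0, 4, 6, 8, 10, 12, 14, 16, 25])
p_pts = np.array([0, 65, 400, 1000, 1650, 1950, 2000, 2000, 2000])
P_RATED = 2000.0

def rayleigh_pdf(v, v_mean):
    """Rayleigh density parameterized by the annual mean wind speed."""
    return (np.pi * v / (2 * v_mean ** 2)) * np.exp(
        -np.pi * v ** 2 / (4 * v_mean ** 2))

v = np.linspace(0, 25, 2000)
power = np.interp(v, v_pts, p_pts)          # interpolated power curve
cf = trapezoid(power * rayleigh_pdf(v, 7.0), v) / P_RATED
print(f"normalized capacity factor = {cf:.3f}")
# Repeating this per SCADA year exposes a year-over-year decline in CF.
```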

Keywords: wind energy, power curve, capacity factor, annual energy production

Procedia PDF Downloads 419
21634 Water Quality Calculation and Management System

Authors: H. M. B. N Jayasinghe

Abstract:

Water is found almost everywhere on Earth, yet water resources contain a great deal of pollution, and some diseases can spread to living beings through water. To be clean, water should undergo a number of treatments necessary to make it drinkable, so purification technology for wastewater is a must, and wastewater treatment plants play a major role in these issues. The procedures that follow the water treatment process have traditionally been based on manual calculations and recordings; water purification plants involve many manual processes, which makes the overall process very time-consuming and delays the final evaluation and the chemical and biological treatment steps. To prevent these drawbacks, computerized, programmable calculation and analytical techniques are to be introduced to the laboratory staff. An automated system is a solution that guarantees rational selection. A decision support system is a way to model data and make quality decisions based upon it; it is widely used around the world for various kinds of process automation. Decision support systems that just collect data and organize it effectively are usually called passive models: they do not suggest a specific decision but only reveal information. This web-based system is built on global positioning data, with a map location facility. Its most valuable feature is an SMS and e-mail alert service to inform the appropriate person of a critical issue. The technologies behind the system are HTML, MySQL, PHP, and other web development technologies. Existing computerized water chemistry analysis tools are not far advanced; one example is the swimming pool water quality calculator. The validity of the system has been verified by test runs and comparison with data from an existing plant. The automated system will make work easier, both in productivity and in quality.
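
The calculation core of such a system can be as simple as a weighted-arithmetic water quality index, where each measured parameter is scored against its permissible standard and the scores are combined by weight; the sketch below implements this simplified WQI with placeholder standards, not the system's actual formulas.

```python
# Simplified weighted-arithmetic water quality index (WQI):
# quality rating q_i = 100 * C_i / S_i (measured over standard), combined
# with weights inversely proportional to each standard.
STANDARDS = {"ph_dev": 1.5, "turbidity_ntu": 5.0, "tds_mgl": 500.0,
             "nitrate_mgl": 45.0}

def wqi(sample: dict) -> float:
    weights = {p: 1.0 / s for p, s in STANDARDS.items()}
    total_w = sum(weights.values())
    score = sum(weights[p] * 100.0 * sample[p] / STANDARDS[p]
                for p in STANDARDS)
    return score / total_w

sample = {"ph_dev": 0.8, "turbidity_ntu": 3.2, "tds_mgl": 410, "nitrate_mgl": 12}
index = wqi(sample)
print(f"WQI = {index:.1f}")       # <100 means within permissible limits here
if index > 100:
    print("trigger SMS/e-mail alert to the plant operator")
```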

Keywords: automated system, wastewater, purification technology, map location

Procedia PDF Downloads 235
21633 Exploring Ways Early Childhood Teachers Integrate Information and Communication Technologies into Children's Play: Two Case Studies from the Australian Context

Authors: Caroline Labib

Abstract:

This paper reports on a qualitative study exploring the approaches teachers used to integrate computers or smart tablets into their program planning. Their aim was to integrate ICT into children’s play, thereby supporting children’s learning and development. Data was collected in preschool settings in Melbourne in 2016. Interviews with teachers, observations of teacher interactions with children and copies of teachers’ planning and observation documents informed the study. The paper looks closely at findings from two early childhood settings and focuses on exploring the differing approaches two EC teachers have adopted when integrating iPad or computers into their settings. Data analysis revealed three key approaches which have been labelled: free digital play, guided digital play and teacher-led digital use. Importantly, teacher decisions were influenced by the interplay between the opportunities that the ICT tools offered, the teachers’ prior knowledge and experience about ICT and children’s learning needs and contexts. This paper is a snapshot of two early childhood settings, and further research will encompass data from six more early childhood settings in Victoria with the aim of exploring a wide range of motivating factors for early childhood teachers trying to integrate ICT into their programs.

Keywords: early childhood education (ECE), digital play, information and communication technologies (ICT), play, teachers' interaction approaches

Procedia PDF Downloads 193
21632 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring

Authors: Daniel Fundi Murithi

Abstract:

Data from economic, social, clinical, and industrial studies are often in some way incomplete or incorrect due to censoring, and such data may have adverse effects if used in an estimation problem. We propose the use of maximum likelihood estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that, in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller root mean squared errors than those generated via the Newton-Raphson algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization algorithm performs better than the Newton-Raphson algorithm in all simulation cases under the progressive type-II censoring scheme.
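
To illustrate the Newton-Raphson side of the comparison, the sketch below maximizes a censored Rayleigh log-likelihood by NR iteration; for tractability, it uses a one-parameter Rayleigh (location fixed at zero) with simple right censoring rather than the paper's two-parameter, progressively censored setup.

```python
import numpy as np

rng = np.random.default_rng(10)
lam_true, n, censor_at = 2.0, 500, 3.0
x = rng.rayleigh(scale=lam_true, size=n)
obs, cens = x[x <= censor_at], np.full((x > censor_at).sum(), censor_at)

# Log-likelihood in theta = lambda^2 for right-censored Rayleigh data:
# observed x: log x - log(theta) - x^2/(2 theta); censored c: -c^2/(2 theta)
S = (obs ** 2).sum() + (cens ** 2).sum()
n_obs = len(obs)

theta = np.var(x)  # crude starting value
for it in range(50):                        # Newton-Raphson iterations
    grad = -n_obs / theta + S / (2 * theta ** 2)
    hess = n_obs / theta ** 2 - S / theta ** 3
    step = grad / hess
    theta -= step
    if abs(step) < 1e-10:
        break
print(f"NR converged in {it + 1} iterations: lambda_hat = {theta ** 0.5:.3f}")
# Closed-form check for this simple censoring: theta = S / (2 * n_obs)
assert abs(theta - S / (2 * n_obs)) < 1e-6
```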

Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring

Procedia PDF Downloads 147
21631 The Impact of Financial Risk on Banks’ Financial Performance: A Comparative Study of Islamic Banks and Conventional Banks in Pakistan

Authors: Mohammad Yousaf Safi Mohibullah Afghan

Abstract:

This study of Islamic and conventional banks scrutinizes the effects of credit and liquidity risk on the profitability of Islamic and conventional banks operating in Pakistan. Four Islamic and 18 conventional banks were selected to enrich the comparison of Islamic bank performance with that of conventional banks. The selection of banks for the panel is based on unbalanced quarterly data ranging from the first quarter of 2007 to the last quarter of 2017, collected from the banks' websites and the State Bank of Pakistan. The empirical results are derived using the delta-method test. Return on assets and return on equity are used as significant proxies for bank profitability, while credit and liquidity risks are measured by the ratio of loan loss provisions to total loans and the ratio of liquid assets to total liabilities, respectively. Moreover, following the previous literature, variables such as bank size, bank capital, bank branches, and bank employees are used to control for the impact of factors whose direct and indirect effects on profitability are understood. In conclusion, the study finds that credit risk affects return on assets and return on equity positively, with no significant difference between Islamic and conventional banks in terms of credit risk. Similarly, liquidity risk has a significant impact on bank profitability, though the marginal effect of liquidity risk is higher for Islamic banks than for conventional banks.
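
A skeleton of the kind of panel profitability regression described, ROA on the risk proxies plus controls with bank fixed effects, is given below using statsmodels; the variable names and simulated panel are placeholders, and the paper's delta-method inference step is not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated quarterly bank panel standing in for the 2007Q1-2017Q4 data.
rng = np.random.default_rng(11)
banks = [f"bank{i}" for i in range(22)]
quarters = pd.period_range("2007Q1", "2017Q4", freq="Q")
df = pd.DataFrame([(b, q) for b in banks for q in quarters],
                  columns=["bank", "quarter"])
df["llp_ratio"] = rng.normal(0.03, 0.01, len(df))   # loan loss prov. / loans
df["liq_ratio"] = rng.normal(0.25, 0.05, len(df))   # liquid assets / liabs
df["size"] = rng.normal(10, 1, len(df))             # log total assets
df["roa"] = (0.3 * df.llp_ratio - 0.05 * df.liq_ratio
             + rng.normal(0, 0.01, len(df)))

# Bank fixed effects via C(bank) dummies; standard errors clustered by bank.
fit = smf.ols("roa ~ llp_ratio + liq_ratio + size + C(bank)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["bank"]})
print(fit.params[["llp_ratio", "liq_ratio", "size"]])
```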

Keywords: Islamic and conventional banks, performance, return on equity, return on assets, Pakistan banking sector, profitability

Procedia PDF Downloads 142