Search results for: data mining technique
26696 Moderating Effects of Family Ownership on the Relationship between Corporate Governance Mechanisms and Financial Performance of Publicly Listed Companies in Nigeria
Authors: Ndagi Salihu
Abstract:
Corporate governance mechanisms are control measures for ensuring that all interest groups are equally represented and that management works towards wealth creation in the interest of all. Accordingly, there have been many empirical studies on corporate governance and firm performance during the last three decades. However, little is known about the effects of family ownership on the relationship between corporate governance and firm performance, especially in developing economies such as Nigeria. This limits our understanding of the unique governance dynamics of family ownership with regard to firm performance. This study examined the impact of family ownership on the relationship between governance mechanisms and financial performance of publicly listed companies in Nigeria. The study adopted a quantitative research methodology using a correlational ex post facto design and secondary data from the annual reports and accounts of a sample of 23 listed companies over a period of 5 years (2014-2018). The explanatory variables are board size, board composition, board financial expertise, and board audit committee attributes. Financial performance is proxied by Return on Assets (ROA) and Return on Equity (ROE). A multiple panel regression technique was employed in the analysis, and the study found that family ownership has a significant positive effect on the relationship between corporate governance mechanisms and financial performance of publicly listed firms in Nigeria. This finding is the same for both ROA and ROE. The findings further indicate that board size, board financial expertise, and board audit committee attributes have a significant positive impact on the ROA and ROE of the sample firms after the moderation. Moreover, board composition has a significant positive effect on the financial performance of the sample listed firms in terms of ROA and ROE.
The study concludes that the use of family ownership in the control of firms in Nigeria could improve performance by reducing managers' opportunistic actions as well as agency problems. The study recommends that publicly listed companies in Nigeria should allow significant family ownership of equities and participation in management.
Keywords: profitability, board characteristics, agency theory, stakeholders
Procedia PDF Downloads 142
26695 Emotional Intelligence and General Self-Efficacy as Predictors of Career Commitment of Secondary School Teachers in Nigeria
Authors: Moyosola Jude Akomolafe
Abstract:
Career commitment among employees is crucial to the success of any organization. However, career commitment has been reported to be very low among teachers in public secondary schools in Nigeria. This study, therefore, examined the contributions of emotional intelligence and general self-efficacy to career commitment among secondary school teachers in Nigeria. A descriptive research design of the correlational type was adopted for the study. A stratified random sampling technique was used to select two hundred and fifty (250) secondary school teachers for the study. Standardized instruments, namely the Big Five Inventory (BFI), the Emotional Intelligence Scale (EIS), the General Self-Efficacy Scale (GSES), and the Career Commitment Scale (CCS), were adopted for the study. Three hypotheses were tested at the 0.05 level of significance. Data collected were analyzed through multiple regression analysis to investigate the predictive capacity of emotional intelligence and general self-efficacy on career commitment of secondary school teachers. The results showed that the variables, when taken as a whole, significantly predicted career commitment among secondary school teachers. The relative contribution of each variable revealed that emotional intelligence and general self-efficacy each significantly predicted career commitment among secondary school teachers in Nigeria. The researcher recommended that secondary school teachers should be exposed to emotional intelligence and self-efficacy training to enhance their career commitment.
Keywords: career commitment, emotional intelligence, general self-efficacy, secondary school teachers
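The multiple regression step the abstract describes can be sketched in plain Python. This is a toy illustration only: the scores below are invented, not the study's data, and the model simply fits ordinary least squares via the normal equations with two predictors (emotional intelligence, general self-efficacy) and an intercept.

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small system Ax = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y."""
    Xd = [[1.0] + row for row in X]                 # prepend intercept column
    m = len(Xd[0])
    XtX = [[sum(r[i] * r[j] for r in Xd) for j in range(m)] for i in range(m)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xd, y)) for i in range(m)]
    return solve(XtX, Xty)

# Hypothetical scores: [emotional intelligence, general self-efficacy] -> career commitment
X = [[60, 70], [55, 65], [80, 85], [70, 75], [65, 60], [90, 88]]
y = [3.1, 2.8, 4.2, 3.6, 3.0, 4.6]
b0, b1, b2 = ols(X, y)
print(round(b0, 3), round(b1, 3), round(b2, 3))
```

The fitted coefficients give the relative contribution of each predictor, which is the quantity the abstract reports as significant.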
Procedia PDF Downloads 391
26694 MapReduce Algorithm for Geometric and Topological Information Extraction from 3D CAD Models
Authors: Ahmed Fradi
Abstract:
In a digital world of perpetual evolution and acceleration, where data grow ever more voluminous, rich, and varied, the new software solutions that emerged with the Big Data phenomenon offer companies new opportunities: not only to optimize their business and evolve their production models, but also to reorganize themselves to increase competitiveness and to identify new strategic axes. Design and manufacturing companies, like others, face these challenges; data represent a major asset, provided that they know how to capture, refine, combine, and analyze them. The objective of our paper is to propose a solution for extracting geometric and topological information from databases of 3D CAD models (specifically STEP files), with a dedicated algorithm based on the MapReduce programming paradigm. Our proposal is the first step of our future approach to 3D CAD object retrieval.
Keywords: Big Data, MapReduce, 3D object retrieval, CAD, STEP format
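The MapReduce pattern the abstract builds on can be sketched in a few lines of plain Python. This is a toy illustration with made-up STEP-like records, not the authors' implementation: a mapper emits one (entity_type, 1) pair per record, a shuffle groups pairs by key, and a reducer sums the counts per geometric entity type.

```python
from collections import defaultdict
from functools import reduce

# Toy STEP-like records (illustrative, not a real STEP parser): each line
# instantiates one geometric/topological entity.
step_lines = [
    "#10 = CARTESIAN_POINT('', (0., 0., 0.));",
    "#11 = CARTESIAN_POINT('', (1., 0., 0.));",
    "#12 = EDGE_CURVE('', #10, #11, #20, .T.);",
    "#13 = ADVANCED_FACE('', (#30), #40, .T.);",
]

def map_phase(line):
    """Mapper: emit (entity_type, 1) for each record."""
    entity = line.split("=")[1].strip().split("(")[0].strip()
    return [(entity, 1)]

def shuffle(pairs):
    """Group values by key, as the MapReduce framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the counts for each entity type."""
    return {k: reduce(lambda a, b: a + b, vs) for k, vs in groups.items()}

mapped = [pair for line in step_lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(mapped))
print(counts)
```

In a real deployment, the map and reduce phases would run distributed over a cluster (e.g. Hadoop or Spark) across many STEP files rather than in-process.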
Procedia PDF Downloads 543
26693 Application of Aquatic Plants for the Remediation of Organochlorine Pesticides from Keenjhar Lake
Authors: Soomal Hamza, Uzma Imran
Abstract:
Organochlorine pesticides bioaccumulate in the fat of fish, birds, and animals, through which they enter the human food cycle. Due to their persistence and stability in the environment, many health impacts are associated with them, most of which are carcinogenic in nature. In this study, the levels of organochlorine pesticides in Keenjhar Lake were detected and remediated using a rhizoremediation technique. 14 OC pesticides, namely Aldrin, Dieldrin, Heptachlor, Heptachlor epoxide, Endrin, Endosulfan I and II, DDT, DDE, DDD, and Alpha-, Beta-, and Gamma-BHC, and two plants, namely Water Hyacinth and Salvinia molesta, were used in the system in a pot experiment which ran for 11 days. A consortium was inoculated in both plants to increase their efficiency. Water samples were processed using liquid-liquid extraction. Sediment and root samples were processed using the Soxhlet method followed by clean-up and gas chromatography. Delta-BHC was predominant in all samples, with mean concentrations (ppb) and standard deviations of 0.02 ± 0.14, 0.52 ± 0.68, and 0.61 ± 0.06 in the water, sediment, and root samples, respectively. The highest levels were of Endosulfan II in the water, sediment, and root samples. Water Hyacinth proved to be a better bioaccumulator than Salvinia molesta. The pattern of compound reduction rates by the end of the experiment was Delta-BHC > DDD > Alpha-BHC > DDT > Heptachlor > H. Epoxide > Dieldrin > Aldrin > Endrin > DDE > Endosulfan I > Endosulfan II. No significant difference was observed between the pots with and without the consortium addition. Phytoremediation is a promising technique, but more studies are required to assess the bioremediation potential of different aquatic plants and the plant-endophyte relationship.
Keywords: aquatic plant, bioremediation, gas chromatography, liquid-liquid extraction
Procedia PDF Downloads 154
26692 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models
Authors: Ramin Vafadary, Maryam Khanbaghi
Abstract:
Forecasting electricity load is important for various purposes such as planning, operation, and control. Forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods to forecast electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria, namely, the mean absolute error and the root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data is used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series
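The bagging step can be illustrated with a minimal stdlib-only sketch. This is not the paper's pipeline: a seasonal-naive forecaster stands in for SARIMA, the load profile is invented, and bagging is done by bootstrap-resampling the seasonal residuals to build perturbed copies of the history, forecasting each copy, and averaging.

```python
import random

def seasonal_naive_forecast(history, season=24, horizon=24):
    """Forecast each future hour with the value one season (24 h) earlier."""
    return [history[-season + h] for h in range(horizon)]

def bagged_forecast(history, n_models=25, horizon=24, seed=0):
    """Bagging: bootstrap-resample the daily (lag-24) residuals to build
    perturbed copies of the history, forecast each copy, then average."""
    rng = random.Random(seed)
    residuals = [history[i] - history[i - 24] for i in range(24, len(history))]
    forecasts = []
    for _ in range(n_models):
        noise = [rng.choice(residuals) for _ in history]
        replicate = [h + n for h, n in zip(history, noise)]
        forecasts.append(seasonal_naive_forecast(replicate, horizon=horizon))
    return [sum(step) / n_models for step in zip(*forecasts)]

# Two days of a toy hourly load profile (kW): flat night, daytime peak,
# with the second day 2 kW higher than the first.
day = [30 + 10 * (8 <= h <= 20) for h in range(24)]
history = day + [v + 2 for v in day]
print(bagged_forecast(history))
```

Replacing the stand-in forecaster with a fitted SARIMA model (e.g. statsmodels' SARIMAX) recovers the setup described in the abstract.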
Procedia PDF Downloads 98
26691 Tracing Sources of Sediment in an Arid River, Southern Iran
Authors: Hesam Gholami
Abstract:
Elevated suspended sediment loads in riverine systems resulting from accelerated erosion due to human activities are a serious threat to the sustainable management of watersheds and ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in the catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, as a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework in the estimation of sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. 
The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated using 10 virtual sediment (VS) samples with known source contributions, based on the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central, and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%), and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), our suggested modeling approach is an accurate technique to quantify the sources of sediments in catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran
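The GLUE idea can be sketched in a few lines of stdlib Python. This is a synthetic illustration, not the study's data or tracer suite: one tracer per source, a target whose true mix matches the study's mean contributions, and a Monte Carlo loop that keeps "behavioral" proportion sets whose predicted mixture is close to the target.

```python
import random

# Synthetic example: one geochemical tracer value per sub-basin source,
# and a target sediment sample mixed 20/12/68 % (the study's overall means).
source_tracers = [5.0, 10.0, 20.0]      # illustrative tracer concentrations
true_mix = [0.20, 0.12, 0.68]
target = sum(c * p for c, p in zip(source_tracers, true_mix))

def glue_contributions(n_draws=20000, threshold=0.5, seed=1):
    """GLUE: sample random source proportions, retain 'behavioral' draws whose
    predicted mixture lies within the threshold of the target, and average."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draws):
        raw = [rng.random() for _ in source_tracers]
        s = sum(raw)
        props = [r / s for r in raw]                 # proportions summing to 1
        predicted = sum(c * p for c, p in zip(source_tracers, props))
        if abs(predicted - target) < threshold:      # behavioral criterion
            kept.append(props)
    return [sum(p[i] for p in kept) / len(kept) for i in range(3)]

est = glue_contributions()
print([round(e, 2) for e in est])
```

A real fingerprinting application would use many tracers, a composite likelihood measure, and would report the full behavioral distribution (uncertainty bands), not just the mean.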
Procedia PDF Downloads 77
26690 DCASH: Dynamic Cache Synchronization Algorithm for Heterogeneous Reverse Y Synchronizing Mobile Database Systems
Authors: Gunasekaran Raja, Kottilingam Kottursamy, Rajakumar Arul, Ramkumar Jayaraman, Krithika Sairam, Lakshmi Ravi
Abstract:
The synchronization server maintains a dynamically changing cache, which contains the data items that were requested and collected by the mobile node from the server. The order and presence of tuples in the cache change dynamically according to the frequency of updates performed on the data by the server and client. To synchronize, the data which have been modified by the client and the server at an instant are collected, batched together by the type of modification (insert/update/delete), and sorted according to their update frequencies. This ensures that DCASH (Dynamic Cache Synchronization Algorithm for Heterogeneous Reverse Y Synchronizing Mobile Database Systems) gives priority to frequently accessed data with high usage. An optimal memory management algorithm is proposed to manage data items according to their frequency, theorems were written to show that current mobile data activity is reverse Y in nature, and experiments were run on 2G and 3G networks with various mobile devices to show the reduced response time and energy consumption.
Keywords: mobile databases, synchronization, cache, response time
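The batching-and-sorting step described above can be sketched as follows. This is a hedged, simplified reading of the abstract, not the published algorithm: a hypothetical modification log is grouped by operation type and each batch is ordered so the most frequently updated keys synchronize first.

```python
from collections import Counter
from itertools import groupby

# Hypothetical pending modifications collected from client and server:
# (operation, key) pairs in arrival order.
log = [
    ("update", "a"), ("insert", "b"), ("update", "a"),
    ("delete", "c"), ("update", "d"), ("update", "a"), ("insert", "e"),
]

def batch_for_sync(log):
    """Batch modifications by type (insert/update/delete), then sort each
    batch so the most frequently modified keys come first."""
    freq = Counter(key for _, key in log)
    ordered = sorted(set(log), key=lambda t: (t[0], -freq[t[1]], t[1]))
    return {op: [key for _, key in grp]
            for op, grp in groupby(ordered, key=lambda t: t[0])}

batches = batch_for_sync(log)
print(batches)
```

Key "a", modified three times, is placed ahead of "d" in the update batch, mirroring the priority rule the abstract describes.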
Procedia PDF Downloads 408
26689 Forest Policies and Management in Nigeria: Are Households Willing to Pay for Forest Management?
Authors: A. O. Arowolo, M. U. Agbonlahor, P. A. Okuneye, A. E. Obayelu
Abstract:
Nigeria is endowed with abundant forest resources, which have contributed immensely to her economic development and to the livelihood of the rural populace over the years. However, this important resource has continued to shrink because it is not sustainably used, managed, or conserved. The loss of forest cover has far-reaching consequences for the regional, national, and global economy as well as the environment. This paper reviewed Nigerian forest management policies, the challenges, and willingness to pay (WTP) for the management of community forests in Ogun State, Nigeria. Data for the empirical investigation were obtained using a cross-section survey of 160 rural households selected by a multistage sampling technique. The WTP was assessed by dichotomous choice contingent valuation. One major finding is that Nigerian forest reserves were established in order to conserve and manage forest resources but have since been neglected, while management plans are either non-existent or abandoned. Also, the free areas termed community forests, where people have unrestricted access to exploit, are fast diminishing in both content and scale. The mean WTP for sustainable management of community forests in the study area was positive, with a value of ₦389.04/month. The study recommends policy measures aimed at a participatory forest management plan which will include rural communities in the management of community forests. This will help ensure sustainable management of forest resources as well as improve the welfare of rural households.
Keywords: forests, management, WTP, Nigeria
Procedia PDF Downloads 391
26688 Unified Structured Process for Health Analytics
Authors: Supunmali Ahangama, Danny Chiang Choon Poo
Abstract:
Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and the domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the rational unified process (RUP) model and agile methodology.
Keywords: agile methodology, health analytics, unified process model, UML
Procedia PDF Downloads 508
26687 Use of Life Cycle Data for State-Oriented Maintenance
Authors: Maximilian Winkens, Matthias Goerke
Abstract:
State-oriented maintenance enables preventive intervention before the failure of a component and guarantees the avoidance of expensive breakdowns. Because the timing of the maintenance is defined by the component's state, the remaining service life can be exhausted to the limit. The basic requirement for state-oriented maintenance is the ability to determine the component's state. New potential for this is offered by gentelligent components, which are developed at the Collaborative Research Centre 653 of the German Research Foundation (DFG). Because of their sensory ability, they enable the registration of stresses during the component's use. The data is gathered and evaluated. The methodology developed determines the current state of the gentelligent component based on the gathered data. This article presents this methodology as well as current research. The main focus of the current scientific work is to improve the quality of the state determination based on life-cycle data analysis. The methodology developed until now evaluates the data of the usage phase and, based on it, predicts the timing of the gentelligent component's failure. The real failure timing, though, deviates from the predicted one because the effects from the production phase are not considered. The goal of the current research is to develop a methodology for state determination which considers both production and usage data.
Keywords: state-oriented maintenance, life-cycle data, gentelligent component, preventive intervention
Procedia PDF Downloads 498
26686 Hopes of out of School Children with Disabilities for Educational Inclusion
Authors: Afaf Manzoor, Abdul Hameed
Abstract:
Hope of attending school is among the most effective means to overcome the burden of disability and become a self-reliant, productive citizen. The objectives of the study were to develop a valid and reliable scale to measure the hopes of out-of-school children with disabilities and to find the association between hopes and various demographic factors such as type of disability, gender, socio-economic status, and locale. Child hope theory by Snyder (2003) was used as a framework to develop a measure of the hopes of children. According to this theory, hope is defined as a set of cognitions that includes self-perceptions which establish routes to achieve desired goals (pathways) and motivation for achieving the goals (agency). By applying this theory, an inclusion hope scale was developed and validated. The data were collected from 361 out-of-school children with disabilities living in three districts (Lahore, Sheikhupura, Kasur) of Lahore Division by using a cluster sampling technique. Findings of the study indicated that children with intellectual challenges were less hopeful compared to children with other types of disabilities. Similarly, children living in urban areas had higher hopes for inclusion in school. However, no gender disparity was found in terms of being hopeful to attend school. The study also includes recommendations to improve hopes for educational inclusion among out-of-school children with disabilities.
Keywords: out of school children, disability, hopes, inclusion
Procedia PDF Downloads 175
26685 Assessing the Effect of Freezing and Thawing of Coverzone of Ground Granulated Blast-Furnace Slag Concrete
Authors: Abdulkarim Mohammed Iliyasu, Mahmud Abba Tahir
Abstract:
Freezing and thawing are considered to be among the major causes of concrete deterioration in cold regions. This study aimed at assessing the freezing and thawing of concrete within the cover zone by monitoring the formation and melting of ice at different temperatures using an electrical measurement technique. A multi-electrode array system was used to obtain the resistivity of ice formation and melting at discrete depths within the cover zone of the concrete. A total of four concrete specimens (250 mm x 250 mm x 150 mm), made of ordinary Portland cement concrete and of ordinary Portland cement with 65% replaced by ground granulated blast furnace slag (GGBS), were investigated. Specimens with water/binder ratios of 0.35 and 0.65 were produced, ponded with water to ensure full saturation, and then subjected to a freezing and thawing process in a refrigerator within a temperature range of -30 °C to 20 °C over a period of 24 hours. The data were collected and analysed. The obtained results show that the addition of GGBS changed the pore structure of the concrete, which resulted in a decrease in conductance. It was recommended, among others, that the surface of the concrete structure should be protected, as this will help to prevent the instantaneous propagation of ice through to the rebar and to avoid corrosion and subsequent damage.
Keywords: concrete, conductance, deterioration, freezing and thawing
Procedia PDF Downloads 419
26684 Strategic Asset Allocation Optimization: Enhancing Portfolio Performance Through PCA-Driven Multi-Objective Modeling
Authors: Ghita Benayad
Abstract:
Asset allocation, which affects the long-term profitability of portfolios by distributing assets to fulfill a range of investment objectives, is the cornerstone of investment management in the dynamic and complicated world of financial markets. This paper offers a technique for optimizing strategic asset allocation with the goal of improving portfolio performance by addressing the inherent complexity and uncertainty of the market through the use of Principal Component Analysis (PCA) in a multi-objective modeling framework. The study's first section starts with a critical evaluation of conventional asset allocation techniques, highlighting how poorly they capture the intricate relationships between assets and the volatile nature of the market. To overcome these challenges, the project suggests a PCA-driven methodology that isolates the important characteristics influencing asset returns by decreasing the dimensionality of the investment universe. This reduction provides a stronger basis for asset allocation decisions by facilitating a clearer understanding of market structures and behaviors. Using a multi-objective optimization model, the project builds on this foundation by taking into account a number of performance metrics at once, including risk minimization, return maximization, and the accomplishment of predetermined investment goals such as regulatory compliance or sustainability standards. This model provides a more comprehensive representation of investor preferences and portfolio performance in comparison to conventional single-objective optimization techniques. The PCA-driven multi-objective optimization model is then applied to historical market data, aiming to construct portfolios that perform better under different market conditions.
Compared to portfolios produced by conventional asset allocation methodologies, the results show that portfolios optimized using the proposed method display improved risk-adjusted returns, more resilience to market downturns, and better alignment with specified investment objectives. The study also looks at the implications of this PCA technique for portfolio management, including the prospect that it might give investors a more advanced framework for navigating financial markets. The findings suggest that by combining PCA with multi-objective optimization, investors may obtain a more strategic and informed asset allocation that is responsive to both market conditions and individual investment preferences. In conclusion, this capstone project advances the field of financial engineering by creating a sophisticated asset allocation optimization model that integrates PCA with multi-objective optimization. In addition to raising questions about the current state of asset allocation, the proposed method of portfolio management opens up new avenues for research and application in the area of investment techniques.
Keywords: asset allocation, portfolio optimization, principal component analysis, multi-objective modelling, financial market
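The core PCA step the paper relies on (extracting the dominant factor driving asset returns) can be illustrated with a stdlib-only sketch. The return matrix below is invented for illustration: two assets are constructed to move together, so the first principal component, found here by power iteration on the sample covariance matrix, should load mainly on them.

```python
import math

# Toy daily returns for four assets (rows = days); assets 0 and 1 move
# together, so one component should capture most of the variance.
returns = [
    [ 0.010,  0.011, -0.002,  0.001],
    [-0.008, -0.007,  0.003, -0.001],
    [ 0.012,  0.010, -0.001,  0.002],
    [-0.009, -0.010,  0.002, -0.002],
    [ 0.007,  0.008, -0.003,  0.000],
]

def covariance(returns):
    """Sample covariance matrix of the asset return columns."""
    n, m = len(returns), len(returns[0])
    means = [sum(r[j] for r in returns) / n for j in range(m)]
    return [[sum((r[i] - means[i]) * (r[j] - means[j]) for r in returns) / (n - 1)
             for j in range(m)] for i in range(m)]

def first_principal_component(cov, iters=200):
    """Power iteration: repeatedly apply the covariance matrix to a vector
    and renormalize; it converges to the leading eigenvector (PC1)."""
    v = [1.0] * len(cov)
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

pc1 = first_principal_component(covariance(returns))
print([round(x, 2) for x in pc1])
```

In practice one would use a linear algebra library for the full eigendecomposition and keep as many components as needed to explain a chosen fraction of variance before passing the reduced factors to the multi-objective optimizer.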
Procedia PDF Downloads 49
26683 An Assessment of Financial Viability and Sustainability of Hydroponics Using Reclaimed Water Using LCA and LCC
Authors: Muhammad Abdullah, Muhammad Atiq Ur Rehman Tariq, Faraz Ul Haq
Abstract:
In developed countries, sustainability measures are widely accepted and acknowledged as crucial for addressing environmental concerns. Hydroponics, a soilless cultivation technique, has emerged as a potentially sustainable solution, as it can reduce water consumption, land use, and environmental impacts. However, hydroponics may not be economically viable, especially when using reclaimed water, which may entail additional costs and risks. This study aims to address the critical question of whether hydroponics using reclaimed water can achieve a balance between sustainability and financial viability. Life Cycle Assessment (LCA) and Life Cycle Cost (LCC) analysis will be integrated to assess whether hydroponics is environmentally sustainable and economically viable. Life cycle assessment, or LCA, is a methodology for assessing the environmental impacts associated with all stages of the life cycle of a commercial product, process, or service, while life cycle cost (LCC) is an approach that assesses the total cost of an asset over its life cycle, including initial capital costs and maintenance costs. The expected benefits of this study include supporting evidence-based decision-making for policymakers, farmers, and stakeholders involved in agriculture. By quantifying environmental impacts and economic costs, this research will facilitate informed choices regarding the adoption of hydroponics with reclaimed water. It is believed that the outcomes of this research will help achieve a sustainable approach to agricultural production, aligning with sustainability goals while considering economic factors through the adoption of the hydroponic technique.
Keywords: hydroponic, life cycle assessment, life cycle cost, sustainability
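The LCC calculation described above reduces to a discounted-cost sum, which can be sketched as follows. All figures here are illustrative assumptions, not results from the study.

```python
def life_cycle_cost(capital, annual_costs, discount_rate):
    """LCC = initial capital + present value of each year's operating costs."""
    pv_operating = sum(cost / (1 + discount_rate) ** year
                       for year, cost in enumerate(annual_costs, start=1))
    return capital + pv_operating

# Illustrative figures (not from the paper): a small hydroponic unit on reclaimed water.
capital = 12000.0           # pumps, trays, water-treatment hookup
annual = [1500.0] * 10      # energy, nutrients, maintenance over a 10-year life
lcc = life_cycle_cost(capital, annual, discount_rate=0.05)
print(round(lcc, 2))
```

Comparing this figure against the LCC of a conventional soil-based alternative, alongside the LCA impact scores, is what lets the viability and sustainability questions be answered together.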
Procedia PDF Downloads 74
26682 A Hybrid System for Boreholes Soil Sample
Authors: Ali Ulvi Uzer
Abstract:
Data reduction is an important topic in the field of pattern recognition applications. The basic concept is the reduction of voluminous amounts of data down to the meaningful parts. The Principal Component Analysis (PCA) method is frequently used for data reduction. The Support Vector Machine (SVM) method is a discriminative classifier formally defined by a separating hyperplane. In other words, given labeled training data, the algorithm outputs an optimal hyperplane which categorizes new examples. This study offers a hybrid approach that uses PCA for data reduction and Support Vector Machines (SVM) for classification. In order to assess the accuracy of the suggested system, soil samples taken from two boreholes were used. The classification accuracies for this dataset were obtained using the ten-fold cross-validation method. As the results suggest, this system, which operates through dimensionality reduction, is a feasible system for faster recognition of the dataset, so our study results appear to be very promising.
Keywords: feature selection, sequential forward selection, support vector machines, soil sample
Procedia PDF Downloads 458
26681 Geochemical and Petrological Survey in Northern Ethiopia Basement Rocks for Investigation of Gold and Base Metal Mineral Potential in Finarwa, Southeast Tigray, Ethiopia
Authors: Siraj Beyan Mohamed, Woldia University
Abstract:
The study was conducted in the northern Ethiopian basement rocks, in the Finarwa area and its surroundings, south-eastern Tigray. From the field observations, the geology of the area has been described and mapped based on the mineral composition, texture, structure, and colour of both fresh and weathered rocks. Inductively coupled plasma mass spectrometry (ICP-MS) and atomic absorption spectrometry (AAS) were conducted to analyse gold and base metal mineralization. The ore minerals under the microscope are commonly base metal sulphides: pyrrhotite, chalcopyrite, and pentlandite, occurring in variable proportions. Galena, chalcopyrite, pyrite, and gold minerals are hosted in quartz veins. Pyrite occurs both in the quartz veins and in the enclosing rocks as a primary mineral. The base metal sulphides occur as disseminated, vein-filling, and replacement types. In the geochemical analysis results, determination of the threshold of geochemical anomalies is directly related to the identification of mineralization information. The stream sediment and soil samples indicated that the most promising mineralization occurring in the prospect area is of gold (Au), copper (Cu), and zinc (Zn). This is also supported by the abundance of chalcopyrite and sphalerite in some highly altered samples. The stream sediment geochemical survey data show relatively higher values for zinc compared to Pb and Cu. The moderate concentration of base metals in some of the samples indicates the availability of base metal mineralization in the study area, requiring further investigation. The rock and soil geochemistry shows a significant concentration of gold, with maximum values of 0.33 ppm and 0.97 ppm, in the south-western part of the study area.
In Finarwa, artisanal gold mining has become an increasingly widespread economic activity of the local people, undertaken by socially differentiated groups with a wide range of education levels and economic backgrounds and incorporating a wide variety of labour-intensive activities without mechanisation.
Keywords: gold, base metal, anomaly, threshold
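The anomaly-threshold determination mentioned in the abstract is classically done with the mean plus two standard deviations of the background population. The sketch below uses invented stream-sediment zinc values, not the survey's data, to show the rule.

```python
import math

# Illustrative stream-sediment Zn concentrations (ppm); not from the survey.
zn_ppm = [42, 55, 48, 51, 60, 47, 44, 58, 46, 52, 49, 140, 50, 45, 155]

mean = sum(zn_ppm) / len(zn_ppm)
std = math.sqrt(sum((x - mean) ** 2 for x in zn_ppm) / (len(zn_ppm) - 1))
threshold = mean + 2 * std   # classical mean + 2*sigma anomaly threshold

anomalies = [x for x in zn_ppm if x > threshold]
print(round(threshold, 1), anomalies)
```

Samples above the threshold are flagged as anomalous and become targets for follow-up investigation; more robust variants use the median and median absolute deviation to reduce the influence of outliers on the threshold itself.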
Procedia PDF Downloads 128
26680 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification
Authors: Samiah Alammari, Nassim Ammour
Abstract:
When a massive number of tasks is provided successively to a deep learning process, good performance of the model requires preserving the previous tasks' data to retrain the model for each upcoming classification. Otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for remote sensing hyperspectral image region classification. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes during the new task learning, and the second module tries to learn how to replicate the data of the previous tasks by discovering the latent data structure of the new task dataset. We conducted experiments on the Indian Pines HSI dataset. The results confirm the capability of the proposed method.
Keywords: continual learning, data reconstruction, remote sensing, hyperspectral image segmentation
Procedia PDF Downloads 270
26679 Local Differential Privacy-Based Data-Sharing Scheme for Smart Utilities
Authors: Veniamin Boiarkin, Bruno Bogaz Zarpelão, Muttukrishnan Rajarajan
Abstract:
The manufacturing sector is a vital component of most economies, which makes it the target of a large number of cyberattacks, while disruption in operation may lead to significant economic consequences. Adversaries aim to disrupt the production processes of manufacturing companies, gain financial advantages, and steal intellectual property by getting unauthorised access to sensitive data. Access to sensitive data helps organisations to enhance their production and management processes. However, the majority of the existing data-sharing mechanisms are either susceptible to different cyber attacks or heavy in terms of computation overhead. In this paper, a privacy-preserving data-sharing scheme for smart utilities is proposed. First, a customer's privacy adjustment mechanism is proposed to make sure that end-users have control over their privacy, as required by the latest government regulations, such as the General Data Protection Regulation. Secondly, a local differential privacy-based mechanism is proposed to ensure the privacy of the end-users by hiding real data based on the end-user preferences. The proposed scheme may be applied to different industrial control systems, whereas in this study, it is validated for energy utility use cases consisting of smart, intelligent devices. The results show that the proposed scheme may guarantee the required level of privacy with an expected relative error in utility.
Keywords: data-sharing, local differential privacy, manufacturing, privacy-preserving mechanism, smart utility
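The basic local differential privacy primitive, randomized response, illustrates the "hide real data locally, correct in aggregate" idea the abstract describes. This is a generic sketch, not the paper's mechanism: each device flips its private bit with a probability set by the privacy budget epsilon, and the collector unbiases the aggregate.

```python
import math
import random

def randomized_response(bit, epsilon, rng):
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it.
    Each user perturbs locally, so the collector never sees raw data."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if rng.random() < p_truth else 1 - bit

def estimate_mean(reports, epsilon):
    """Unbias the aggregate by inverting the known flipping probability."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

rng = random.Random(42)
true_bits = [1 if i % 10 < 3 else 0 for i in range(20000)]   # true mean = 0.3
reports = [randomized_response(b, epsilon=1.0, rng=rng) for b in true_bits]
print(round(estimate_mean(reports, epsilon=1.0), 3))
```

Lower epsilon gives stronger privacy for each device but a larger expected relative error in the recovered statistic, which is exactly the privacy-utility trade-off the abstract reports.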
Procedia PDF Downloads 79
26678 Changes in the Subjective Interpretation of Poverty Due to COVID-19: The Case of a Peripheral County of Hungary
Authors: Eszter Siposne Nandori
Abstract:
The paper describes how the subjective interpretation of poverty changed during the COVID-19 pandemic. The results of data collection at the end of 2020 are compared to the results of a similar survey from 2019. The methods of systematic data collection are used to collect data about the beliefs of the population about poverty. The analysis is carried out in Borsod-Abaúj-Zemplén County, one of the most backward areas in Hungary. The paper concludes that poverty is mainly linked to material values, and this did not change from 2019 to 2020. Some slight changes, however, highlight the effect of the pandemic: poverty was increasingly seen as a generational problem in 2020, and another important change is that isolation became more closely related to poverty.
Keywords: Hungary, interpretation of poverty, pandemic, systematic data collection, subjective poverty
Procedia PDF Downloads 131
26677 Pattern Recognition Approach Based on Metabolite Profiling Using In vitro Cancer Cell Line
Authors: Amanina Iymia Jeffree, Reena Thriumani, Mohammad Iqbal Omar, Ammar Zakaria, Yumi Zuhanis Has-Yun Hashim, Ali Yeon Md Shakaff
Abstract:
Metabolite profiling is the strategy adopted in this pattern recognition study, which focuses on cell lines of the three deadliest cancer types: lung, breast, and colon cancer. The purpose of this study was to discriminate the VOC patterns of the cancerous and control groups based on metabolite profiling. Sampling was carried out using the cell culture technique. All culture flasks were incubated for up to 72 hours, and data collection started after 24 hours. Each sample run took 24 minutes to complete. The comparative metabolite patterns were identified by headspace solid-phase micro-extraction (HS-SPME) sampling coupled with gas chromatography-mass spectrometry (GC-MS). The main experimental variables, such as oven temperature and time, were optimized by response surface methodology (RSM) to obtain the optimal conditions. Volatiles were identified through the National Institute of Standards and Technology (NIST) mass spectral database and retention time libraries. To improve reliability, it is of crucial importance to eliminate background noise, so only data from the 3rd to the 17th minute were selected for statistical analysis. Targeted metabolites, annotated as known compounds with a peak area greater than 0.5 percent, were highlighted and subsequently treated statistically. The volatiles produced contain hundreds to thousands of compounds; the data were therefore reduced by chemometric analysis, such as principal component analysis (PCA), as a preliminary step before being passed to a pattern classifier for identification of the VOC samples. The volatile organic compound profiles were shown to be significantly distinguishable between the cancerous and control groups.
Keywords: in vitro cancer cell line, metabolite profiling, pattern recognition, volatile organic compounds
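As a sketch of the chemometric reduction step, PCA on a peak-area matrix can be computed directly from the SVD of the mean-centred data. The peak-area values below are synthetic stand-ins, not real GC-MS measurements:

```python
import numpy as np

np.random.seed(1)
# Hypothetical peak-area matrix: rows = headspace samples, columns = VOC peaks
# (in a real run these would come from the 3rd-17th-minute chromatogram window).
control = np.random.normal(loc=1.0, scale=0.1, size=(10, 6))
cancer = np.random.normal(loc=1.6, scale=0.1, size=(10, 6))
X = np.vstack([control, cancer])

# PCA via SVD on the mean-centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                       # sample coordinates in PC space
explained = S**2 / np.sum(S**2)          # variance ratio per component

print(explained[0].round(2))             # PC1 dominates when groups separate
```

On such data the two groups fall on opposite sides of zero along PC1, which is the preliminary separation a pattern classifier would then formalise.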
Procedia PDF Downloads 370
26676 The Response of the Central Bank to the Exchange Rate Movement: A Dynamic Stochastic General Equilibrium-Vector Autoregressive Approach for Tunisian Economy
Authors: Abdelli Soulaima, Belhadj Besma
Abstract:
The paper examines the central bank's response to movements in the nominal exchange rate and evaluates its effects on the volatility of output growth and inflation. A hybrid method, the dynamic stochastic general equilibrium vector autoregression (DSGE-VAR), is proposed for analyzing this policy experiment in a small-scale open economy, namely Tunisia. The paper contributes to the empirical literature by applying this model, which is rarely used in this context, to Tunisian data. The question of how strongly the central bank should respond to the exchange rate is of particular interest in the Tunisian case. To improve the estimation, the Bayesian technique is carried out on the sample 1980:Q1 to 2011:Q4. Our results reveal that the central bank should not react, or should only react softly, to the exchange rate. The variance decomposition shows that overall inflation volatility is more pronounced under the fixed exchange rate regime for most shocks, except the productivity and interest rate shocks. Output volatility is also higher under this regime for the majority of shocks, excepting the foreign interest rate and interest rate shocks.
Keywords: DSGE-VAR modeling, exchange rate, monetary policy, Bayesian estimation
Procedia PDF Downloads 302
26675 An Encapsulation of a Navigable Tree Position: Theory, Specification, and Verification
Authors: Nicodemus M. J. Mbwambo, Yu-Shan Sun, Murali Sitaraman, Joan Krone
Abstract:
This paper presents a generic data abstraction that captures a navigable tree position. The mathematical modeling of the abstraction encapsulates the current tree position, which can be used to navigate and modify the tree. The encapsulation of the tree position in the data abstraction specification avoids the use of explicit references and aliasing, thereby simplifying verification of (imperative) client code that uses the data abstraction. To ease the tasks of such specification and verification, a general tree theory, rich with mathematical notations and results, has been developed. The paper contains an example to illustrate automated verification ramifications. With sufficient tree theory development, automated proving seems plausible even in the absence of a special-purpose tree solver.
Keywords: automation, data abstraction, maps, specification, tree, verification
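The core idea — clients navigate and modify the tree through an encapsulated position rather than holding raw node references — can be illustrated in plain Python. This is a sketch of the concept only; the paper's actual abstraction is given as a formal specification with verification support, and the class and method names here are hypothetical:

```python
class TreeNode:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

class TreePosition:
    """Encapsulates a navigable current position in a tree.

    Clients move the position with down/up and edit at the position with
    relabel; they never receive node references, so no aliases to interior
    nodes can escape - the property that simplifies client-code verification.
    """
    def __init__(self, root):
        self._node = root
        self._path = []                    # stack of parent nodes above the position

    def label(self):
        return self._node.label

    def down(self, i):                     # descend to the i-th child
        self._path.append(self._node)
        self._node = self._node.children[i]

    def up(self):                          # return to the parent
        self._node = self._path.pop()

    def relabel(self, new_label):          # modify the tree at the current position
        self._node.label = new_label

# usage: navigate to a leaf, modify it, navigate back
t = TreeNode("root", [TreeNode("a"), TreeNode("b", [TreeNode("c")])])
pos = TreePosition(t)
pos.down(1); pos.down(0)
pos.relabel("c*")
pos.up(); pos.up()
print(pos.label(), t.children[1].children[0].label)   # root c*
```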
Procedia PDF Downloads 168
26674 Accurate Position Electromagnetic Sensor Using Data Acquisition System
Authors: Z. Ezzouine, A. Nakheli
Abstract:
This paper presents a high-precision electromagnetic sensor system (HPESS) for position measurement that is applicable to moving object detection. The authors have developed a high-performance position sensor prototype dedicated to a students' laboratory. The challenge was to obtain a highly accurate, real-time sensor able to calculate position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy without direct contact. The output signal can then be fed to an electronic circuit. The voltage output change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor signal monitoring. The data are then recorded and viewed through a user interface written in National Instruments LabVIEW. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, inexpensive transducer for highly sophisticated control systems.
Keywords: electromagnetic sensor, accuracy, data acquisition, position measurement
Procedia PDF Downloads 287
26673 The Quality of the Presentation Influences the Audience's Perceptions
Authors: Gilang Maulana, Dhika Rahma Qomariah, Yasin Fadil
Abstract:
Purpose: This research measures the influence of presentation quality on the target audience's perception of the information presented. Design/Methodology/Approach: This research uses a quantitative research method with primary data. The population is students of the Economics Faculty of Semarang State University, sampled by purposive sampling. Data were collected via a questionnaire administered to 30 respondents and analyzed with descriptive statistics. Result: Presentation quality has a positive influence on audience perception: the higher the quality of the presentation, the better the audience's perception. Limitation: Respondents were limited to only 30 people.
Keywords: quality of presentation, presentation, audience, perception, Semarang State University
Procedia PDF Downloads 394
26672 Rapid Soil Classification Using Computer Vision with Electrical Resistivity and Soil Strength
Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, P. L. Goh, Grace H. B. Foo, M. L. Leong
Abstract:
This paper presents the evaluation of various soil testing methods, such as the four-probe soil electrical resistivity method and the cone penetration test (CPT), that can complement a newly developed rapid soil classification scheme based on computer vision, to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from the local construction industry are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups ("Good Earth" and "Soft Clay") based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual surveys, so that proper treatment and usage can be exercised. However, this process is time-consuming and labor-intensive, so a rapid classification method is needed at the SGs. Four-probe soil electrical resistivity and CPT were evaluated for their feasibility as additions to the computer vision system, to further develop this non-destructive and instantaneous classification method. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). A previous study found that the ANN model coupled with the apparent soil electrical resistivity (ρ) can classify soils into "Good Earth" and "Soft Clay" in less than a minute, with an accuracy of 85% on selected representative soil images. To further improve the technique, the following three items were targeted for addition to the computer vision scheme: ρ measured using a set of four probes arranged in a Wenner array, the soil strength measured using a modified mini cone penetrometer, and w measured using a set of time-domain reflectometry (TDR) probes.
Laboratory proof-of-concept was conducted through a series of seven tests with three types of soils: "Good Earth", "Soft Clay", and a mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that the ρ, w, and CPT measurements can be analyzed collectively to classify soils into "Good Earth" or "Soft Clay" and are feasible complements to the computer vision system.
Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification
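The GLCM textural parameters at the heart of the image-analysis step can be computed directly. This minimal sketch builds the co-occurrence matrix for a single pixel offset and derives the standard contrast feature; the two 4-level patches are illustrative toy textures, not soil images:

```python
import numpy as np

def glcm(image: np.ndarray, levels: int, dx: int = 1, dy: int = 0) -> np.ndarray:
    """Grey Level Co-occurrence Matrix for one pixel offset, normalised to probabilities."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p: np.ndarray) -> float:
    """GLCM contrast: sum_ij (i - j)^2 * p(i, j); larger for rougher texture."""
    i, j = np.indices(p.shape)
    return float(np.sum((i - j) ** 2 * p))

# Two hypothetical 4-level texture patches: uniform vs alternating stripes.
smooth = np.zeros((8, 8), dtype=int)
striped = np.tile([0, 3], (8, 4))
print(contrast(glcm(smooth, 4)), contrast(glcm(striped, 4)))   # 0.0 9.0
```

A feature vector of such GLCM statistics (contrast, energy, homogeneity, ...) per image is what would then be fed to the ANN classifier.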
Procedia PDF Downloads 243
26671 Examining Statistical Monitoring Approach against Traditional Monitoring Techniques in Detecting Data Anomalies during Conduct of Clinical Trials
Authors: Sheikh Omar Sillah
Abstract:
Introduction: Monitoring is an important means of ensuring the smooth implementation and quality of clinical trials. For many years, traditional site monitoring approaches have been critical in detecting data errors, but they are not optimal at identifying fabricated and implanted data, or non-random data distributions, that may significantly invalidate study results. The objective of this paper was to provide recommendations, based on best statistical monitoring practices, for detecting data-integrity issues suggestive of fabrication and implantation early in study conduct, to allow implementation of meaningful corrective and preventive actions. Methodology: Electronic bibliographic databases (Medline, Embase, PubMed, Scopus, and Web of Science) were used for the literature search, and both qualitative and quantitative studies were sought. Search results were uploaded into the EPPI-Reviewer software, and only publications written in English from 2012 onwards were included in the review. Gray literature not considered to present reproducible methods was excluded. Results: A total of 18 peer-reviewed publications were included in the review. The publications demonstrated that traditional site monitoring techniques are not efficient at detecting data anomalies. By specifying project-specific parameters, such as laboratory reference range values and visit schedules, with appropriate interactive data monitoring, statistical monitoring can give study teams early signals of data anomalies. The review further revealed that statistical monitoring is useful for identifying unusual data patterns that might reveal issues affecting data integrity or potentially impacting study participants' safety. However, subjective measures may not be good candidates for statistical monitoring.
Conclusion: The statistical monitoring approach requires a combination of education, training, and experience sufficient to implement its principles in detecting data anomalies for the statistical aspects of a clinical trial.
Keywords: statistical monitoring, data anomalies, clinical trials, traditional monitoring
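One elementary form of the statistical monitoring the review advocates is a leave-one-out comparison of site-level summary statistics: a site whose mean is far from the distribution of the other sites' means is a candidate for fabricated or implanted data. The sites, the variable, and the threshold below are illustrative assumptions:

```python
from statistics import mean, stdev

def flag_anomalous_sites(site_values, z_threshold=3.0):
    """Leave-one-out check: flag any site whose mean lies more than
    z_threshold standard deviations from the mean of the other sites' means."""
    site_means = {s: mean(v) for s, v in site_values.items()}
    flagged = []
    for s, m in site_means.items():
        others = [v for t, v in site_means.items() if t != s]
        spread = stdev(others)
        if spread > 0 and abs(m - mean(others)) > z_threshold * spread:
            flagged.append(s)
    return flagged

# Hypothetical systolic blood pressure readings per site; site D looks implanted.
sites = {
    "A": [118, 122, 121, 119], "B": [120, 117, 123, 121],
    "C": [119, 120, 122, 118], "D": [150, 151, 149, 150],
}
print(flag_anomalous_sites(sites))   # ['D']
```

Real central statistical monitoring would also inspect variances, digit preference, and within-subject correlation, but the leave-one-out structure above is the basic pattern.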
Procedia PDF Downloads 83
26670 Expression of PGC-1 Alpha Isoforms in Response to Eccentric and Concentric Resistance Training in Healthy Subjects
Authors: Pejman Taghibeikzadehbadr
Abstract:
Background and Aim: PGC-1 alpha is a transcription factor that was first detected in brown adipose tissue. Since its discovery, PGC-1 alpha has been known to facilitate beneficial adaptations such as mitochondrial biogenesis and increased angiogenesis in skeletal muscle following aerobic exercise. The purpose of this study was therefore to investigate the expression of PGC-1 alpha isoforms in response to eccentric and concentric resistance training in healthy subjects. Materials and Methods: Ten healthy men were randomly divided into two groups (5 subjects in the eccentric group and 5 in the concentric group). Isokinetic contraction protocols consisted of eccentric and concentric knee extension at maximum effort and an angular velocity of 60 degrees per second, with the torques assigned to each subject set to match the workload in both protocols. Contractions consisted of a maximum of 12 sets of 10 repetitions for the right leg, with a rest of 30 seconds between sets. At the beginning and end of the study, a biopsy of the vastus lateralis muscle was performed; biopsies were taken at both the distal and proximal ends of the muscle. To evaluate the expression of the PGC1α-1 and PGC1α-4 genes, tissue was analyzed in each group using the real-time PCR technique. Data were analyzed using the dependent t-test and analysis of covariance, with SPSS 21 and Excel 2013. Results: The intra-group changes in PGC1α-1 after one session of activity were not significant in the eccentric (p = 0.168) or concentric (p = 0.959) group, and the inter-group comparison showed no difference between the two groups (p = 0.681). The intra-group changes in PGC1α-4 after one session of activity were significant in the eccentric group (p = 0.012) and the concentric group (p = 0.02), while the inter-group comparison again showed no difference (p = 0.362).
Conclusion: It seems that the lack of significant changes in the variables of interest is due to an exercise load insufficient to stimulate increases in PGC1α-1 and PGC1α-4. Regarding the acute response, the adaptation findings are mixed and need to be addressed in further research.
Keywords: eccentric contraction, concentric contraction, PGC1α-1 and PGC1α-4, human subjects
Procedia PDF Downloads 80
26669 An ALM Matrix Completion Algorithm for Recovering Weather Monitoring Data
Authors: Yuqing Chen, Ying Xu, Renfa Li
Abstract:
The development of matrix completion theory provides new approaches for data gathering in Wireless Sensor Networks (WSNs). Existing matrix completion algorithms for WSNs mainly consider how to reduce the number of samples, without considering real-time performance when recovering the data matrix. In order to guarantee recovery accuracy and simultaneously reduce recovery time, we propose a new ALM (augmented Lagrange multiplier) algorithm to recover weather monitoring data. Extensive experiments were carried out to investigate the performance of the proposed ALM algorithm under different parameter settings, sampling rates, and sampling models. In addition, we compare the proposed ALM algorithm with existing algorithms from the literature. Experimental results show that the ALM algorithm obtains better overall recovery accuracy with less computing time, which demonstrates that it is an effective and efficient approach for recovering real-world weather monitoring data in WSNs.
Keywords: wireless sensor network, matrix completion, singular value thresholding, augmented Lagrange multiplier
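The singular value thresholding operation named in the keywords is the inner step of ALM-style matrix completion. The following is a simplified SVT completion loop, not the paper's exact ALM formulation; the rank-1 "sensor grid", sampling rate, and parameter values are illustrative assumptions:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: shrink the singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(observed, mask, tau=2.0, step=1.0, iters=2000):
    """Recover a low-rank matrix from its sampled entries with a basic SVT loop."""
    Y = np.zeros_like(observed)
    X = np.zeros_like(observed)
    for _ in range(iters):
        X = svt(Y, tau)                      # low-rank candidate
        Y += step * mask * (observed - X)    # dual update on sampled entries only
    return X

rng = np.random.default_rng(0)
truth = np.outer(np.arange(1.0, 7.0), np.arange(1.0, 9.0))  # rank-1 "sensor grid"
mask = rng.random(truth.shape) < 0.6                        # ~60% of entries sampled
X = complete(truth * mask, mask)
print(round(float(np.abs(mask * (X - truth)).max()), 3))    # residual on sampled entries
```

The unsampled entries are filled in by the low-rank structure alone, which is the property that lets a WSN transmit only a fraction of its readings.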
Procedia PDF Downloads 386
26668 Application of the Finite Window Method to a Time-Dependent Convection-Diffusion Equation
Authors: Raoul Ouambo Tobou, Alexis Kuitche, Marcel Edoun
Abstract:
The FWM (Finite Window Method) is a new meshfree numerical technique for solving problems defined either in terms of PDEs (partial differential equations) or by a set of conservation/equilibrium laws. The principle behind the FWM is that each element of the domain interacts with its neighbors and always tries to stay in equilibrium with them. This leads to a very simple and robust problem-solving scheme, well suited for transfer problems. In this work, we have applied the FWM to an unsteady scalar convection-diffusion equation. Despite its simplicity, it is well known that convection-diffusion problems can be challenging to solve numerically, especially when convection is highly dominant. This has led researchers to adopt the scalar convection-diffusion equation as a benchmark used to analyze and derive the conditions or artifacts needed to numerically solve problems where convection and diffusion occur simultaneously. We show here that the standard FWM can solve convection-diffusion equations in a robust manner, as no adjustments (upwinding or addition of artificial diffusion) were required to obtain good results, even for high Peclet numbers and coarse space and time steps. A comparison was performed between the FWM scheme, a first-order implicit finite volume scheme (upwind scheme), and a third-order implicit finite volume scheme (QUICK scheme). The comparison showed that for equal space and time grid spacing, the FWM yields much better precision than the finite volume schemes, with all schemes having similar computational cost and condition numbers.
Keywords: finite window method, convection-diffusion, numerical technique, convergence
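The first-order upwind scheme used as a comparison baseline is easy to reproduce on the steady one-dimensional version of the benchmark, where the exact solution is known. This sketches the reference scheme only, not the FWM itself; the grid size and Peclet number are arbitrary choices:

```python
import numpy as np

def upwind_solution(n=50, pe=10.0):
    """Steady 1-D convection-diffusion, u*dphi/dx = D*d2phi/dx2 on [0, 1],
    phi(0)=0, phi(1)=1, first-order upwind convection and central diffusion.
    pe is the global Peclet number u*L/D (non-dimensional form, D = 1)."""
    dx = 1.0 / n
    aW = 1.0 / dx + pe            # upwind: all convection taken from upstream (u > 0)
    aE = 1.0 / dx
    aP = aW + aE
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):        # assemble the tridiagonal system for interior nodes
        A[i, i] = aP
        if i > 0:
            A[i, i - 1] = -aW
        if i < n - 2:
            A[i, i + 1] = -aE
    b[-1] = aE * 1.0              # boundary value phi(1) = 1; phi(0) = 0 adds nothing
    return np.linalg.solve(A, b)

n, pe = 50, 10.0
phi = upwind_solution(n, pe)
x = np.linspace(0.0, 1.0, n + 1)[1:-1]
exact = (np.exp(pe * x) - 1.0) / (np.exp(pe) - 1.0)
print(round(float(np.abs(phi - exact).max()), 3))   # first-order numerical diffusion error
```

The printed error reflects the artificial diffusion (of order u*dx/2) that upwinding introduces, which is exactly the kind of adjustment the abstract claims the FWM avoids.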
Procedia PDF Downloads 335
26667 Field Production Data Collection, Analysis and Reporting Using Automated System
Authors: Amir AlAmeeri, Mohamed Ibrahim
Abstract:
Various data points are constantly being measured in the production system, and due to the nature of the wells, these data points (pressure, temperature, water cut, etc.) fluctuate constantly, which requires high-frequency monitoring and collection. Analyzing these parameters manually using spreadsheets and email is a very difficult task. An automated system greatly enhances efficiency, reduces errors and the need for constant emails that take up disk space, and frees up the operator's time for other critical tasks. A huge volume of production data is recorded in an oil field, and it can seem irrelevant when viewed on its own with no context. In order to fully utilize this information, it needs to be properly collected, verified, stored in one common place, and analyzed for surveillance and monitoring purposes. This paper describes how data are recorded by different parties and departments in the field and verified numerous times as they are loaded into a repository; a final check is done before the data are entered into the production monitoring system. Once all of this is collected, various calculations are performed to report allocated production automatically. Calculated production data are also used to monitor well and surface facility performance. Engineers can use them for studies and analyses to ensure the field is performing as it should, to predict and forecast production, and to monitor any changes in wells that could affect field performance.
Keywords: automation, oil production, Cheleken, exploration and production (E&P), Caspian Sea, allocation, forecast
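Two of the steps described — range-checking incoming readings before they enter the repository, and back-allocating measured field production to individual wells — can be sketched as follows. The well names, tag names, limits, and rates are hypothetical, not actual field data:

```python
def validate(readings, limits):
    """Flag readings outside their expected physical ranges before loading."""
    issues = []
    for well, vals in readings.items():
        for tag, value in vals.items():
            lo, hi = limits[tag]
            if not lo <= value <= hi:
                issues.append((well, tag, value))
    return issues

def allocate(field_total, test_rates):
    """Back-allocate measured field production to wells pro-rata to well-test rates."""
    total_test = sum(test_rates.values())
    return {w: field_total * r / total_test for w, r in test_rates.items()}

limits = {"pressure_bar": (0, 400), "temp_C": (0, 150), "water_cut_pct": (0, 100)}
readings = {
    "W-1": {"pressure_bar": 180, "temp_C": 85, "water_cut_pct": 12},
    "W-2": {"pressure_bar": 455, "temp_C": 80, "water_cut_pct": 30},  # out of range
}
print(validate(readings, limits))                      # [('W-2', 'pressure_bar', 455)]
print(allocate(10000.0, {"W-1": 600.0, "W-2": 400.0})) # 60/40 split of field total
```

A production system would add timestamping, persistence, and reconciliation of meter imbalances, but the validate-then-allocate flow mirrors the pipeline the abstract describes.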
Procedia PDF Downloads 158