Search results for: healthcare data security
23003 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm
Authors: Safayat Ali Shaikh
Abstract:
Software has been developed for determining the optimal cropping pattern in an irrigation project, considering a land constraint, a water availability constraint and a pick-up flow constraint, using a modified Simplex Algorithm. Artificial Neural Network (ANN) models have been developed to predict rainfall. An AR(1) model was used to generate 1000 years of rainfall data to train the ANN. Simulation has been done with the expected rainfall data. Eight crops and three soil classes have been considered for the optimization model. The area under each crop and soil class has been quantified using the modified Simplex Algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern
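The abstract describes the optimization only at a high level; the kind of linear program such a cropping-pattern model solves can be sketched on a toy scale: maximize net return over crop areas subject to land and water constraints. All figures below (crop names, returns per hectare, water needs) are invented for illustration, and a coarse grid search stands in for the paper's modified Simplex Algorithm.

```python
# Toy cropping-pattern LP: maximize net return subject to land and
# water constraints. Values are invented; a coarse grid search stands
# in for the modified Simplex Algorithm used in the paper.

LAND_HA = 100.0        # total irrigable land (ha), assumed
WATER_M3 = 450_000.0   # seasonal water availability (m^3), assumed

RETURN = {"rice": 900.0, "wheat": 600.0}   # net return per ha
WATER = {"rice": 6000.0, "wheat": 3000.0}  # water need per ha

def best_plan(step=1.0):
    """Search rice area on a grid; for each, plant as much wheat as the
    remaining land and water allow, and keep the highest net return."""
    best = (0.0, 0.0, 0.0)  # (return, rice_ha, wheat_ha)
    a = 0.0
    while a <= LAND_HA:
        b = min(LAND_HA - a, (WATER_M3 - a * WATER["rice"]) / WATER["wheat"])
        if b >= 0:
            ret = a * RETURN["rice"] + b * RETURN["wheat"]
            if ret > best[0]:
                best = (ret, a, b)
        a += step
    return best

ret, rice, wheat = best_plan()
print(f"rice {rice:.0f} ha, wheat {wheat:.0f} ha, return {ret:.0f}")
```

The optimum here sits at the vertex where both constraints bind (50 ha each), which is exactly the kind of corner point the Simplex method visits.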
Procedia PDF Downloads 207
23002 Two-Phase Sampling for Estimating a Finite Population Total in Presence of Missing Values
Authors: Daniel Fundi Murithi
Abstract:
Missing data is a real bane in many surveys. To overcome the problems caused by missing data, partial deletion and single imputation methods, among others, have been proposed. However, problems such as discarding usable data and inaccuracy in reproducing known population parameters and standard errors are associated with them. For regression and stochastic imputation, it is assumed that there is a variable with complete cases to be used as a predictor in estimating missing values in the other variable, and that the relationship between the two variables is linear, which might not be realistic in practice. In this project, we estimate the population total in the presence of missing values in two-phase sampling. Instead of regression or stochastic models, a nonparametric model-based regression is used to impute missing values. An empirical study showed that nonparametric model-based regression imputation is better at reproducing the variance of the population total estimate obtained when there were no missing values, compared to mean, median, regression, and stochastic imputation methods. Although regression and stochastic imputation were better than nonparametric model-based imputation at reproducing the population total estimates obtained when there were no missing values in one of the sample sizes considered, nonparametric model-based imputation may be used when the relationship between the outcome and predictor variables is not linear.
Keywords: finite population total, missing data, model-based imputation, two-phase sampling
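One simple instance of nonparametric, model-based imputation is local averaging: fill each missing y with the mean y of the k complete cases nearest in x, which makes no linearity assumption. This k-nearest-neighbour sketch and its data are illustrative assumptions, not the authors' exact estimator.

```python
def knn_impute(pairs, k=3):
    """Fill missing y-values (None) with the mean y of the k complete
    cases nearest in x -- a simple nonparametric, model-based
    alternative to linear-regression imputation."""
    complete = [(x, y) for x, y in pairs if y is not None]
    out = []
    for x, y in pairs:
        if y is None:
            nearest = sorted(complete, key=lambda p: abs(p[0] - x))[:k]
            y = sum(p[1] for p in nearest) / len(nearest)
        out.append((x, y))
    return out

# y depends on x nonlinearly (y = x^2); a straight-line regression
# imputation would be misspecified here, while local averaging is not.
data = [(1, 1.0), (2, 4.0), (3, None), (4, 16.0), (5, 25.0), (6, None)]
print(knn_impute(data, k=2))
```

Note the usual caveat of local methods: imputation at x = 6 extrapolates beyond the observed cases and is far less reliable than the interior point.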
Procedia PDF Downloads 134
23001 Social Work in Rehabilitation: Improving Practice Through Action Research
Authors: Poglajen Andrej, Malečihar Špela
Abstract:
Social work in rehabilitation needs constant development and improvement of its practitioners. This became even more evident during the COVID-19 pandemic, at times when outside sources of help, care and support were non-existent or access to such sources was severely limited. Social workers are, at our core, researchers of the rehabilitated world – from a personal and intrapersonal to a systemic perspective. This is also why an action research method was used to see whether clinical social work practice can be further improved. The first stage of research showed that action research and social work practice share many core values, whereas the implementation of new behaviour principle was severely lacking and thus became the main focus of the follow-up research. Twenty randomly selected case files of clinical social work practice in rehabilitation were qualitatively analyzed, and the potential benefits of action research on practice were assessed in the process of intervention, while also gathering feedback on its usefulness from the patients themselves using pre- and post-evaluation forms in a mixed-methods approach. The implementation of new behaviour principle was recognized as a potential improving factor of clinical social work practice in most analyzed cases, while it wasn't deemed necessary in all of them. Potential improvements from the newly implemented behaviour span different areas of life and were also noted in the feedback from the rehabilitants. Despite the benefits for practice improvement, the inclusion of and focus on the implementation of new behaviour principle also caused additional workload, lack of time and stressful situations for the practitioners, which showed the need to address certain systemic obstacles in the context of social work in healthcare in Slovenia.
Keywords: action research, practice, rehabilitation, social work
Procedia PDF Downloads 164
23000 The Effects of Multiple Levels of Intelligence in an Algebra 1 Classroom
Authors: Abigail Gragg
Abstract:
The goal of this research study was to determine whether implementing Howard Gardner's multiple levels of intelligence would enhance student achievement levels in an Algebra 1 College Preparatory class. This was conducted within every class by incorporating one of the eight levels of intelligence into small-group work in stations. Every class was conducted utilizing small-group instruction. Achievement levels were measured through various forms of collected data that expressed student understanding in class through formative assessments versus student understanding on summative assessments. The data samples included assessments (i.e., summative and formative assessments), observable data, video recordings, a daily log book, student surveys, and checklists kept during the observation periods. Formative assessments were analyzed during each class period to measure in-class understanding. Summative assessments were dissected question by question for accuracy to review the effects of each intelligence implemented. The data were collated into a coding workbook for further analysis to derive the resulting themes of the research. These themes are: 1) there was no correlation between the multiple levels of intelligence and enhanced student achievement, 2) bodily-kinesthetic intelligence was the intelligence that showed the most improvement on test questions, and 3) of all the intelligences, interpersonal intelligence most enhanced student understanding in class.
Keywords: stations, small group instruction, multiple levels of intelligence, mathematics, Algebra 1, student achievement, secondary school, instructional pedagogies
Procedia PDF Downloads 114
22999 The Impact of Climate Change on the Spread of Potato Pests in Kazakhstan
Authors: R. Zh. Abdukerim, D. A. Absatarova, A. T. Aitbayeva, M. A. Askarova, S. T. Turuspekova, E. V. Zhunus
Abstract:
The resilience of agricultural systems at the global level to climate change, and their ability to recover, determines the prospects for food security on a global scale, since climate change will lead to changes in temperatures, precipitation and weather conditions, and to mass outbreaks of harmful organisms. The issue of adaptation to climate change in the agricultural sector is one of the priorities of Kazakhstan's Development Strategy for the period up to 2050, as Kazakhstan is an agro-industrial country in which agriculture plays an important economic role. Kazakhstan is the largest potato producer in Central Asia, accounting for about 60% of the total vegetable production, which makes the problem of increasing yield and quality urgent. The control of harmful organisms plays an important role in solving this issue, because climate change can lead to an increase in the number of harmful organisms and, accordingly, to a complete loss of harvest.
Keywords: potato pests, Colorado potato beetle, soil pests, global climate change
Procedia PDF Downloads 68
22998 Performance Analysis of Multichannel OCDMA-FSO Network under Different Pervasive Conditions
Authors: Saru Arora, Anurag Sharma, Harsukhpreet Singh
Abstract:
To meet the growing need for high data rates and bandwidth, various efforts have been made towards efficient communication systems. Optical Code Division Multiple Access (OCDMA) over a Free Space Optics (FSO) communication system is an effective means of providing transmission at a high data rate with a low bit error rate and a low amount of multiple access interference. This paper demonstrates the OCDMA over FSO communication system up to a range of 7000 m at a data rate of 5 Gbps. Initially, the 8-user OCDMA-FSO system is simulated, and pseudo-orthogonal codes are used for encoding. Also, a simulative analysis of various performance parameters, like power and core effective area, that affect the bit error rate (BER) of the system is carried out. The simulative analysis reveals that the length of the transmission is limited by the multiple access interference (MAI) effect, which arises as the number of users in the system increases.
Keywords: FSO, PSO, bit error rate (BER), OptiSystem simulation, multiple access interference (MAI), Q-factor
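The BER and Q-factor figures such simulations report are linked by the standard Gaussian-noise approximation BER = ½·erfc(Q/√2). This is a textbook relation, not the paper's simulation setup, but it shows why Q ≈ 6 is the usual threshold quoted for a 10⁻⁹ error rate.

```python
import math

def q_to_ber(q):
    """Approximate BER of an intensity-modulated optical link from its
    Q-factor, under the Gaussian-noise model: BER = 0.5*erfc(Q/sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

# Q = 6 corresponds to the commonly quoted BER of about 1e-9
for q in (3, 6, 7):
    print(f"Q = {q}: BER ≈ {q_to_ber(q):.2e}")
```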
Procedia PDF Downloads 368
22997 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method
Authors: Luh Eka Suryani, Purhadi
Abstract:
Poisson regression is a non-linear regression model with a response variable in the form of count data that follows a Poisson distribution. A pair of count variables that show high correlation can be modeled by bivariate Poisson regression. The numbers of infant deaths and maternal deaths are count data that can be analyzed in this way. Poisson regression assumes equidispersion, where the mean and variance values are equal. However, actual count data may have a variance greater or less than the mean (overdispersion and underdispersion). Violations of this assumption can be overcome by applying Generalized Poisson Regression. The characteristics of each regency can affect the number of cases that occur; this can be handled by a spatial analysis called geographically weighted regression. This study analyzes the numbers of infant deaths and maternal deaths based on conditions in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling is done with adaptive bisquare kernel weighting, which produces 3 regency groups based on infant mortality rate and 5 regency groups based on maternal mortality rate. Variables that significantly influence the numbers of infant and maternal deaths are the percentages of pregnant women who visit health workers at least 4 times during pregnancy, pregnant women who receive Fe3 tablets, obstetric complications handled, clean-household and healthy behavior, and married women whose first marriage was under the age of 18.
Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion
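The equidispersion assumption discussed above has a one-line diagnostic: the variance-to-mean ratio (dispersion index) of the counts, which is near 1 for Poisson-like data and above 1 when overdispersion calls for a generalized model. The toy count vectors below are invented to show one equidispersed and one overdispersed case.

```python
from statistics import mean, pvariance

def dispersion_index(counts):
    """Variance-to-mean ratio of count data: ~1 suggests equidispersion
    (Poisson), >1 overdispersion, <1 underdispersion."""
    return pvariance(counts) / mean(counts)

equi = [0, 1, 2, 3, 4]       # variance equals mean here
over = [0, 0, 0, 10, 0]      # clumped counts: variance >> mean
print(dispersion_index(equi), dispersion_index(over))
```

In practice the check is run on residual-based statistics from the fitted model rather than the raw counts, but the intuition is the same.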
Procedia PDF Downloads 164
22996 A Fully-Automated Disturbance Analysis Vision for the Smart Grid Based on Smart Switch Data
Authors: Bernardo Cedano, Ahmed H. Eltom, Bob Hay, Jim Glass, Raga Ahmed
Abstract:
The deployment of smart grid devices such as smart meters and smart switches (SS) supported by a reliable and fast communications system makes automated distribution possible, and thus, provides great benefits to electric power consumers and providers alike. However, more research is needed before the full utility of smart switch data is realized. This paper presents new automated switching techniques using SS within the electric power grid. A concise background of the SS is provided, and operational examples are shown. Organization and presentation of data obtained from SS are shown in the context of the future goal of total automation of the distribution network. The description of application techniques, the examples of success with SS, and the vision outlined in this paper serve to motivate future research pertinent to disturbance analysis automation.
Keywords: disturbance automation, electric power grid, smart grid, smart switches
Procedia PDF Downloads 310
22995 Estimating Air Particulate Matter 10 Using Satellite Data and Analyzing Its Annual Temporal Pattern over Gaza Strip, Palestine
Authors: Abdallah A. A. Shaheen
Abstract:
Gaza Strip faces economic and political issues such as conflict, siege and urbanization; all of these have led to an increase in air pollution over Gaza Strip. In this study, particulate matter 10 (PM10) concentration over Gaza Strip has been estimated from Landsat Thematic Mapper (TM) and Landsat Enhanced Thematic Mapper Plus (ETM+) data, based on a multispectral algorithm. Simultaneously, in-situ measurements of the corresponding particulate were acquired for the selected time period. Landsat and ground data for eleven years were used to develop the algorithm, while four years of data (2002, 2006, 2010 and 2014) were used to validate its results. The developed algorithm gives the highest regression coefficient, R = 0.86, with an RMSE of 9.71 µg/m³ and a p-value of approximately 0. Validation of the algorithm shows that calculated PM10 strongly correlates with measured PM10, indicating the high efficiency of the algorithm for mapping PM10 concentration during the years 2000 to 2014. Overall results show an increase in minimum, maximum and average yearly PM10 concentrations, with a similar trend over the urban area. The rate of urbanization has been evaluated by supervised classification of the Landsat imagery. Urban sprawl from 2000 to 2014 resulted in a high concentration of PM10 in the study area.
Keywords: PM10, Landsat, atmospheric reflectance, Gaza Strip, urbanization
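The validation step reported above (regression coefficient R and RMSE between satellite-derived and measured PM10) can be sketched with ordinary least squares. The reflectance/PM10 pairs below are invented, and the paper's actual algorithm is multispectral, not a single-predictor line; the sketch only shows how R and RMSE are computed against ground truth.

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares y = a + b*x, returning (a, b, r, rmse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    b = sxy / sxx
    a = my - b * mx
    preds = [a + b * x for x in xs]
    rmse = math.sqrt(sum((y - p) ** 2 for y, p in zip(ys, preds)) / n)
    r = sxy / math.sqrt(sxx * syy)   # Pearson correlation
    return a, b, r, rmse

# Invented atmospheric-reflectance / ground-station PM10 pairs
refl = [0.10, 0.15, 0.20, 0.25, 0.30]
pm10 = [40.0, 52.0, 61.0, 70.0, 83.0]
a, b, r, rmse = fit_line(refl, pm10)
print(f"PM10 = {a:.1f} + {b:.1f}*refl, R = {r:.3f}, RMSE = {rmse:.2f} µg/m³")
```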
Procedia PDF Downloads 258
22994 Simulation IDM for Schedule Generation of Slip-Form Operations
Authors: Hesham A. Khalek, Shafik S. Khoury, Remon F. Aziz, Mohamed A. Hakam
Abstract:
A slipforming operation's linearity is a source of planning complications, and the operation is usually subject to bottlenecks at any point, so careful planning is required in order to achieve success. On the other hand, discrete-event simulation (DES) concepts can be applied to simulate and analyze construction operations and to efficiently support construction scheduling. Nevertheless, the preparation of input data for construction simulation is very challenging, time-consuming and a source of human error. Therefore, to enhance the benefits of using DES in construction scheduling, this study proposes an integrated module that establishes a framework for automating the generation of time schedules and decision support for slipform construction projects, particularly through the project feasibility study phase, by exchanging data between project data stored in an intermediate database, the DES, and scheduling software. Using the stored information, the proposed system creates construction task attributes (e.g., activity durations, material quantities and resource amounts); the DES then uses all the given information to create a proposal for the construction schedule automatically. This research demonstrates a flexible slipform project modeling, rapid scenario-based planning and schedule generation approach that may be of interest to both practitioners and researchers.
Keywords: discrete-event simulation, modeling, construction planning, data exchange, scheduling generation, EZStrobe
Procedia PDF Downloads 382
22993 Small Micro and Medium Enterprises Perception-Based Framework to Access Financial Support
Authors: Melvin Mothoa
Abstract:
Small, micro and medium enterprises are very significant for the development of their market economies. They are the main creators of new working places, and they form a vital core of the market economy in countries across the globe. Access to finance is identified as crucial for small, micro, and medium-sized enterprises for their growth and innovation. This paper proposes a perception-based SMME framework to aid access to financial support. Furthermore, the study addresses issues that impede SMMEs in South Africa from obtaining finance from financial institutions. The framework will be tested against data collected from 200 small, micro and medium enterprises in the Gauteng province of South Africa. The study adopts a quantitative method, and the delivery of self-administered questionnaires to SMMEs will be the primary data collection tool. Structural equation modeling will be used to further analyse the collected data.
Keywords: finance, small business, growth, development
Procedia PDF Downloads 120
22992 Kinetics, Equilibrium and Thermodynamics of the Adsorption of Triphenyltin onto NanoSiO₂/Fly Ash/Activated Carbon Composite
Authors: Olushola S. Ayanda, Olalekan S. Fatoki, Folahan A. Adekola, Bhekumusa J. Ximba, Cecilia O. Akintayo
Abstract:
In the present study, the kinetics, equilibrium and thermodynamics of the adsorption of triphenyltin (TPT) from TPT-contaminated water onto a nanoSiO₂/fly ash/activated carbon composite were investigated in a batch adsorption system. Equilibrium adsorption data were analyzed using the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich (D-R) isotherm models. Pseudo-first- and second-order, Elovich and fractional power models were applied to test the kinetic data, and, in order to understand the mechanism of adsorption, thermodynamic parameters such as ΔG°, ΔS° and ΔH° were also calculated. The results showed very good compliance with the pseudo-second-order equation, while the Freundlich and D-R models fit the experimental data. Approximately 99.999% of TPT was removed from an initial concentration of 100 mg/L TPT at 80 °C, a contact time of 60 min, pH 8 and a stirring speed of 200 rpm. Thus, the nanoSiO₂/fly ash/activated carbon composite could be used as an effective adsorbent for the removal of TPT from contaminated water and wastewater.
Keywords: isotherm, kinetics, nanoSiO₂/fly ash/activated carbon composite, tributyltin
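The pseudo-second-order model the kinetic data were found to obey is commonly fitted in its linearized form t/qₜ = 1/(k·qₑ²) + t/qₑ, so a straight-line fit of t/qₜ against t recovers the equilibrium capacity qₑ (from the slope) and the rate constant k (from the intercept). The sketch below generates synthetic data from assumed parameters and recovers them, purely to illustrate the fitting procedure; qₑ = 25 mg/g and k = 0.01 g/(mg·min) are invented values.

```python
# Pseudo-second-order kinetics: t/q_t = 1/(k*qe^2) + t/qe.
# Data below are synthetic, generated from qe = 25, k = 0.01.

def q_model(t, qe, k):
    """Closed-form pseudo-second-order uptake at time t."""
    return k * qe**2 * t / (1 + k * qe * t)

times = [5, 10, 20, 40, 60, 90]
q = [q_model(t, 25.0, 0.01) for t in times]

# Linear least squares on (t, t/q_t)
ys = [t / qt for t, qt in zip(times, q)]
n = len(times)
mt, my = sum(times) / n, sum(ys) / n
slope = sum((t - mt) * (y - my) for t, y in zip(times, ys)) / \
        sum((t - mt) ** 2 for t in times)
intercept = my - slope * mt

qe_fit = 1 / slope                 # slope = 1/qe
k_fit = slope**2 / intercept       # intercept = 1/(k*qe^2) = slope^2/k
print(f"qe ≈ {qe_fit:.2f} mg/g, k ≈ {k_fit:.4f} g/(mg*min)")
```

Because the synthetic data follow the model exactly, the fit returns the generating parameters; with real data, the quality of this line is what "compliance with the pseudo-second-order equation" refers to.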
Procedia PDF Downloads 296
22991 A Comparative Analysis of Islamic Bank Efficiency in the United Kingdom and Indonesia during Eurozone Crisis Using Data Envelopment Analysis
Authors: Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum, Achsania Hendratmi
Abstract:
The purpose of this study is to determine and compare the level of efficiency of Islamic banks in Indonesia and the United Kingdom during the eurozone sovereign debt crisis. This study uses a quantitative non-parametric approach, Data Envelopment Analysis (DEA) under the VRS assumption, and a statistical tool, the Mann-Whitney U-test. The samples are 11 Islamic banks in Indonesia and 4 Islamic banks in the United Kingdom. This research used the intermediation approach. The input variables consist of total deposits, assets, and the cost of labour. The output variables consist of financing and profit/loss. This study shows that the efficiency of Islamic banks in Indonesia and the United Kingdom varied and fluctuated during the observation period. There is no significant difference in the efficiency performance of Islamic banks in Indonesia and the United Kingdom.
Keywords: data envelopment analysis, efficiency, eurozone crisis, Islamic bank
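The Mann-Whitney U statistic used for the two-group comparison can be computed directly from its definition: count, over all cross-group pairs, how often a value from one sample outranks a value from the other (ties count one half). The efficiency scores below are invented, and no tie correction or p-value computation is included.

```python
def mann_whitney_u(xs, ys):
    """U statistic for two independent samples: the number of (x, y)
    pairs with x > y, counting ties as 0.5. No tie correction."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Invented yearly DEA efficiency scores for two groups of banks
indo = [0.81, 0.74, 0.92, 0.68, 0.88]
uk = [0.79, 0.85, 0.71, 0.90]
u1 = mann_whitney_u(indo, uk)
u2 = mann_whitney_u(uk, indo)
print(u1, u2, u1 + u2 == len(indo) * len(uk))  # identity: U1 + U2 = n1*n2
```

Here U1 equals half of n1·n2, i.e. neither group systematically outranks the other, mirroring the paper's "no significant difference" finding.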
Procedia PDF Downloads 329
22990 Knowledge Representation and Inconsistency Reasoning of Class Diagram Maintenance in Big Data
Authors: Chi-Lun Liu
Abstract:
Requirements modeling and analysis are important to successful information systems maintenance. Unified Modeling Language (UML) class diagrams are a useful standard for modeling information systems. To the best of our knowledge, there is a lack of a systems development methodology described by the organism metaphor, whose core concept is adaptation. Using knowledge representation and reasoning approaches and ontologies to adopt new requirements has become emergent in recent years. This paper proposes an organic methodology, based on constructivism theory, which is a knowledge representation and reasoning approach to analyzing new requirements in class diagram maintenance. The process and rules in the proposed methodology automatically analyze inconsistencies in the class diagram. In the big data era, developing an automatic tool based on the proposed methodology to analyze large amounts of class diagram data is an important future research topic.
Keywords: knowledge representation, reasoning, ontology, class diagram, software engineering
Procedia PDF Downloads 246
22989 Investigating Self-Confidence Influence on English as a Foreign Language Student English Language Proficiency Level
Authors: Ali A. Alshahrani
Abstract:
This study aims to identify Saudi English as a Foreign Language (EFL) students' perspectives towards using the English language in their studies. The study explores students' self-confidence and its association with their actual performance in English courses in their different academic programs. A multimodal methodology was used to fulfill the research purpose and answer the research questions. A 25-item survey questionnaire and final examination grades were used to collect data. Two hundred forty-one students agreed to participate in the study; they completed the questionnaire and agreed to release their final grades to be part of the collected data. The data were coded and analyzed with SPSS software. The findings indicated a significant difference between participants' academic programs in students' performance in English courses on the one hand. Students' self-confidence in their English language skills, on the other hand, did not differ significantly between participants' academic programs. Data analysis also revealed no correlational relationship between students' self-confidence level and their language skills and performance. The study raises further questions about other vital factors, such as course instructors' views of the materials, the views of faculty members in the target department, family beliefs in the usefulness of the program, and the expectations of potential employers. These views and beliefs shape the students' preparation process and, therefore, should be explored further.
Keywords: English language intensive program, language proficiency, performance, self-confidence
Procedia PDF Downloads 140
22988 Efficiency of the Slovak Commercial Banks Applying the DEA Window Analysis
Authors: Iveta Řepková
Abstract:
The aim of this paper is to estimate the efficiency of the Slovak commercial banks employing the Data Envelopment Analysis (DEA) window analysis approach during the period 2003-2012. The research is based on unbalanced panel data of the Slovak commercial banks, with an undesirable output included in the analysis of banking efficiency. It was found that the most efficient banks were Postova banka, UniCredit Bank and Istrobanka in the CCR model, and Slovenska sporitelna, Istrobanka and UniCredit Bank in the BCC model. By contrast, the least efficient banks were Privatbanka and Citibank. We found that the largest banks in the Slovak banking market were less efficient than medium-sized and small banks. The paper also finds that during the period 2003-2008 average efficiency was increasing, and then during the period 2010-2011 average efficiency decreased as a result of the financial crisis.
Keywords: data envelopment analysis, efficiency, Slovak banking sector, window analysis
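The general CCR and BCC models used in such studies solve a linear program per bank, but in the special case of a single input and a single output the CCR (constant-returns-to-scale) efficiency score reduces to each unit's output/input ratio divided by the best ratio in the sample. The bank names and figures below are synthetic, chosen only to illustrate that reduced case.

```python
# Single-input, single-output CCR efficiency: output/input ratio
# scaled by the best ratio. The full multi-input/multi-output model
# requires linear programming; this is the degenerate special case.

banks = {                  # (input: deposits, output: loans), synthetic
    "Bank A": (100.0, 80.0),
    "Bank B": (150.0, 90.0),
    "Bank C": (60.0, 54.0),
}

ratios = {name: out / inp for name, (inp, out) in banks.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}

for name, eff in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {eff:.3f}")   # 1.000 marks the efficient frontier
```

Note that Bank B, the largest by deposits, scores lowest here; the paper's finding that the largest Slovak banks were less efficient is an empirical result of the same kind of scoring, not a property of the method.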
Procedia PDF Downloads 363
22987 Using Textual Pre-Processing and Text Mining to Create Semantic Links
Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo
Abstract:
This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and, thus, to generate new concepts and new semantic links. Even using more specific vocabularies within the oil domain, our approach has achieved satisfactory results, suggesting that the proposal can be applied in other domains and languages, requiring only minor adjustments.
Keywords: semantic links, data mining, linked data, SKOS
Procedia PDF Downloads 184
22986 Remote Sensing through Deep Neural Networks for Satellite Image Classification
Authors: Teja Sai Puligadda
Abstract:
Detailed satellite images can serve an important role in geographic study. The quantitative and qualitative information provided by satellite and remote sensing images reduces the complexity and time of work. Data and images are captured at regular intervals by satellite remote sensing systems, and the amount of data collected is often enormous and expands rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural land and forests are all part of satellite image classification. One of the biggest challenges data scientists face while classifying satellite images is finding the classification algorithms best suited to the available data, able to classify images with the utmost accuracy. In order to categorize satellite images, which is difficult due to the sheer volume of data, many academics are turning to deep learning algorithms. As the CNN algorithm gives high accuracy in image recognition problems and automatically detects the important features without any human supervision, and the ANN algorithm stores information on the entire network (Abhishek Gupta, 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) airborne dataset for classifying images. Thus, in this project the algorithms ANN and CNN are implemented, evaluated and compared, and the performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm which gives the lowest bias and lowest variance in solving multi-class satellite image classification is identified.
Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss
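The feature detection that gives a CNN its edge on image data comes from convolving the image with learned filters. The sketch below hand-rolls a single 'valid' 2-D convolution (strictly, cross-correlation, as in most deep learning frameworks) with a fixed vertical-edge kernel on a toy two-band image; it is not the SAT-4 pipeline, just the core operation a convolutional layer repeats with many learned kernels.

```python
def conv2d(image, kernel):
    """'Valid' 2-D convolution (cross-correlation, as in most deep
    learning frameworks) of a 2-D image with a 2-D kernel."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A vertical-edge detector applied to a tiny two-region "image":
# the filter responds only where the dark and bright bands meet.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[1, -1],
          [1, -1]]
print(conv2d(image, kernel))
```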
Procedia PDF Downloads 165
22985 Principal Components Analysis of the Causes of High Blood Pressure at Komfo Anokye Teaching Hospital, Ghana
Authors: Joseph K. A. Johnson
Abstract:
Hypertension affects 20 percent of people aged 55 and upward in Ghana. Of these, almost one-third are unaware of their condition. Up to the age of 55, more men than women have hypertension; after that age, the condition becomes more prevalent among women. Hypertension is significantly more common in African Americans of both sexes than in other racial or ethnic groups. This study was conducted to determine the causes of high blood pressure in the Ashanti Region, Ghana. The study employed one hundred and seventy (170) respondents; the sample population was all the respondents available at the time of data collection. The research was conducted using primary data, where convenience sampling was used to locate the respondents, and a set of questionnaires was used to gather the data. The gathered data were analysed using principal component analysis. The study revealed personal description, lifestyle behaviour and risk awareness as some of the causes of high blood pressure in the Ashanti Region. The study therefore recommends that people be advised to attend to the personal characteristics that may contribute to high blood pressure, such as controlling their temper and reacting well to stressful situations. They must be educated on the factors that may increase their blood pressure, such as the importance of seeing a medical doctor before taking any drug. Public health officers must also make people aware of lifestyle behaviours, such as smoking and drinking alcohol, that are major contributors to high blood pressure.
Keywords: high blood pressure, principal component analysis, hypertension, public health
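Principal component analysis, the method the study relies on, extracts the directions of maximum variance in the responses; grouping questionnaire items that load on the same component is what yields factors like "personal description" or "lifestyle behaviour". A minimal sketch of the first component, via power iteration on the sample covariance matrix, is shown below on invented two-variable data.

```python
import math, random

def first_principal_component(rows, iters=200, seed=0):
    """Leading eigenvector of the sample covariance matrix, found by
    power iteration -- the direction of maximum variance."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]   # center
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]              # d x d
    rng = random.Random(seed)
    v = [rng.random() for _ in range(d)]
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v

# Synthetic "questionnaire" rows: the second variable is roughly twice
# the first, so the component should point roughly along (1, 2).
data = [[1.0, 2.1], [2.0, 3.9], [3.0, 6.2], [4.0, 8.1], [5.0, 9.8]]
pc1 = first_principal_component(data)
print(pc1)
```

Real PCA software also reports the remaining components and the share of variance each explains, which is how the number of retained "causes" is chosen.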
Procedia PDF Downloads 488
22984 Opportunities and Challenges to Local Legislation at the Height of the COVID-19 Pandemic: Evidence from a Fifth Class Municipality in the Visayas, Philippines
Authors: Renz Paolo B. Ramos, Jake S. Espina
Abstract:
The Local Government Academy of the Philippines explains that local legislation is both a power and a process by which a local government enacts ordinances and resolutions that have the force and effect of law while engaging a range of stakeholders in their implementation. Legislative effectiveness is crucial for the development of any given area. This study's objective is to evaluate the legislative performance of the 10th Sanggunian of Kawayan, a legislative body in a fifth-class municipality in the Province of Biliran, during the height of the COVID-19 pandemic (2019-2021), with a focus on legislation, accountability and participation, institution-building, and intergovernmental relations. To this end, a mixed-methods strategy was used to gather data. The Local Legislative Performance Appraisal Form (LLPAF) was completed, while focus interviews with Local Government Unit (LGU) personnel, a survey questionnaire for constituents, and ethnographic diary-writing were conducted. Convenience sampling was utilized for LGU workers, whereas simple random sampling was used to identify the number of constituents participating. Interviews were analyzed using thematic analysis, while frequency data analysis was employed to describe and evaluate the nature and connection of the data to the underlying population. From these data, the researchers draw the opportunities and challenges met by the local legislature during the height of the pandemic.
Keywords: local legislation, local governance, legislative effectiveness, legislative analysis
Procedia PDF Downloads 78
22983 Design of a Small and Medium Enterprise Growth Prediction Model Based on Web Mining
Authors: Yiea Funk Te, Daniel Mueller, Irena Pletikosa Cvijikj
Abstract:
Small and medium enterprises (SMEs) play an important role in the economy of many countries. When the overall world economy is considered, SMEs represent 95% of all businesses in the world, accounting for 66% of the total employment. Existing studies show that the current business environment is characterized as highly turbulent and strongly influenced by modern information and communication technologies, thus forcing SMEs to experience more severe challenges in maintaining their existence and expanding their business. To support SMEs at improving their competitiveness, researchers recently turned their focus on applying data mining techniques to build risk and growth prediction models. However, data used to assess risk and growth indicators is primarily obtained via questionnaires, which is very laborious and time-consuming, or is provided by financial institutes, thus highly sensitive to privacy issues. Recently, web mining (WM) has emerged as a new approach towards obtaining valuable insights in the business world. WM enables automatic and large scale collection and analysis of potentially valuable data from various online platforms, including companies’ websites. While WM methods have been frequently studied to anticipate growth of sales volume for e-commerce platforms, their application for assessment of SME risk and growth indicators is still scarce. Considering that a vast proportion of SMEs own a website, WM bears a great potential in revealing valuable information hidden in SME websites, which can further be used to understand SME risk and growth indicators, as well as to enhance current SME risk and growth prediction models. This study aims at developing an automated system to collect business-relevant data from the Web and predict future growth trends of SMEs by means of WM and data mining techniques. The envisioned system should serve as an 'early recognition system' for future growth opportunities. 
In an initial step, we examine how structured and semi-structured Web data in governmental or SME websites can be used to explain the success of SMEs. WM methods are applied to extract Web data in the form of additional input features for the growth prediction model. Data on SMEs provided by a large Swiss insurance company is used as ground truth (i.e. growth-labeled) data to train the growth prediction model. Different machine learning classification algorithms, such as the Support Vector Machine, Random Forest, and Artificial Neural Network, are applied and compared, with the goal of optimizing prediction performance. The results are compared to those of previous studies in order to assess the contribution of growth indicators retrieved from the Web to increasing the predictive power of the model.
Keywords: data mining, SME growth, success factors, web mining
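The classifier comparison described above can be sketched with scikit-learn. This is a minimal illustration on synthetic stand-in data, not the study's actual web-mined features or insurance ground truth; model choices and hyperparameters here are assumptions.

```python
# Sketch: compare SVM, Random Forest, and ANN classifiers by cross-validation,
# as in the growth prediction comparison described above. Data is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for web-mined SME features with growth labels
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "Random Forest": RandomForestClassifier(random_state=0),
    "ANN": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold accuracy
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

In practice the winning algorithm would be selected on held-out data, with the Web-derived features added to and removed from the feature set to quantify their contribution.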
Procedia PDF Downloads 271
22982 Seismic Response Analysis of Frame Structures Based on Super Joint Element Model
Authors: Li Xu, Yang Hong, T. Zhao Wen
Abstract:
Experimental results for many RC beam-column subassemblages indicate that slippage of the longitudinal beam rebar within the joint and shear deformation of the joint core have a significant influence on the seismic behavior of the subassemblage. However, a rigid-joint assumption has generally been used in the seismic response analysis of RC frames, in which both kinds of inelastic joint deformation are ignored. Based on the OpenSees platform, a 'Super Joint Element Model' with a more detailed inelastic mechanism is used to simulate the inelastic response of joints. Two finite element models of a typical RC plane frame, respectively considering and ignoring the inelastic deformation of the joints, were established and analyzed under seven strong earthquake waves. The simulated global and local inelastic deformations of the RC plane frame are shown and discussed. The analyses also confirm the safety of earthquake-resistant frames designed according to Chinese codes.
Keywords: frame structure, beam-column joint, longitudinal bar slippage, shear deformation, nonlinear analysis
Procedia PDF Downloads 413
22981 Evaluating Effectiveness of Training and Development Corporate Programs: The Russian Agribusiness Context
Authors: Ekaterina Tikhonova
Abstract:
This research aims to evaluate the effectiveness of training and development (T&D) using the example of two T&D programs for executive top management run in 2012 and 2015-2016 at Komos Group. The study researches the effectiveness of two similar corporate T&D programs within one company in two periods (2012 and 2015-2016) by evaluating them with the four-level Kirkpatrick model and by calculating ROI, using Phillips' formula, as an instrument for measuring T&D programs. The research investigates the correlation of two figures: the calculated ROI and the rating percentage scale of ROI implementation (Wagle's scale). The study includes an assessment of 360-degree feedback (Kirkpatrick's model) and Phillips' ROI methodology, which provides a step-by-step process for collecting, summarizing, and processing data. The data is collected from the company's accounting data, HR budgets, MCFO, and the company's annual reports for the research periods. All analyzed data and reports are organized and presented in the form of tables, charts, and graphs. The paper also gives a brief description of some constraints of the research. After the ROI calculation, the study reveals that ROI falls within the average implementation range (65% to 75%) on Wagle's scale, which can be considered a positive outcome. The paper also gives recommendations on how to use ROI in practice and describes the main benefits of ROI implementation.
Keywords: ROI, organizational performance, efficacy of T&D program, employee performance
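Phillips' ROI formula expresses net program benefits as a percentage of program costs. A minimal sketch of the calculation follows; the monetary figures are purely hypothetical illustrations, not values from the study.

```python
def phillips_roi(program_benefits: float, program_costs: float) -> float:
    """Phillips' ROI formula: net program benefits over program costs, as a percentage."""
    net_benefits = program_benefits - program_costs
    return net_benefits / program_costs * 100

# Hypothetical figures for illustration only (not the study's data):
roi = phillips_roi(program_benefits=175_000, program_costs=100_000)
print(f"ROI = {roi:.0f}%")  # (175000 - 100000) / 100000 * 100 = 75%
```

An ROI of 75% would sit at the top of the 65-75% "average implementation" band the abstract refers to.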
Procedia PDF Downloads 255
22980 Spatially Encoded Hyperspectral Compressive Microscope for Broadband VIS/NIR Imaging
Authors: Lukáš Klein, Karel Žídek
Abstract:
Hyperspectral imaging counts among the most frequently used multidimensional sensing methods. While there are many approaches to capturing a hyperspectral data cube, optical compression is emerging as a valuable tool to reduce the setup complexity and the amount of data storage needed. Hyperspectral compressive imagers have been created in the past; however, they have primarily focused on relatively narrow sections of the electromagnetic spectrum. A broader spectral study of samples can provide helpful information, especially for applications involving the harmonic generation and advanced material characterizations. We demonstrate a broadband hyperspectral microscope based on the single-pixel camera principle. Captured spatially encoded data are processed to reconstruct a hyperspectral cube in a combined visible and near-infrared spectrum (from 400 to 2500 nm). Hyperspectral cubes can be reconstructed with a spectral resolution of up to 3 nm and spatial resolution of up to 7 µm (subject to diffraction) with a high compressive ratio.
Keywords: compressive imaging, hyperspectral imaging, near-infrared spectrum, single-pixel camera, visible spectrum
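The single-pixel principle behind such a microscope can be illustrated with a toy simulation: the scene is modulated by known random binary masks, each mask yielding one "bucket" intensity, and the image is recovered from the measurement model y = A x. The sketch below uses as many masks as pixels and a least-squares solve; a truly compressive setup would use fewer masks and a sparsity-based reconstruction. Scene, masks, and sizes are synthetic assumptions.

```python
# Toy single-pixel measurement and reconstruction (no compression):
# each random binary mask produces one scalar detector reading.
import numpy as np

rng = np.random.default_rng(1)
side = 8
scene = np.zeros((side, side))
scene[2:6, 3:5] = 1.0                 # simple synthetic test scene
x = scene.ravel()

n_pix = side * side
A = rng.integers(0, 2, size=(n_pix, n_pix)).astype(float)  # binary masks
y = A @ x                                                  # single-pixel readings

x_hat = np.linalg.lstsq(A, y, rcond=None)[0]               # recover the scene
print("max reconstruction error:", np.abs(x_hat - x).max())
```

With fewer rows in A than pixels, the same model is solved with compressed-sensing solvers that exploit sparsity, which is what makes the high compressive ratio mentioned above possible.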
Procedia PDF Downloads 91
22979 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model
Authors: Chaudhuri Manoj Kumar Swain, Susmita Das
Abstract:
This paper presents a detailed procedure for predicting a path loss (PL) model and its application to estimating the coverage probability in a WiMAX network. A hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of this approach, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent and the coverage probability of the network are evaluated. This research work may significantly assist in the deployment and optimisation of cellular networks.
Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis
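The regression step can be sketched with the standard log-distance model PL(d) = PL(d0) + 10 n log10(d/d0), fitting the PL exponent n and the reference loss PL(d0) by linear least squares. The measurement values below are synthetic illustrations, not the paper's data, and the reference distance is an assumption.

```python
# Sketch: fit the path loss exponent from measured path loss via linear
# regression on the log-distance model. All measurements are synthetic.
import numpy as np

d0 = 100.0                                               # reference distance (m), assumed
d = np.array([100, 200, 400, 800, 1600], dtype=float)    # Tx-Rx distances (m)
pl = np.array([70.0, 79.5, 88.0, 97.2, 106.1])           # measured path loss (dB)

x = 10 * np.log10(d / d0)
n, pl_d0 = np.polyfit(x, pl, 1)   # slope = PL exponent n, intercept = PL(d0)
print(f"path loss exponent n ≈ {n:.2f}, PL(d0) ≈ {pl_d0:.1f} dB")
```

The fitted exponent then feeds directly into the coverage probability calculation, since it determines how fast the received power decays with distance.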
Procedia PDF Downloads 183
22978 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model
Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li
Abstract:
Compared with terrestrial networks, the traffic of a spatial information network has both self-similarity and short-range correlation characteristics. Studying its traffic prediction methods can improve the resource utilization of a spatial information network and provide an important basis for its traffic planning. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximation component with long-range correlation and detail components with short-range correlation, and a time series hybrid prediction model based on wavelet decomposition is proposed to predict the traffic. Firstly, the original traffic data are decomposed into approximation and detail components using a wavelet decomposition algorithm. According to the tailing-off and cutting-off characteristics of the autocorrelation and partial autocorrelation functions of each component, the corresponding model (AR/MA/ARMA) of each detail component can be established directly, while the approximation component can be modeled with an ARIMA model after smoothing. Finally, the prediction results of the multiple models are combined to obtain the prediction for the original data. The method considers not only the self-similarity of a spatial information network but also the short-range correlation caused by bursty network information, and it is verified using measured data of a backbone network released by the MAWI working group in 2018. Compared with typical time series models, the predicted data of the hybrid model are closer to the real traffic data, with a smaller relative root mean square error, making it more suitable for a spatial information network.
Keywords: spatial information network, traffic prediction, wavelet decomposition, time series model
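The decompose-model-recombine idea can be sketched self-containedly with a one-level Haar decomposition and simple AR(1) forecasts for each component. This is a stand-in for the paper's full scheme (multi-level wavelet decomposition with AR/MA/ARMA models per detail component and ARIMA for the approximation); the traffic series below is synthetic.

```python
# Sketch: split traffic into long-range approximation and short-range detail,
# forecast each component separately, then recombine via the inverse transform.
import numpy as np

def haar_dwt(x):
    """One-level Haar decomposition into approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ar1_forecast(series):
    """One-step AR(1) forecast, fitted by least squares."""
    x, y = series[:-1], series[1:]
    xc, yc = x - x.mean(), y - y.mean()
    phi = np.dot(xc, yc) / np.dot(xc, xc)
    c = y.mean() - phi * x.mean()
    return c + phi * series[-1]

rng = np.random.default_rng(0)
traffic = np.cumsum(rng.normal(size=256))    # synthetic long-range trend
traffic += rng.normal(scale=0.5, size=256)   # plus short-range bursts

a, d = haar_dwt(traffic)
# Next sample = inverse Haar of the forecast coefficient pair
forecast = (ar1_forecast(a) + ar1_forecast(d)) / np.sqrt(2)
print(f"one-step traffic forecast: {forecast:.3f}")
```

In the full method, model orders for each component are chosen from the ACF/PACF behavior noted above rather than fixed at AR(1).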
Procedia PDF Downloads 153
22977 Features of Formation and Development of Possessory Risk Management Systems of Organization in the Russian Economy
Authors: Mikhail V. Khachaturyan, Inga A. Koryagina, Maria Nikishova
Abstract:
The study investigates the impact of the ongoing financial crisis, which started in the second half of 2014, on the marketing budgets of fast-moving consumer goods (FMCG) companies. In these conditions, special importance is given to efficient possessory risk management systems. The main objective of establishing and developing possessory risk management systems for FMCG companies in a crisis is to analyze data relating to the external environment and consumer behavior. Another important objective of the possessory risk management systems of FMCG companies is to develop measures and mechanisms to maintain and stimulate sales. In this regard, analysis of the risks and threats that consumers identify as the main factors affecting their level of consumption becomes important. It is obvious that in crisis conditions, effective risk management systems responsible for developing and implementing strategies to stimulate consumer demand, as well as for the identification, analysis, assessment, and management of other risks to economic security, will be the key to a company's sustainability. In times of financial and economic crisis, the problem of forming and developing possessory risk management systems becomes critical not only for the management models of FMCG companies but for all companies operating in other sectors of the Russian economy. This study attempts to analyze the specifics of the formation and development of company possessory risk management systems. In the modern economy, the risk of reduced consumer activity is of special importance among all types of owner's risks. This type of risk is common not only to the consumer goods trade. The study of declining consumer activity is especially important for Russia, as the domestic consumer goods market is still in the development stage despite its significant growth. In this regard, it is especially important to form and develop possessory risk management systems for FMCG companies.
The authors offer their own interpretation of the process of forming and developing possessory risk management systems within the owner's management models of FMCG companies, as well as in the Russian economy in general. The proposed methods and mechanisms for analyzing the formation and development of possessory risk management systems in FMCG companies, and the results obtained, can be helpful for researchers interested in the problems of consumer goods market development in Russia and overseas.
Keywords: FMCG companies, marketing budget, risk management, owner, Russian economy, organization, formation, development, system
Procedia PDF Downloads 381
22976 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are only indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of the latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.
Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
Procedia PDF Downloads 148
22975 The Application and Relevance of Costing Techniques in Service-Oriented Business Organizations: A Review of the Activity-Based Costing (ABC) Technique
Authors: Udeh Nneka Evelyn
Abstract:
The shortcomings of traditional costing systems in terms of validity, accuracy, consistency, and relevance have increased the need for modern management accounting systems. Activity-Based Costing (ABC) can be used as a modern tool for planning, control, and decision making by management. Past studies on ABC systems have focused on manufacturing firms, leaving studies on service firms relatively scant. This paper reviews the application and relevance of the activity-based costing technique in service-oriented business organizations, employing a qualitative research method that relies heavily on a literature review of past and current relevant articles focusing on ABC. Findings suggest that ABC is appropriate not only for a manufacturing environment but also for service organizations such as financial institutions, the healthcare industry, and government organizations. In fact, some banking and financial institutions have been applying the concept for years under other names. One of them is unit costing, which is used to calculate the cost of banking services by determining the cost and consumption of each unit of output of the functions required to deliver the service. In very basic terms, ABC may provide a very good payback for businesses. Some of the benefits that relate directly to the financial services industry are: identification of the most profitable customers, more accurate product and service pricing, increased product profitability, and well-organized process costs.
Keywords: business, costing, organizations, planning, techniques
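The unit costing idea described above can be illustrated with a small calculation: each activity has a cost per unit of output, and a service's cost is the sum of the activity units it consumes. All rates, activity names, and consumption figures below are hypothetical.

```python
# Sketch of ABC-style unit costing for a banking service: allocate activity
# costs by the units of each activity the service consumes. Figures are invented.
activity_rates = {            # cost per unit of activity output ($)
    "process_transaction": 0.85,
    "verify_customer": 2.40,
    "print_statement": 0.30,
}
service_consumption = {       # units consumed per account, per month
    "process_transaction": 22,
    "verify_customer": 1,
    "print_statement": 1,
}

unit_cost = sum(activity_rates[a] * service_consumption[a]
                for a in service_consumption)
print(f"cost to serve one account: ${unit_cost:.2f}")  # 18.70 + 2.40 + 0.30 = 21.40
```

Comparing this per-account cost with the revenue each customer generates is what enables the "most profitable customers" analysis listed among the benefits.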
Procedia PDF Downloads 244
22974 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a rather wide research activity, in collaboration with the Euro Mediterranean Center for Climate Changes, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
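For intuition, a 3DVar analysis step minimizes the cost function J(x) = (x - xb)ᵀ B⁻¹ (x - xb) + (Hx - y)ᵀ R⁻¹ (Hx - y), whose minimizer solves (B⁻¹ + Hᵀ R⁻¹ H) x = B⁻¹ xb + Hᵀ R⁻¹ y. The toy sketch below solves this on a tiny synthetic state; the matrices are stand-ins, not an ocean model, and the DD-DA/GPU aspects are not represented.

```python
# Toy 3DVar analysis step: blend a background state with observations
# by solving the normal equations of the 3DVar cost function.
import numpy as np

n, m = 4, 2
B = np.eye(n) * 0.5                              # background error covariance
R = np.eye(m) * 0.1                              # observation error covariance
H = np.zeros((m, n)); H[0, 0] = H[1, 2] = 1.0    # observation operator
xb = np.array([1.0, 2.0, 3.0, 4.0])              # background state
y = np.array([1.2, 2.9])                         # observations

Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
A = Binv + H.T @ Rinv @ H
b = Binv @ xb + H.T @ Rinv @ y
xa = np.linalg.solve(A, b)                       # analysis state
print("analysis:", np.round(xa, 3))
```

Note how only the observed components (states 0 and 2) are pulled toward the observations, weighted by the relative confidence encoded in B and R; the domain decomposition approach partitions exactly this kind of solve across subdomains so each GPU handles a local problem.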
Procedia PDF Downloads 416