Search results for: data analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24410

22460 Calculation of the Left Ventricle Wall Radial Strain and Radial SR Using Tagged Magnetic Resonance Imaging (tMRI) Data

Authors: Mohammed Alenezy

Abstract:

Cardiac motion can be used as an indicator of heart abnormality by evaluating the longitudinal, circumferential, and radial strain of the left ventricle. In this paper, the radial strain and radial strain rate (SR) are studied using tagged MRI (tMRI) data acquired during the cardiac cycle at the mid-ventricular level of the left ventricle. Materials and methods: The short-axis view of the left ventricle of five healthy humans (three males and two females) and four healthy male rats was imaged using the tagged magnetic resonance imaging (tMRI) technique, covering the whole cardiac cycle at the mid-ventricular level. Images were processed using ImageJ software to calculate the left ventricle wall radial strain and radial SR at the mid-ventricular level during the cardiac cycle. The peak radial strain for the human and rat heart was 40.7 ± 1.44 and 46.8 ± 0.68, respectively, occurring at 40% of the cardiac cycle in both species. The peak diastolic and systolic radial SR for the human heart was -1.78 ± 0.02 s⁻¹ and 1.10 ± 0.08 s⁻¹, respectively, while for the rat heart it was -5.16 ± 0.23 s⁻¹ and 4.25 ± 0.02 s⁻¹, respectively. Conclusion: These results show the ability of tMRI data to characterize cardiac motion during the cardiac cycle, including the diastolic and systolic phases, which can be used as an indicator of cardiac dysfunction by estimating the left ventricle radial strain and radial SR at different locations in the cardiac tissue. This study confirms the validity of tagged MRI data for accurately describing cardiac radial motion.
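
The strain and strain-rate quantities reported above can be sketched in a few lines. This is a minimal, hypothetical illustration of the definitions only: the radii and frame interval below are invented values, not the study's measurements.

```python
# Hypothetical sketch: radial strain and strain rate (SR) from ventricle wall
# radii tracked across tMRI frames. All numbers are illustrative.
def radial_strain(radii, r0):
    """Lagrangian radial strain (%) relative to the reference radius r0."""
    return [100.0 * (r - r0) / r0 for r in radii]

def strain_rate(strain_pct, dt):
    """Finite-difference strain rate (s^-1) from strain in % and frame interval dt (s)."""
    return [(strain_pct[i + 1] - strain_pct[i]) / (100.0 * dt)
            for i in range(len(strain_pct) - 1)]

radii = [25.0, 27.5, 30.0, 31.5, 30.5]  # wall radius (mm) per tMRI frame
strain = radial_strain(radii, radii[0])
sr = strain_rate(strain, dt=0.05)       # assuming 50 ms between frames
```

The peak of `strain` over the cycle corresponds to the peak radial strain values quoted in the abstract, and the signed extremes of `sr` correspond to the systolic and diastolic SR peaks.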

Keywords: left ventricle, radial strain, tagged MRI, cardiac cycle

Procedia PDF Downloads 467
22459 Allocating Channels and Flow Estimation at a Flood-Prone Area in the Desert: An Example from AlKharj City, Saudi Arabia

Authors: Farhan Aljuaidi

Abstract:

The rapid expansion of AlKharj city, Saudi Arabia, towards the outlet of Wadi AlAin is critical for planners and decision makers. Two major projects, the Salman bin Abdulaziz University compound and a new industrial area, are currently being developed in this flood-prone area, where no clear channels are identified. The main contribution of this study is to divert the flow away from these vital projects by reconstructing new channels. To do so, Lidar data were used to generate contour lines for the actual elevation of the highways and local roads. These data were analyzed and compared to the contour lines derived from 1:50,000 topographical maps. The magnitude of the expected flow was estimated using Snyder's model, based on the morphometric data acquired from a DEM of the catchment area. The results indicate that the peak discharge reaches a maximum of 2694.3 m³/s, with a mean of 303.7 m³/s and a minimum of 74.3 m³/s. The runoff was estimated at a maximum of 252.2 × 10⁶ m³, with a mean of 41.5 × 10⁶ m³ and a minimum of 12.4 × 10⁶ m³.
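
The core relations of Snyder's synthetic unit hydrograph can be sketched as follows. This is a hedged illustration of the commonly cited SI form only: the coefficients Ct and Cp and the catchment dimensions are placeholders, not the morphometric data the study derived from its DEM.

```python
# Illustrative sketch of Snyder's synthetic unit hydrograph relations (SI form).
# Ct, Cp and the catchment values below are placeholders for this example.
def snyder_peak_discharge(area_km2, L_km, Lc_km, Ct=1.5, Cp=0.6):
    """Return (basin lag tp in hours, unit-hydrograph peak Qp in m^3/s per cm of runoff)."""
    tp = 0.75 * Ct * (L_km * Lc_km) ** 0.3  # basin lag time
    qp = 2.78 * Cp * area_km2 / tp          # peak discharge of the unit hydrograph
    return tp, qp

# L_km: main-channel length; Lc_km: distance to catchment centroid
tp, qp = snyder_peak_discharge(area_km2=1200.0, L_km=85.0, Lc_km=40.0)
```

Scaling `qp` by the effective rainfall depth gives peak-flow estimates of the kind reported above.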

Keywords: desert flood, Saudi Arabia, Snyder's model, flow estimation

Procedia PDF Downloads 289
22458 Public Bus Transport Passenger Safety Evaluations in Ghana: A Phenomenological Constructivist Exploration

Authors: Enoch F. Sam, Kris Brijs, Stijn Daniels, Tom Brijs, Geert Wets

Abstract:

Notwithstanding the growing body of literature that recognises the importance of personal safety to public transport (PT) users, it remains unclear what criteria PT users consider regarding their safety. In this study, we explore the criteria PT users in Ghana use to assess bus safety. This knowledge affords a better understanding of PT users’ risk perceptions and assessments, which may contribute to theoretical models of PT risk perception. We utilised a phenomenological research methodology, with data drawn from 61 purposively sampled participants. Data collection (through focus group discussions and in-depth interviews) and analysis were carried out concurrently to the point of saturation. Our inductive data coding and analysis, using constant comparison and content analytic techniques, resulted in 4 code categories (conceptual dimensions), 27 codes (safety items/criteria), and 100 quotations (data segments). Of the safety criteria participants use to assess bus safety, vehicle condition, the driver’s marital status, and the transport operator’s safety record were the most frequently considered, and for each criterion participants demonstrated its relevance to bus safety. These findings imply that investment in and maintenance of safer vehicles, employment of responsible and safety-conscious drivers, and prioritisation of passenger safety are key targets for public bus/minibus operators in Ghana.

Keywords: safety evaluations, public bus/minibus, passengers, phenomenology, Ghana

Procedia PDF Downloads 309
22457 Data-Driven Analysis of Velocity Gradient Dynamics Using Neural Network

Authors: Nishant Parashar, Sawan S. Sinha, Balaji Srinivasan

Abstract:

We investigate the unclosed terms in the evolution equation of the velocity gradient tensor (VGT) in compressible decaying turbulent flow. Velocity gradients in a compressible turbulent flow field influence several important nonlinear turbulent processes, such as cascading and intermittency. In an attempt to understand the dynamics of the velocity gradients, various researchers have tried to model the unclosed terms in the evolution equation of the VGT. The existing models proposed for these unclosed terms have limited applicability, mainly because of the complex structure of the higher-order gradient terms appearing in the equation. We investigate these higher-order gradients using data from direct numerical simulation (DNS) of compressible decaying isotropic turbulent flow. The gas kinetic method, aided by flow reconstruction based on a weighted essentially non-oscillatory (WENO) scheme, is employed to generate the DNS data. By applying a neural network to the DNS data, we map the unclosed higher-order gradient terms in the evolution equation of the VGT to the VGT itself. We validate our findings by performing an alignment-based study of the unclosed higher-order gradient terms obtained from the neural network against the strain rate eigenvectors.
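
The regression idea, learning a mapping from (flattened) velocity-gradient components to an unclosed term, can be sketched with a tiny one-hidden-layer network. This is a conceptual toy only: the product target below stands in for the DNS data, which is not reproduced here, and the architecture is far smaller than anything used in practice.

```python
import math, random

# Toy one-hidden-layer network trained by gradient descent to map an input
# pair x (standing in for VGT components) to a scalar y (an "unclosed term").
random.seed(0)

H = 8                                                   # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    return sum(w * hi for w, hi in zip(w2, h)), h

def train_step(x, y, lr=0.05):
    yhat, h = forward(x)
    err = yhat - y
    for j in range(H):
        g = err * w2[j] * (1 - h[j] ** 2)               # backprop through tanh
        w2[j] -= lr * err * h[j]
        for k in range(2):
            w1[j][k] -= lr * g * x[k]
    return err ** 2

# Toy dataset: target is the product of the two inputs.
data = [((a, b), a * b) for a in (-1, -0.5, 0, 0.5, 1) for b in (-1, 0, 1)]
loss0 = sum(train_step(x, y) for x, y in data)          # first epoch
for _ in range(200):
    loss = sum(train_step(x, y) for x, y in data)       # later epochs
```

The decreasing loss illustrates the fitting step; the study's validation against strain-rate eigenvectors is a separate analysis not shown here.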

Keywords: compressible turbulence, neural network, velocity gradient tensor, direct numerical simulation

Procedia PDF Downloads 146
22456 Comparison of Authentication Methods in Internet of Things Technology

Authors: Hafizah Che Hasan, Fateen Nazwa Yusof, Maslina Daud

Abstract:

The Internet of Things (IoT) is a powerful industry system in which end-devices are interconnected and automated, allowing the devices to analyze data and execute actions based on the analysis. IoT technology leverages Radio-Frequency Identification (RFID) and Wireless Sensor Networks (WSN), including mobile and sensor devices; these technologies have contributed to the evolution of IoT. However, as more devices are connected to each other over the Internet and data from various sources are exchanged between things, the confidentiality of the data becomes a major concern. This paper focuses on one of the major challenges in IoT, authentication, which is needed to preserve data integrity and confidentiality. A few solutions from papers published in the last few years are reviewed. One proposed solution secures the communication between IoT devices and cloud servers with an Elliptic Curve Cryptography (ECC) based mutual authentication protocol, using Hyper Text Transfer Protocol (HTTP) cookies as a security parameter. The next proposed solution uses a keyed-hash scheme protocol to enable IoT devices to authenticate each other without the presence of a central control server. Another proposed solution uses a Physical Unclonable Function (PUF) based mutual authentication protocol; it emphasizes tamper-resistant and resource-efficient technology, equivalent to a three-way handshake security protocol.
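
The keyed-hash idea can be sketched as an HMAC challenge-response exchange between two devices sharing a pre-distributed key. This is a generic illustration of the concept, not the specific protocol the reviewed paper proposes; key distribution is assumed to have happened out of band.

```python
import hmac, hashlib, os

# Sketch: two IoT devices sharing a secret key authenticate each other
# without a central server, via HMAC challenge-response in both directions.
SHARED_KEY = os.urandom(32)

def respond(key, challenge):
    """Prove knowledge of the key by MACing the peer's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key, challenge, response):
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(respond(key, challenge), response)

# Device A challenges device B; B would then challenge A for mutual auth.
chal_a = os.urandom(16)
resp_b = respond(SHARED_KEY, chal_a)
assert verify(SHARED_KEY, chal_a, resp_b)
```

A real deployment would also bind device identities and nonces into the MAC input to prevent replay and reflection attacks.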

Keywords: Internet of Things (IoT), authentication, PUF, ECC, keyed-hash scheme protocol

Procedia PDF Downloads 239
22455 Data Analysis Tool for Predicting Water Scarcity in Industry

Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse

Abstract:

Water is a fundamental resource for industry. It is taken from the environment, either from municipal distribution networks or from various natural water sources such as the sea, ocean, rivers, aquifers, etc. Once used, water is discharged back into the environment or reprocessed at the plant or at treatment plants. These withdrawals and discharges have a direct impact on natural water resources. The impacts may concern the quantity of water available or the quality of the water used, or they may be more complex and less direct to measure, such as effects on the health of the population downstream of the watercourse. Based on the analysis of data (meteorological conditions, river characteristics, physicochemical substances), we wish to predict water stress episodes and anticipate prefectoral decrees that can affect plant performance; propose improvement solutions; help industrialists choose the location of a new plant; visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions; and set up circular economies around the issue of water. The development of a system for the collection, processing, and use of data related to water resources requires its specific functional constraints to be made explicit. The system must be able to store a large amount of data from sensors (the main type of data in plants and their environment). In addition, manufacturers need 'near-real-time' processing of information in order to make the best decisions (to be rapidly notified of an event that would significantly impact water resources). Finally, the visualization of data must be adapted to its temporal and geographical dimensions. 
In this study, we set up an infrastructure centered on the TICK application stack (Telegraf, InfluxDB, Chronograf, and Kapacitor), a set of loosely coupled but tightly integrated open-source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the cross-industry standard process for data mining (CRISP-DM) methodology. The robust architecture and the methodology used have demonstrated their effectiveness on the case study of predicting the level of a river over a 7-day horizon. The management of water, and of the activities within the plants that depend on this resource, should be considerably improved thanks, on the one hand, to the learning that allows periods of water stress to be anticipated and, on the other hand, to the information system, which is able to warn decision-makers with alerts created from the formalization of prefectoral decrees.
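
As a rough illustration of how such sensor readings reach the InfluxDB component of the TICK stack, a line-protocol record can be built by hand. The measurement, tag, and field names below are invented for this sketch, not the project's actual schema.

```python
# Sketch: building an InfluxDB line-protocol record for a river-level reading.
# Format: measurement,tag_set field_set timestamp_ns
def to_line_protocol(measurement, tags, fields, ts_ns):
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "river_level",
    {"station": "S01", "river": "isere"},       # hypothetical tags
    {"level_m": 2.84, "flow_m3s": 41.5},        # hypothetical fields
    ts_ns=1700000000000000000,
)
# A Telegraf output or an HTTP client would POST this line to InfluxDB's
# write endpoint; Kapacitor alert rules could then fire on threshold crossings.
```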

Keywords: data mining, industry, machine Learning, shortage, water resources

Procedia PDF Downloads 101
22454 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate

Authors: Angela Maria Fasnacht

Abstract:

Detection of anomalies due to the presence of contaminants, also known as early detection in water treatment plants, has become a critical issue that deserves in-depth study for its improvement and adaptation to current requirements. The design of such systems requires detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman’s correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment can be considered a vital step in developing advanced machine learning models that allow optimized data management in real time, applied to early detection systems in water treatment processes. These techniques also facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in a single-pass system. The interaction between the initial concentration of glyphosate and the location of the sensors, and its effect on the readings of the reported parameters, was studied.
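
Spearman's correlation, the first of the statistical methods named above, can be computed from ranks alone. The sensor readings below are made-up values for illustration, not the study's data, and this minimal version omits the tie correction.

```python
# Pure-Python sketch of Spearman's rank correlation coefficient.
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r  # note: no tie correction in this minimal version

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical paired readings: a water-quality parameter vs. glyphosate level.
conductivity = [410, 455, 470, 520, 600]
glyphosate = [0.1, 0.3, 0.2, 0.8, 1.5]
rho = spearman(conductivity, glyphosate)
```

A rho near ±1 indicates a strong monotonic association between the measured parameter and the contaminant concentration.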

Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive

Procedia PDF Downloads 102
22453 Generating Arabic Fonts Using Rational Cubic Ball Functions

Authors: Fakharuddin Ibrahim, Jamaludin Md. Ali, Ahmad Ramli

Abstract:

In this paper, we discuss data interpolation using the rational cubic Ball curve. To generate a curve with satisfactory smoothness, the curve segments must be connected with a certain degree of continuity; the continuity we consider is G1 continuity, and the conditions imposed are known as the G1 Hermite conditions. A simple application of the proposed method is to generate an Arabic font satisfying the required continuity.
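
Evaluating a rational cubic Ball curve at a parameter value can be sketched as below, using the standard cubic Ball basis. The control points and weights are arbitrary illustrations, not the glyph data used for the Arabic font.

```python
# Sketch: evaluating a rational cubic Ball curve at parameter t in [0, 1].
def ball_basis(t):
    # standard cubic Ball basis functions (they sum to 1 for any t)
    return [(1 - t) ** 2,
            2 * t * (1 - t) ** 2,
            2 * t ** 2 * (1 - t),
            t ** 2]

def rational_ball_point(ctrl, weights, t):
    b = ball_basis(t)
    wsum = sum(w * bi for w, bi in zip(weights, b))
    x = sum(w * bi * p[0] for w, bi, p in zip(weights, b, ctrl)) / wsum
    y = sum(w * bi * p[1] for w, bi, p in zip(weights, b, ctrl)) / wsum
    return x, y

ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]   # hypothetical control polygon
weights = [1.0, 2.0, 2.0, 1.0]            # weights shape the curve's fullness
p_start = rational_ball_point(ctrl, weights, 0.0)  # interpolates the first point
p_end = rational_ball_point(ctrl, weights, 1.0)    # interpolates the last point
```

G1 continuity between two such segments then amounts to making the end tangent of one segment parallel to the start tangent of the next, which is what the G1 Hermite conditions encode.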

Keywords: data interpolation, rational Ball curve, Hermite condition, continuity

Procedia PDF Downloads 406
22452 Teenagers’ Decisions to Undergo Orthodontic Treatment: A Qualitative Study

Authors: Babak Nematshahrbabaki, Fallahi Arezoo

Abstract:

Objective: The aim of this study was to describe teenagers’ decisions to undergo orthodontic treatment through a qualitative study. Materials and methods: Twenty-three patients (12 girls), aged 12–18 years, at a dental clinic in Sanandaj, in the western part of Iran, participated. Face-to-face, semi-structured interviews and two focus group discussions were held to gather data, which were analyzed using the grounded theory method. Results: ‘Decision-making’ was the core category. During the data analysis, four main themes were developed: ‘being like everyone else’, ‘being diagnosed’, ‘maintaining the mouth’, and ‘cultural-social and environmental factors’. Conclusions: Cultural-social and environmental factors play a crucial role in the decision to undergo orthodontic treatment. The teenagers were not fully conscious of these external influences: they believed their decision to undergo orthodontic treatment was independent, while it was in fact shaped by cultural-social and environmental factors.

Keywords: decision-making, qualitative study, teenager, orthodontic treatment

Procedia PDF Downloads 429
22451 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the roadmap for future development in FreqAI.
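
The "parameter space" outlier idea can be sketched in its simplest form: flag a prediction-time feature vector whose normalized distance from the training-set mean exceeds a threshold. FreqAI ships its own, more sophisticated outlier-detection tools; the code below only illustrates the underlying concept with invented data.

```python
import math

# Sketch: distance-based outlier check in a normalized feature space.
def fit_space(train_rows):
    """Per-feature mean and standard deviation of the training data."""
    n, d = len(train_rows), len(train_rows[0])
    mean = [sum(r[j] for r in train_rows) / n for j in range(d)]
    std = [max(1e-12, math.sqrt(sum((r[j] - mean[j]) ** 2 for r in train_rows) / n))
           for j in range(d)]
    return mean, std

def is_outlier(row, mean, std, threshold=3.0):
    """Flag rows far from the training distribution in normalized units."""
    dist = math.sqrt(sum(((v - m) / s) ** 2 for v, m, s in zip(row, mean, std)))
    return dist > threshold

train = [[1.0, 10.0], [1.2, 9.5], [0.8, 10.5], [1.1, 10.2]]  # toy feature rows
mean, std = fit_space(train)
```

Prediction points flagged as outliers would be dropped or down-weighted rather than acted on.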

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 71
22450 Assessment of Land Suitability for Tea Cultivation Using Geoinformatics in the Mansehra and Abbottabad Districts, Pakistan

Authors: Nasir Ashraf, Sajid Rahid Ahmad, Adeel Ahmad

Abstract:

Pakistan is a major tea-consuming country, ranked as the third-largest importer of tea worldwide. Of all the beverages consumed in Pakistan, tea is in the greatest demand, making tea imports inevitable. Being an agrarian country, Pakistan should cultivate its own tea and save the millions of dollars spent on tea imports. The need, therefore, is to identify the most suitable areas, with favorable weather conditions and suitable soils, where tea can be planted. This research was conducted over District Mansehra and District Abbottabad in the Khyber Pakhtunkhwa province of Pakistan, where the most favorable conditions for tea cultivation already exist and the National Tea Research Institute has carried out successful experiments cultivating high-quality tea. The objectives of this research were met using remotely sensed data, namely the ASTER DEM and Landsat 8 imagery. The remote sensing data were processed in ERDAS Imagine and ENVI and further analyzed with the ESRI ArcGIS Spatial Analyst to produce the final results and present them in map layouts. The integration of remote sensing data with GIS provided the suitability analysis. The results show that, of the whole study area, 13.4% is highly suitable and 33.44% is suitable for tea plantation. The outcome of this research is a GIS-based, structured data format for agriculture planners and tea growers. Identification of suitable tea-growing areas using remotely sensed data and GIS techniques is a pressing need for the country: the analysis allows planners to address a variety of action plans in an economical and scientific manner, which can help tea production in Pakistan meet demand. This geomatics-based model and approach may be used to identify further areas for tea cultivation, reducing import demand and moving the country towards independence in tea production.

Keywords: agrarian country, GIS, geoinformatics, suitability analysis, remote sensing

Procedia PDF Downloads 369
22449 Machine Learning Algorithms for Rocket Propulsion

Authors: Rômulo Eustáquio Martins de Souza, Paulo Alexandre Rodrigues de Vasconcelos Figueiredo

Abstract:

In recent years, there has been a surge of interest in applying artificial intelligence techniques, particularly machine learning algorithms. Machine learning is a data-analysis technique that automates the creation of analytical models, making it especially useful for modeling complex situations. As a result, this technology helps reduce human intervention while producing accurate results. The methodology is extensively used in aerospace engineering, a field that encompasses several high-complexity operations, such as rocket propulsion. Rocket propulsion is a high-risk operation in which engine failure could result in the loss of life, so it is critical to use computational methods capable of precisely representing the spacecraft's analytical model to guarantee its safety and operation. This paper describes the use of machine learning algorithms for rocket propulsion, demonstrating that the technique is an efficient way to deal with challenging and restrictive aerospace engineering activities. The paper focuses on three machine-learning-aided rocket propulsion applications: set-point control of an expander-bleed rocket engine, supersonic retro-propulsion of a small-scale rocket, and leak detection and isolation on rocket engine data. The data-driven methods used for each application are described in depth, and the results obtained are presented.

Keywords: data analysis, modeling, machine learning, aerospace, rocket propulsion

Procedia PDF Downloads 91
22448 Resource Sharing Issues of Distributed Systems and Their Influence on the Healthcare Sector's Concurrent Environment

Authors: Soo Hong Da, Ng Zheng Yao, Burra Venkata Durga Kumar

Abstract:

The healthcare sector is a business that consists of providing medical services, manufacturing medical equipment and drugs, and providing medical insurance to the public. Most of the data stored in healthcare databases relates to patient information, which must be accurate when accessed by authorized stakeholders. In distributed systems, one important issue is concurrency, which ensures that shared resources stay synchronized and remain consistent through multiple read and write operations by multiple clients. The concurrency problems in the healthcare sector are who gets access, and how the shared data is synchronized and kept consistent when two or more stakeholders attempt to access it simultaneously. In this paper, a framework beneficial to the distributed healthcare sector's concurrent environment is proposed. In the proposed framework, four levels of database nodes, namely the national center, regional center, referral center, and local center, are explained, and the framework's synchronization is not symmetrical. Two synchronization techniques, complete and partial synchronization, are explained. Furthermore, for multiple clients accessing the data at the same time, synchronization types are discussed with cases at different levels and priorities to ensure data remains synchronized throughout the processes.
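
The concurrency concern described above can be illustrated in miniature: a patient record shared by multiple stakeholders, protected by a lock so that concurrent updates stay consistent. This is a generic single-process sketch, not the paper's four-level framework; a distributed system would need distributed locking or transactions instead.

```python
import threading

# Sketch: a shared patient record with mutual exclusion on reads and writes.
class PatientRecord:
    def __init__(self):
        self._lock = threading.Lock()
        self._data = {"allergies": []}

    def update(self, field, value):
        with self._lock:          # one writer at a time
            self._data[field] = value

    def read(self, field):
        with self._lock:          # readers see a consistent snapshot
            return self._data[field]

record = PatientRecord()
# Four concurrent stakeholders (e.g., different centers) writing the same field.
threads = [threading.Thread(target=record.update, args=("allergies", ["penicillin"]))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Without the lock, interleaved writes to richer structures could leave the record in a state no single client ever wrote.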

Keywords: resources, healthcare, concurrency, synchronization, stakeholders, database

Procedia PDF Downloads 128
22447 Evaluation of Longitudinal Relaxation Time (T1) of Bone Marrow in Lumbar Vertebrae of Leukaemia Patients Undergoing Magnetic Resonance Imaging

Authors: M. G. R. S. Perera, B. S. Weerakoon, L. P. G. Sherminie, M. L. Jayatilake, R. D. Jayasinghe, W. Huang

Abstract:

The aim of this study was to measure and evaluate the longitudinal relaxation times (T1) in the bone marrow of an Acute Myeloid Leukaemia (AML) patient in order to explore the potential for a prognostic biomarker using Magnetic Resonance Imaging (MRI), which would provide a non-invasive prognostic approach to AML. MR image data were collected in the DICOM format, and MATLAB Simulink software was used for the image processing and data analysis. For quantitative MRI data analysis, regions of interest (ROIs) were drawn on multiple image slices encompassing the vertebral bodies of L3, L4, and L5. T1 was evaluated using the T1 maps obtained. The estimated mean T1 value of bone marrow was 790.1 ms at 3 T. The reported T1 value of healthy subjects, however, is significantly higher (946.0 ms) than the present finding. This suggests that the T1 of bone marrow can be considered a potential prognostic biomarker for AML patients.
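
The step behind a T1 map, fitting a relaxation model to signals acquired at several recovery times, can be sketched with a simple grid search. The signal samples below are synthesized from a saturation-recovery model with a known T1 of 800 ms, purely for illustration; they are not the patient data.

```python
import math

# Sketch: recovering T1 from a saturation-recovery signal curve.
def model(ti, s0, t1):
    """Saturation-recovery signal S(TI) = S0 * (1 - exp(-TI / T1))."""
    return s0 * (1 - math.exp(-ti / t1))

TIs = [100, 300, 600, 1200, 2400, 4800]          # recovery times, ms
signal = [model(ti, s0=1000.0, t1=800.0) for ti in TIs]  # synthetic data

def fit_t1(tis, sig, s0=1000.0):
    """Grid search over candidate T1 values, minimizing squared error."""
    best_t1, best_err = None, float("inf")
    for t1 in range(200, 2001, 5):               # candidate T1 values in ms
        err = sum((model(ti, s0, t1) - s) ** 2 for ti, s in zip(tis, sig))
        if err < best_err:
            best_t1, best_err = t1, err
    return best_t1

t1_est = fit_t1(TIs, signal)
```

Repeating such a fit voxel by voxel yields the T1 map from which the ROI mean (790.1 ms in the study) is computed.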

Keywords: acute myeloid leukaemia, longitudinal relaxation time, magnetic resonance imaging, prognostic biomarker

Procedia PDF Downloads 505
22446 Research on Reservoir Lithology Prediction Based on Residual Neural Network and Squeeze-and-Excitation Neural Network

Authors: Li Kewen, Su Zhaoxin, Wang Xingmou, Zhu Jian Bing

Abstract:

Conventional reservoir prediction methods are not sufficient to explore the implicit relations between seismic attributes, and thus data utilization is low. In order to improve the classification accuracy of reservoir lithology prediction, this paper proposes a deep learning lithology prediction method based on ResNet (Residual Neural Network) and SENet (Squeeze-and-Excitation Neural Network). The neural network model is built and trained using seismic attribute data and lithology data from the Shengli oilfield, establishing the nonlinear mapping relationship between seismic attributes and lithology markers. The experimental results show that this method significantly improves the classification of reservoir lithology, with a classification accuracy close to 70%. This study can effectively predict the lithology of undrilled areas and provide support for exploration and development.
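
The squeeze-and-excitation idea that SENet adds on top of residual blocks can be sketched in pure Python: each channel is "squeezed" to a single statistic by global pooling, and an "excitation" gate then rescales the channels. The toy gate below is a single fixed-weight layer, far simpler than a real SE block's two fully connected layers, and all values are invented.

```python
import math

# Minimal sketch of channel recalibration in the spirit of an SE block.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def se_block(feature_maps, w_gate):
    """feature_maps: per-channel activation lists; w_gate: one gate weight per channel."""
    squeezed = [sum(ch) / len(ch) for ch in feature_maps]        # squeeze: global average pool
    gates = [sigmoid(w * s) for w, s in zip(w_gate, squeezed)]   # excitation (toy 1-layer gate)
    return [[g * v for v in ch] for g, ch in zip(gates, feature_maps)]  # rescale channels

maps = [[1.0, 3.0], [2.0, 2.0]]        # two channels of seismic-attribute features
out = se_block(maps, w_gate=[10.0, -10.0])
```

With these weights the gate keeps the first channel almost unchanged and suppresses the second, which is the channel-attention effect that helps the network emphasize informative seismic attributes.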

Keywords: convolutional neural network, lithology, prediction of reservoir, seismic attributes

Procedia PDF Downloads 158
22445 A Comparison of Caesarean Section Indications and Characteristics in 2009 and 2020 in a Saudi Tertiary Hospital

Authors: Sarah K. Basudan, Ragad I. Al Jazzar, Zeinah Sulaihim, Hanan M. Al-Kadri

Abstract:

Background: The cesarean section rate has been increasing in recent years, with a wide range of etiologies contributing to this rise. This study aimed to assess the indications, outcomes, and complications in Riyadh, Saudi Arabia. Methods: A retrospective cohort study was conducted at King Abdulaziz Medical City. The study includes two cohorts, G1 (2009) and G2 (2020), who met the inclusion criteria. The data were transferred to SPSS (Statistical Package for the Social Sciences) version 24 for analysis. Initial descriptive statistics were run for all variables, including numerical and categorical data. Numerical data were reported as means and standard deviations, and categorical data as frequencies and percentages. Results: Data were collected from 399 women, divided into two groups, G1 (199) and G2 (200). The mean age of all participants was 32 ± 6 years; G1 and G2 differed significantly in mean age, 30 ± 6 versus 34 ± 5 years, respectively (p < 0.001), indicating childbearing delayed by four years. Moreover, breech presentation was less likely to occur in G2 (OR 0.64, CI 0.21-0.62, p < 0.001). Nonetheless, maternal causes such as repeated C-sections and maternal medical conditions were more likely in G2 (OR 1.5, CI 1.04-2.38, p = 0.03, and OR 5.4, CI 1.12-23.9, p = 0.01, respectively). Furthermore, postpartum hemorrhage showed an increase of 12% in G2 (OR 5.4, CI 2.2-13.4, p < 0.001). G2 neonates were more likely to be admitted to the neonatal intensive care unit (NICU) (OR 16, CI 7.4-38.7) and to special care baby (SCB) units (OR 7.2, CI 3.9-13.1), both with p < 0.001, compared to regular nursery admission. Conclusion: Multiple factors are contributing to the increase in the C-section rate in a Saudi tertiary hospital; these were suggested to be previous C-sections, abnormal fetal heart rate, malpresentation, and maternal or fetal medical conditions.

Keywords: cesarean sections, maternal indications, maternal complications, neonatal condition

Procedia PDF Downloads 57
22444 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm

Authors: Safayat Ali Shaikh

Abstract:

Software has been developed for determining the optimal cropping pattern in an irrigation project, considering the land constraint, water availability constraint, and pick-up flow constraint, using a modified simplex algorithm. Artificial Neural Network (ANN) models have been developed to predict rainfall, and an AR(1) model was used to generate 1000 years of rainfall data to train the ANN. Simulation has been carried out with the expected rainfall data. Eight crops and three soil classes have been considered in the optimization model. The area under each crop in each soil class has been quantified using the modified simplex algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
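
The structure of the optimization problem can be sketched in miniature. The two-crop example below encodes a land constraint and a water constraint and finds the best integer-area plan by brute force, standing in for the simplex step; every coefficient is invented for illustration and the pick-up flow constraint is omitted for brevity.

```python
# Sketch of the cropping-pattern LP: maximize net return over crop areas
# subject to total-land and water-budget constraints (toy coefficients).
returns = {"rice": 30.0, "wheat": 20.0}   # net return per hectare
water = {"rice": 12.0, "wheat": 5.0}      # water demand per hectare
LAND, WATER = 100.0, 800.0                # total land (ha) and water budget

best = (0.0, 0.0, 0.0)                    # (net return, rice area, wheat area)
for rice in range(0, 101):
    for wheat in range(0, 101 - rice):    # land constraint: rice + wheat <= 100
        if water["rice"] * rice + water["wheat"] * wheat <= WATER:  # water constraint
            z = returns["rice"] * rice + returns["wheat"] * wheat
            if z > best[0]:
                best = (z, float(rice), float(wheat))
```

The real problem has eight crops, three soil classes, and a pick-up flow constraint, so the feasible region is far larger and a simplex-type method is needed rather than enumeration.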

Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern

Procedia PDF Downloads 184
22443 Two-Phase Sampling for Estimating a Finite Population Total in Presence of Missing Values

Authors: Daniel Fundi Murithi

Abstract:

Missing data is a real bane in many surveys. To overcome the problems caused by missing data, partial deletion and single imputation methods, among others, have been proposed. However, these are associated with problems such as discarding usable data and inaccuracy in reproducing known population parameters and standard errors. For regression and stochastic imputation, it is assumed that there is a variable with complete cases to be used as a predictor in estimating missing values in the other variable, and that the relationship between the two variables is linear, which might not be realistic in practice. In this project, we estimate the population total in the presence of missing values in two-phase sampling. Instead of regression or stochastic models, a nonparametric model-based regression is used to impute missing values. An empirical study showed that nonparametric model-based regression imputation is better at reproducing the variance of the population total estimate obtained with no missing values than mean, median, regression, and stochastic imputation methods. Although regression and stochastic imputation were better than nonparametric model-based imputation at reproducing the population total estimates obtained with no missing values for one of the sample sizes considered, nonparametric model-based imputation may be preferred when the relationship between the outcome and predictor variables is not linear.
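
One simple nonparametric imputation in this spirit is local averaging: a missing outcome is replaced by the mean outcome of the k nearest complete cases on the auxiliary variable, with no linearity assumption. This is a generic sketch of the idea, not the paper's exact estimator, and the data are invented.

```python
# Sketch: k-nearest-neighbour (local-mean) imputation of a missing outcome y
# using an auxiliary variable x observed on all units.
def knn_impute(x_complete, y_complete, x_missing, k=3):
    imputed = []
    for xm in x_missing:
        # k complete cases closest to xm on the auxiliary variable
        nearest = sorted(zip(x_complete, y_complete),
                         key=lambda p: abs(p[0] - xm))[:k]
        imputed.append(sum(y for _, y in nearest) / k)
    return imputed

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]        # auxiliary variable, complete cases
y = [2.1, 4.0, 6.2, 8.1, 9.9, 12.2]       # observed outcomes
y_hat = knn_impute(x, y, x_missing=[3.5]) # impute the unit with missing y
```

Summing observed and imputed outcomes (with the appropriate two-phase sampling weights) then gives the population total estimate.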

Keywords: finite population total, missing data, model-based imputation, two-phase sampling

Procedia PDF Downloads 110
22442 The Effects of Multiple Levels of Intelligence in an Algebra 1 Classroom

Authors: Abigail Gragg

Abstract:

The goal of this research study was to determine whether implementing Howard Gardner’s multiple intelligences would enhance student achievement in an Algebra 1 College Preparatory class. In every class, one of the eight intelligences was incorporated into small-group work in stations, and every class was conducted using small-group instruction. Achievement was measured through various forms of collected data that contrasted student understanding in class, via formative assessments, with student understanding on summative assessments. The data samples included assessments (summative and formative), observational data, video recordings, a daily log book, student surveys, and checklists kept during the observation periods. Formative assessments were analyzed during each class period to measure in-class understanding, and summative assessments were dissected question by question for accuracy to review the effects of each intelligence implemented. The data were collated into a coding workbook for further analysis to identify the resulting themes of the research. These themes were: 1) there was no correlation between the multiple intelligences and enhanced student achievement, 2) bodily-kinesthetic intelligence showed the most improvement on test questions, and 3) of all the intelligences, interpersonal intelligence most enhanced student understanding in class.

Keywords: stations, small group instruction, multiple levels of intelligence, mathematics, Algebra 1, student achievement, secondary school, instructional pedagogies

Procedia PDF Downloads 82
22441 Performance Analysis of Multichannel OCDMA-FSO Network under Different Pervasive Conditions

Authors: Saru Arora, Anurag Sharma, Harsukhpreet Singh

Abstract:

To meet the growing need for high data rates and bandwidth, various efforts have been made toward more efficient communication systems. Optical Code Division Multiple Access (OCDMA) over a free-space optics (FSO) link is an effective way to provide transmission at high data rates with a low bit error rate and a low level of multiple access interference. This paper demonstrates an OCDMA-over-FSO communication system up to a range of 7000 m at a data rate of 5 Gbps. Initially, an 8-user OCDMA-FSO system is simulated, with pseudo-orthogonal codes used for encoding. A simulative analysis is then carried out of the performance parameters, such as power and core effective area, that affect the bit error rate (BER) of the system. The analysis reveals that the transmission length is limited by the multiple access interference (MAI) effect, which arises as the number of users in the system increases.
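The Q-factor and BER mentioned in the keywords are linked, for a Gaussian-noise-limited optical receiver, by the standard relation BER = ½·erfc(Q/√2). A minimal sketch of that conversion (not tied to the authors' simulation setup):

```python
import math

def ber_from_q(q):
    """Bit error rate from the Q-factor for a Gaussian-noise-limited
    receiver: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# A Q-factor of 6 corresponds to a BER of roughly 1e-9,
# a common threshold for acceptable optical-link quality.
print(ber_from_q(6))
```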

Keywords: FSO, PSO, bit error rate (BER), OptiSystem simulation, multiple access interference (MAI), Q-factor

Procedia PDF Downloads 351
22440 The Study of Implications on Modern Businesses Performances by Digital Communities: Case of Data Leak

Authors: Asim Majeed, Anwar Ul Haq, Ayesha Asim, Mike Lloyd-Williams, Arshad Jamal, Usman Butt

Abstract:

This study investigates the impact of the leak of M&S customer data on digital communities. Modern businesses use digital communities as an important public relations tool for marketing purposes; this form of communication helps companies build better relationships with their customers and also acts as another source of information. Because communication between customers and organizations is not regulated, users may post both positive and negative comments. New platforms are being developed daily, and it is crucial for businesses not only to familiarize themselves with these platforms but also to know how to reach their existing and prospective consumers. Digital communities are the driving force of marketing and communication in modern businesses; they are continuously growing and developing, and this phenomenon is changing the way marketing is conducted. The current research discusses the implications for M&S's business performance after its data were exploited on digital communities: users contacted M&S to raise security concerns, M&S closed its website for a few hours to try to resolve the issue, and the next day it made a public apology about the incident. This information proliferated across various digital communities and negatively affected the M&S brand name, sales, and customers. A content analysis approach is used to collect qualitative data from 100 digital bloggers, including social media communities such as Facebook and Twitter. The results and findings provide useful new insights into the nature and form of the security concerns of digital users, with both theoretical and practical implications. This research showcases a large corporation utilizing various digital community platforms and can serve as a model for future organizations.

Keywords: digital communities, performance, dissemination, implications, data exploitation

Procedia PDF Downloads 375
22439 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method

Authors: Luh Eka Suryani, Purhadi

Abstract:

Poisson regression is a non-linear regression model whose response variable is count data following a Poisson distribution. A pair of count variables that show high correlation can be modeled with bivariate Poisson regression; the numbers of infant deaths and maternal deaths are such a pair. Poisson regression assumes equidispersion, i.e., equal mean and variance, but actual count data often have a variance greater or less than the mean (overdispersion or underdispersion). Violations of this assumption can be overcome by applying generalized Poisson regression. The characteristics of each regency can also affect the number of cases that occur; this spatial heterogeneity is handled by geographically weighted regression. This study analyzes the numbers of infant and maternal deaths in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling with adaptive bisquare kernel weighting produces 3 regency groups based on the infant mortality rate and 5 regency groups based on the maternal mortality rate. The variables that significantly influence the numbers of infant and maternal deaths are the percentages of pregnant women who visit health workers at least 4 times during pregnancy, pregnant women who receive Fe3 tablets, obstetric complications handled, clean and healthy household behavior, and married women whose first marriage occurred under the age of 18.
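The equidispersion assumption described above is often checked with a simple variance-to-mean ratio before choosing between ordinary and generalized Poisson regression. A sketch of that check (the counts below are made-up values, not the study's data):

```python
from statistics import mean, variance

def dispersion_index(counts):
    """Variance-to-mean ratio of count data: about 1 suggests
    equidispersion (Poisson), > 1 overdispersion, < 1 underdispersion."""
    return variance(counts) / mean(counts)

# Hypothetical per-regency mortality counts with a long right tail
counts = [0, 0, 1, 2, 0, 9, 14, 0, 1, 23]
print(dispersion_index(counts))  # well above 1: overdispersed
```

A ratio far from 1, as here, is the situation in which the generalized Poisson model is preferred over the plain Poisson model.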

Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion

Procedia PDF Downloads 138
22438 A Fully-Automated Disturbance Analysis Vision for the Smart Grid Based on Smart Switch Data

Authors: Bernardo Cedano, Ahmed H. Eltom, Bob Hay, Jim Glass, Raga Ahmed

Abstract:

The deployment of smart grid devices such as smart meters and smart switches (SS), supported by a reliable and fast communications system, makes automated distribution possible and thus provides great benefits to electric power consumers and providers alike. However, more research is needed before the full utility of smart switch data is realized. This paper presents new automated switching techniques using SS within the electric power grid. A concise background of the SS is provided along with operational examples, and the organization and presentation of data obtained from SS are described in the context of the future goal of total automation of the distribution network. The application techniques, the examples of success with SS, and the vision outlined in this paper serve to motivate future research on disturbance analysis automation.

Keywords: disturbance automation, electric power grid, smart grid, smart switches

Procedia PDF Downloads 290
22437 Estimating Air Particulate Matter 10 Using Satellite Data and Analyzing Its Annual Temporal Pattern over Gaza Strip, Palestine

Authors: ِAbdallah A. A. Shaheen

Abstract:

The Gaza Strip faces economic and political pressures such as conflict, siege, and urbanization, all of which have increased its air pollution. In this study, particulate matter 10 (PM10) concentrations over the Gaza Strip were estimated from Landsat Thematic Mapper (TM) and Landsat Enhanced Thematic Mapper Plus (ETM+) data using a multispectral algorithm, with simultaneous in-situ measurements of the corresponding particulate acquired for the selected time period. Landsat and ground data for eleven years were used to develop the algorithm, while four years of data (2002, 2006, 2010, and 2014) were used to validate its results. The developed algorithm achieves a correlation coefficient R of 0.86, an RMSE of 9.71 µg/m³, and a p-value of approximately 0. Validation shows that calculated PM10 correlates strongly with measured PM10, indicating the algorithm's effectiveness for mapping PM10 concentrations over the years 2000 to 2014. Overall, the results show an increase in the minimum, maximum, and average yearly PM10 concentrations, with a similar trend over urban areas. The rate of urbanization was evaluated by supervised classification of the Landsat imagery; urban sprawl from 2000 to 2014 resulted in high PM10 concentrations in the study area.
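The validation statistics reported above, R and RMSE between algorithm-estimated and ground-measured PM10, can be computed as follows. The sample arrays are hypothetical values for illustration, not the study's data.

```python
import math
from statistics import mean

def r_and_rmse(pred, meas):
    """Pearson correlation R and root-mean-square error between
    predicted and measured values."""
    mp, mm = mean(pred), mean(meas)
    cov = sum((p - mp) * (m - mm) for p, m in zip(pred, meas))
    r = cov / math.sqrt(sum((p - mp) ** 2 for p in pred) *
                        sum((m - mm) ** 2 for m in meas))
    rmse = math.sqrt(mean([(p - m) ** 2 for p, m in zip(pred, meas)]))
    return r, rmse

# Hypothetical estimated vs. measured PM10 values (µg/m³)
pred = [42.0, 55.0, 60.0, 38.0]
meas = [40.0, 58.0, 61.0, 35.0]
print(r_and_rmse(pred, meas))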

Keywords: PM10, Landsat, atmospheric reflectance, Gaza Strip, urbanization

Procedia PDF Downloads 232
22436 Simulation IDM for Schedule Generation of Slip-Form Operations

Authors: Hesham A. Khalek, Shafik S. Khoury, Remon F. Aziz, Mohamed A. Hakam

Abstract:

The linearity of slip-form operations is a source of planning complications, and the operation is vulnerable to bottlenecks at any point, so careful planning is required for success. Discrete-event simulation (DES) concepts, on the other hand, can be applied to simulate and analyze construction operations and to support construction scheduling efficiently. Nevertheless, preparing input data for construction simulation is challenging, time-consuming, and prone to human error. Therefore, to enhance the benefits of using DES in construction scheduling, this study proposes an integrated module that establishes a framework for automatically generating time schedules and supporting decisions for slip-form construction projects, particularly during the feasibility study phase, by exchanging data between project data stored in an intermediate database, the DES engine, and the scheduling software. Using the stored information, the proposed system creates construction task attributes (e.g., activity durations, material quantities, and resource amounts), after which the DES uses this information to create a proposed construction schedule automatically. This research demonstrates a flexible approach to slip-form project modeling, rapid scenario-based planning, and schedule generation that may be of interest to both practitioners and researchers.
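A minimal sketch of the kind of event-driven forward pass such a framework might perform over tasks with durations and finish-to-start precedence. The task names and durations are invented for illustration and are not the authors' model.

```python
import heapq

def schedule(tasks):
    """Event-driven forward pass over precedence-linked tasks.
    tasks: {name: (duration, [predecessor names])}.
    Returns {name: finish_time} assuming each task starts as soon
    as all its predecessors have finished."""
    preds = {n: set(p) for n, (d, p) in tasks.items()}
    finish, events = {}, []
    for n, (d, p) in tasks.items():
        if not p:                       # tasks ready at time 0
            heapq.heappush(events, (d, n))
    while events:
        t, n = heapq.heappop(events)    # next finishing task
        finish[n] = t
        for m, (d, p) in tasks.items():
            if n in preds[m]:
                preds[m].discard(n)
                if not preds[m]:        # last predecessor just finished
                    heapq.heappush(events, (t + d, m))
    return finish

# Hypothetical slip-form sequence: setup, two lifts, then stripping
lifts = {
    "setup": (2, []),
    "lift1": (3, ["setup"]),
    "lift2": (3, ["lift1"]),
    "strip": (1, ["lift2"]),
}
print(schedule(lifts))
```

Because the events are popped in time order, the push that empties a task's predecessor set occurs at the latest predecessor finish time, which is exactly the earliest start the precedence constraints allow.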

Keywords: discrete-event simulation, modeling, construction planning, data exchange, scheduling generation, EZstrobe

Procedia PDF Downloads 356
22435 Small Micro and Medium Enterprises Perception-Based Framework to Access Financial Support

Authors: Melvin Mothoa

Abstract:

Small, micro, and medium enterprises (SMMEs) are very significant for the development of market economies: they are the main creators of new jobs and form a vital core of the market economy in countries across the globe. Access to finance is crucial for their growth and innovation. This paper proposes a perception-based SMME framework to aid access to financial support, and addresses the issues that impede SMMEs in South Africa from obtaining finance from financial institutions. The framework will be tested against data collected from 200 SMMEs in the Gauteng province of South Africa. The study adopts a quantitative method, with self-administered questionnaires delivered to SMMEs as the primary data collection tool; structural equation modeling will be used to analyse the collected data further.

Keywords: finance, small business, growth, development

Procedia PDF Downloads 86
22434 Kinetics, Equilibrium and Thermodynamics of the Adsorption of Triphenyltin onto NanoSiO₂/Fly Ash/Activated Carbon Composite

Authors: Olushola S. Ayanda, Olalekan S. Fatoki, Folahan A. Adekola, Bhekumusa J. Ximba, Cecilia O. Akintayo

Abstract:

In the present study, the kinetics, equilibrium, and thermodynamics of the adsorption of triphenyltin (TPT) from TPT-contaminated water onto a nanoSiO₂/fly ash/activated carbon composite were investigated in a batch adsorption system. Equilibrium adsorption data were analyzed using the Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich (D-R) isotherm models. Pseudo-first- and second-order, Elovich, and fractional power models were applied to the kinetic data, and, to understand the mechanism of adsorption, thermodynamic parameters such as ΔG°, ΔS°, and ΔH° were also calculated. The results showed very good compliance with the pseudo-second-order equation, while the Freundlich and D-R models fit the experimental data. Approximately 99.999% of the TPT was removed from an initial concentration of 100 mg/L at 80 °C, a contact time of 60 min, pH 8, and a stirring speed of 200 rpm. Thus, the nanoSiO₂/fly ash/activated carbon composite could be used as an effective adsorbent for the removal of TPT from contaminated water and wastewater.

Keywords: isotherm, kinetics, nanoSiO₂/fly ash/activated carbon composite, triphenyltin

Procedia PDF Downloads 278
22433 A Comparative Analysis of Islamic Bank Efficiency in the United Kingdom and Indonesia during the Eurozone Crisis Using Data Envelopment Analysis

Authors: Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum, Achsania Hendratmi

Abstract:

The purpose of this study is to determine and compare the efficiency of Islamic banks in Indonesia and the United Kingdom during the eurozone sovereign debt crisis. The study uses a quantitative non-parametric approach, Data Envelopment Analysis (DEA) under the variable-returns-to-scale (VRS) assumption, together with the Mann-Whitney U test as a statistical tool. The samples are 11 Islamic banks in Indonesia and 4 Islamic banks in the United Kingdom. The research uses the intermediation approach: the input variables are total deposits, assets, and labour costs, and the output variables are financing and profit/loss. The study shows that the efficiency of Islamic banks in Indonesia and the United Kingdom varied and fluctuated during the observation period, with no significant difference in efficiency performance between the two countries.
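A sketch of the Mann-Whitney U statistic used to compare the two efficiency samples; the scores below are invented, not the study's DEA results.

```python
def mann_whitney_u(a, b):
    """U statistic for sample a against sample b: the number of
    pairs (x, y) with x > y, counting ties as 0.5."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

# Hypothetical DEA efficiency scores for two groups of banks
indonesia = [0.91, 0.85, 0.78]
uk = [0.80, 0.70, 0.95]
print(mann_whitney_u(indonesia, uk))  # 5.0
```

The two directed statistics always satisfy U(a, b) + U(b, a) = len(a) * len(b), a quick sanity check; the smaller of the two is compared against the critical value (or used to derive a p-value) to decide whether the groups differ significantly.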

Keywords: data envelopment analysis, efficiency, eurozone crisis, Islamic bank

Procedia PDF Downloads 309
22432 Knowledge Representation and Inconsistency Reasoning of Class Diagram Maintenance in Big Data

Authors: Chi-Lun Liu

Abstract:

Requirements modeling and analysis are important for the successful maintenance of information systems, and Unified Modeling Language (UML) class diagrams are a useful standard for modeling them. To the best of our knowledge, there is a lack of a systems development methodology described by the organism metaphor, whose core concept is adaptation. Knowledge representation and reasoning approaches, and ontologies for adopting new requirements, have emerged in recent years. This paper proposes an organic methodology based on constructivism theory: a knowledge representation and reasoning approach for analyzing new requirements during class diagram maintenance. The process and rules in the proposed methodology automatically analyze inconsistencies in the class diagram. In the big data era, developing an automatic tool based on the proposed methodology to analyze large amounts of class diagram data is an important topic for future research.
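One inconsistency such rules can detect is cyclic generalization. A minimal sketch, assuming the class diagram's inheritance edges are available as a child-to-parent mapping (the class names and representation are hypothetical, not the paper's rule set):

```python
def has_inheritance_cycle(parent_of):
    """Return True if following child -> parent generalization edges
    from any class revisits a class, i.e. the hierarchy is cyclic."""
    for start in parent_of:
        seen, node = set(), start
        while node in parent_of:
            if node in seen:
                return True
            seen.add(node)
            node = parent_of[node]
    return False

print(has_inheritance_cycle({"Order": "Document", "Invoice": "Document"}))  # False
print(has_inheritance_cycle({"A": "B", "B": "C", "C": "A"}))                # True
```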

Keywords: knowledge representation, reasoning, ontology, class diagram, software engineering

Procedia PDF Downloads 214
22431 Investigating the Influence of Self-Confidence on English as a Foreign Language Students' English Language Proficiency Level

Authors: Ali A. Alshahrani

Abstract:

This study aims to identify Saudi English as a Foreign Language (EFL) students' perspectives on using the English language in their studies. It explores students' self-confidence and its association with their actual performance in English courses across their different academic programs. A multimodal methodology was used to fulfill the research purpose and answer the research questions: a 25-item survey questionnaire and final examination grades were used to collect data. Two hundred forty-one students agreed to participate in the study; they completed the questionnaire and agreed to release their final grades as part of the collected data. The data were coded and analyzed with SPSS software. The findings indicated a significant difference in students' performance in English courses between the participants' academic programs; students' self-confidence in their English language skills, on the other hand, did not differ significantly between programs. Data analysis also revealed no correlation between students' self-confidence level and their performance. The study raises further questions about other vital factors, such as course instructors' views of the materials, the views of faculty members in the target department, families' belief in the usefulness of the program, and potential employers' expectations; these views and beliefs shape students' preparation and should therefore be explored further.

Keywords: English language intensive program, language proficiency, performance, self-confidence

Procedia PDF Downloads 114