Search results for: Data delivery
5753 Perspectives on Neuropsychological Testimony
Authors: Valene J. Gresham, MA, Laura A. Brodie
Abstract:
For the last decade, statistics show traumatic brain injury (TBI) is a growing concern in our legal system. In an effort to obtain data regarding the influence of neuropsychological expert witness testimony in a criminal case, this study tested three hypotheses. H1: The majority of jurors will vote not guilty, due to mild head injury. H2: The jurors will give more credence to the testimony of the neuropsychologist than to that of the psychiatrist. H3: The jurors will be more lenient in their sentencing, given the neuropsychologist's testimony. The criteria for inclusion in the study as a participant were identical to those used for eligibility for jury duty in the United States. A chi-squared test was performed to analyze the data for the three hypotheses. The results supported all of the hypotheses; however, statistical significance was seen in H1 and H2 only.
Keywords: Expert witness, jury decision, neuropsychology, traumatic brain injury.
5752 Computation of Flood and Drought Years over the North-West Himalayan Region Using Indian Meteorological Department Rainfall Data
Authors: Sudip Kumar Kundu, Charu Singh
Abstract:
The climatic condition over the Indian region is highly dependent on the monsoon, and India receives the maximum amount of rainfall during the southwest monsoon. The Indian economy is highly dependent on agriculture, so the occurrence of flood and drought years influences the total cultivation system as well as the economy of the country, as Indian agriculture is still highly dependent on monsoon rainfall. The present study investigates the flood and drought years for the north-west Himalayan region from 1951 to 2014 by using area-averaged Indian Meteorological Department (IMD) rainfall data. For this investigation the Normalized Index (NI) has been utilized to determine whether a particular year is a drought or flood year. The data have been extracted for the north-west Himalayan (NWH) region states, namely Uttarakhand (UK), Himachal Pradesh (HP) and Jammu and Kashmir (J&K), to find the rainy-season average rainfall for each year, the climatological mean and the standard deviation. The calculated values have then been plotted to show the results: some years are drought years, some are flood years and the rest are neutral. The flood and drought years can also be related to the large-scale phenomena El Niño and La Niña.
Keywords: Indian Meteorological Department, Rainfall, Normalized index, Flood, Drought, NWH.
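A minimal sketch (not from the paper) of the normalized-index classification described above, assuming NI = (rainy-season rainfall − climatological mean) / standard deviation; the ±1 standard-deviation cut-off, rainfall values and function name are illustrative.

```python
import numpy as np

def classify_years(seasonal_rain, threshold=1.0):
    """Label each year flood/drought/neutral from a normalized rainfall index.

    seasonal_rain: area-averaged rainy-season totals, one per year.
    threshold: hypothetical cut-off on the index (the paper does not state its
    exact value; +/-1 standard deviation is a common convention).
    """
    rain = np.asarray(seasonal_rain, dtype=float)
    mean = rain.mean()              # climatological mean over the study period
    std = rain.std(ddof=1)          # sample standard deviation
    ni = (rain - mean) / std        # normalized index for every year
    labels = np.where(ni >= threshold, "flood",
             np.where(ni <= -threshold, "drought", "neutral"))
    return ni, labels

# Example with made-up rainfall totals (mm), for illustration only.
ni, labels = classify_years([812, 655, 901, 740, 598, 1020, 770])
for year, (value, label) in enumerate(zip(ni, labels), start=1951):
    print(year, round(value, 2), label)
```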
5751 Analyzing Keyword Networks for the Identification of Correlated Research Topics
Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita
Abstract:
The production and publication of scientific works have increased significantly in recent years, with the Internet being the main channel of access and distribution of these works. Consequently, there is a growing interest in understanding how scientific research has evolved, in order to use this knowledge to encourage research groups to become more productive. The objective of this work is therefore to explore repositories containing data from scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, the keywords of each article in the study repository are extracted and the network is characterized, after which several social network analysis metrics are applied to identify the highlighted keywords.
Keywords: Extraction and data integration, bibliometrics, scientometrics.
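A minimal sketch (not from the paper) of the keyword-network idea: build a co-occurrence graph from per-article keyword lists and rank keywords with standard social-network-analysis metrics. The keyword lists and metric choices below are illustrative.

```python
import itertools
import networkx as nx

# Hypothetical per-article keyword lists standing in for the repository data.
articles = [
    ["data mining", "bibliometrics", "clustering"],
    ["bibliometrics", "scientometrics", "co-authorship"],
    ["data mining", "scientometrics", "keyword analysis"],
]

G = nx.Graph()
for keywords in articles:
    # Keywords appearing in the same article are linked (co-occurrence edge).
    for a, b in itertools.combinations(sorted(set(keywords)), 2):
        w = G.get_edge_data(a, b, {}).get("weight", 0)
        G.add_edge(a, b, weight=w + 1)

# Centrality measures used to highlight the most relevant keywords.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
for kw in sorted(G.nodes, key=degree.get, reverse=True):
    print(f"{kw:20s} degree={degree[kw]:.2f} betweenness={betweenness[kw]:.2f}")
```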
5750 Optimizing of Fuzzy C-Means Clustering Algorithm Using GA
Authors: Mohanad Alata, Mohammad Molhim, Abdullah Ramini
Abstract:
The Fuzzy C-Means clustering algorithm (FCM) is a method that is frequently used in pattern recognition. It has the advantage of giving good modeling results in many cases, although it is not capable of specifying the number of clusters by itself. In the FCM algorithm most researchers fix the weighting exponent (m) to a conventional value of 2, which might not be appropriate for all applications. Consequently, the main objective of this paper is to use the subtractive clustering algorithm to provide the optimal number of clusters needed by the FCM algorithm, by optimizing the parameters of the subtractive clustering algorithm with an iterative search approach, and then to find an optimal weighting exponent (m) for the FCM algorithm. In order to obtain an optimal number of clusters, the iterative search approach is used to find the optimal single-output Sugeno-type Fuzzy Inference System (FIS) model by optimizing the parameters of the subtractive clustering algorithm that give the minimum least-squares error between the actual data and the Sugeno fuzzy model. Once the number of clusters is optimized, two approaches are proposed to optimize the weighting exponent (m) in the FCM algorithm, namely the iterative search approach and genetic algorithms. The above-mentioned approach is tested on data generated from the original function, and optimal fuzzy models are obtained with minimum error between the real data and the obtained fuzzy models.
Keywords: Fuzzy clustering, Fuzzy C-Means, Genetic Algorithm, Sugeno fuzzy systems.
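A minimal sketch (not the paper's pipeline) of FCM with an explicit weighting exponent m, plus a coarse search over candidate m values. The paper selects m by the error of the resulting Sugeno model and by a genetic algorithm; here only the FCM objective is reported, and all data and settings are illustrative.

```python
import numpy as np

def fcm(X, c, m, n_iter=100, eps=1e-6, seed=0):
    """Basic fuzzy c-means: returns centroids V and membership matrix U."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(n_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)           # centroid update
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=0)                              # membership update
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return V, U

def objective(X, V, U, m):
    """FCM least-squares objective J_m = sum_i sum_k u_ik^m ||x_k - v_i||^2."""
    d2 = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) ** 2
    return float(((U ** m) * d2).sum())

# Toy 2-D data; the coarse grid over m stands in for the iterative/GA search.
X = np.vstack([np.random.default_rng(1).normal(loc, 0.3, (50, 2)) for loc in (0, 3)])
for m in (1.5, 2.0, 2.5, 3.0):
    V, U = fcm(X, c=2, m=m)
    print(f"m={m}: J={objective(X, V, U, m):.2f}")
```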
5749 Computational Fluid Dynamics Modeling of Downward Bubbly Flows
Authors: Mahmood Reza Rahimi, Hajir Karimi
Abstract:
Downward turbulent bubbly flows in pipes were modeled using computational fluid dynamics tools. The hydrodynamics, phase distribution and turbulent structure of two-phase air-water flow in a vertical pipe of 57.15 mm diameter and 3.06 m length were modeled using the 3-D Eulerian-Eulerian multiphase flow approach. Void fraction, liquid velocity and turbulent fluctuation profiles were calculated and compared against experimental data. The CFD results are in good agreement with the experimental data.
Keywords: CFD, Bubbly flow, Vertical pipe, Population balance modeling, Gas void fraction, Liquid velocity, Normal turbulent stresses.
5748 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design
Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin
Abstract:
In an era of a rapidly growing notebook market, consumer electronics manufacturers are facing a highly dynamic and competitive environment. In particular, product appearance is the first feature by which users distinguish a product from those of other brands. A notebook product should therefore differ in its appearance to engage users and contribute to the user experience (UX). The UX evaluates various product concepts to find the design that meets user needs; in addition, it helps the designer to further understand the appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and the reaction to product appearance. This study proposes a data mining framework to capture user information and the important relations among product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rule generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan that recruited 168 subjects from different backgrounds to experience the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and the product concepts related to the UX, which helps to launch products to the right customers and increase market share. The results demonstrate the practical feasibility of the proposed framework.
Keywords: Consumers Decision Making, Product Design, Rough Set Theory, User Experience.
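A minimal sketch (not from the paper) of the rough-set idea named in the keywords: lower and upper approximations of a decision class in a small consumer-background decision table. The attributes, values and decision classes are invented for illustration.

```python
from collections import defaultdict

# Toy decision table: condition attributes (age group, gender) -> preferred design.
table = [
    ({"age": "young", "gender": "F"}, "slim"),
    ({"age": "young", "gender": "F"}, "slim"),
    ({"age": "young", "gender": "M"}, "slim"),
    ({"age": "young", "gender": "M"}, "rugged"),   # inconsistent with the previous row
    ({"age": "older", "gender": "M"}, "rugged"),
]

def approximations(table, target):
    """Rough-set lower/upper approximation of the decision class `target`."""
    blocks = defaultdict(list)                     # indiscernibility classes
    for i, (cond, _) in enumerate(table):
        blocks[tuple(sorted(cond.items()))].append(i)
    lower, upper = set(), set()
    for members in blocks.values():
        decisions = {table[i][1] for i in members}
        if decisions == {target}:
            lower.update(members)                  # block certainly implies target
        if target in decisions:
            upper.update(members)                  # block possibly implies target
    return lower, upper

low, up = approximations(table, "slim")
print("lower approximation:", sorted(low))         # basis for certain rules
print("upper approximation:", sorted(up))          # basis for possible rules
```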
5747 Evaluation of the Role of Advocacy and the Quality of Care in Reducing Health Inequalities for People with Autism, Intellectual and Developmental Disabilities at Sheffield Teaching Hospitals
Authors: Jonathan Sahu, Jill Aylott
Abstract:
Individuals with Autism, Intellectual and Developmental Disabilities (AIDD) are one of the most vulnerable groups in society, hampered not only by their own limitations in understanding and interacting with the wider society, but also by societal limitations in perception and understanding. Communication to express their needs and wishes is fundamental to enable such individuals to live and prosper in society. This research project was designed as an organisational case study, in a large secondary health care hospital within the National Health Service (NHS), to assess the quality of care provided to people with AIDD and to review the role of advocacy in reducing health inequalities in these individuals. Methods: The research methodology adopted was that of an “insider researcher”. Data collection included both quantitative and qualitative data, i.e. a mixed-method approach. A semi-structured interview schedule was designed and used to obtain qualitative and quantitative primary data from a wide range of interdisciplinary frontline health care workers, to assess their understanding and awareness of systems, processes and evidence-based practice for offering a quality service to people with AIDD. Secondary data were obtained from sources within the organisation, in keeping with “case study” as a primary method, and organisational performance data were then compared against national benchmarking standards. Further data sources were accessed to help evaluate the effectiveness of the different types of advocacy present in the organisation. This was gauged by measures of user and carer experience in the form of retrospective survey analysis, incidents and complaints. Results: Secondary data demonstrate near compliance of the organisation with the current national benchmarking standard (Monitor Compliance Framework). However, primary data demonstrate poor knowledge of the Mental Capacity Act 2005 and poor knowledge of the organisational systems, processes and evidence-based practice applied for people with AIDD. In addition, there was poor knowledge and awareness among frontline health care workers of advocacy and advocacy schemes for this group. Conclusions: A significant amount of work needs to be undertaken to improve the quality of care delivered to individuals with AIDD. An operational strategy promoting the widespread dissemination of information may not be the best approach to deliver quality care, optimal patient experience and patient advocacy. In addition, a more robust set of standards, with appropriate metrics, needs to be developed to assess organisational performance in a way that will stand the test of professional and public scrutiny.
Keywords: Autism, intellectual developmental disabilities, advocacy, health inequalities, quality of care.
5746 Impacts of Financial Development and Operating Scale on Bank Efficiencies in Taiwan
Authors: Ying-Hsiu Chen, Pao-Peng Hsu
Abstract:
This paper adopts a two-stage data envelopment analysis to explore the impacts of financial development and bank operating scale on bank efficiencies. The sample comprises unbalanced panel data on 32 Taiwanese listed domestic commercial banks over the period 1998 to 2013. Empirical results show that pure technical efficiency is positively related to financial development, whereas the effect of financial development on scale efficiency is insignificant. Enlargement of bank operating scale improves bank efficiencies, but the efficiency gains decrease gradually as the scale increases. Increases in the capital adequacy ratio and market power of loans lead to growth in bank efficiencies.
Keywords: Financial development, Operating scale, Efficiency, DEA.
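A minimal sketch (not from the paper) of the efficiency stage of a DEA study: the standard input-oriented CCR model solved as a linear program for each bank. The second-stage regression on financial development is not reproduced, and the inputs, outputs and toy data are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR DEA efficiency scores under constant returns to scale.

    X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                    # minimise theta; vars = [theta, lambdas]
        A_ub, b_ub = [], []
        for i in range(m):                             # inputs: sum_j lambda_j x_ij <= theta * x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]])
            b_ub.append(0.0)
        for r in range(s):                             # outputs: sum_j lambda_j y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]])
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Toy bank data (inputs: staff, capital; output: loans), for illustration only.
X = np.array([[20, 300], [35, 500], [50, 400]], dtype=float)
Y = np.array([[100], [120], [150]], dtype=float)
print(ccr_efficiency(X, Y).round(3))
```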
5745 Comparative Study of Decision Trees and Rough Sets Theory as Knowledge Extraction Tools for Design and Control of Industrial Processes
Authors: Marcin Perzyk, Artur Soroczynski
Abstract:
General requirements for knowledge representation in the form of logic rules, applicable to the design and control of industrial processes, are formulated. The characteristic behavior of decision trees (DTs) and rough sets theory (RST) in extracting rules from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer than that of RST in several important aspects, particularly when not only a characterization of a problem is required but also detailed and precise rules tailored to the actual, specific problems to be solved.
Keywords: Knowledge extraction, decision trees, rough sets theory, industrial processes.
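A minimal sketch (not from the paper) of how logic rules can be read off a fitted decision tree: each root-to-leaf path is one "if ... then ..." rule of the kind compared against rough-set rules. The data set and depth limit are illustrative.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

# Stand-in data set; the paper uses simulated and industrial process data.
data = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(data.data, data.target)

# Print the tree as nested if/then conditions, i.e. the extracted rule set.
print(export_text(tree, feature_names=list(data.feature_names)))
```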
5744 Urbanization and Income Inequality in Thailand
Authors: Acumsiri Tantiakrnpanit
Abstract:
This paper aims to examine the relationship between urbanization and income inequality in Thailand during the period 2002–2020, using a panel of data for 76 provinces collected from Thailand’s National Statistical Office (Labor Force Survey: LFS), as well as geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night band (VIIRS-DNB) satellite for 19 selected years. This paper employs two different definitions to identify urban areas: 1) Urban areas defined by Thailand's National Statistical Office (LFS), and 2) Urban areas estimated using nighttime light data from the DMSP and VIIRS-DNB satellite. The second method includes two sub-categories: 2.1) Determining urban areas by calculating nighttime light density with a population density of 300 people per square kilometer, and 2.2) Calculating urban areas based on nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis based on Ordinary Least Squares (OLS), fixed effects, and random effects models reveals a consistent U-shaped relationship between income inequality and urbanization. The findings from the econometric analysis demonstrate that urbanization or population density has a significant and negative impact on income inequality. Moreover, the square of urbanization shows a statistically significant positive impact on income inequality. Additionally, there is a negative association between logarithmically transformed income and income inequality. This paper also proposes the inclusion of satellite imagery, geospatial data, and spatial econometric techniques in future studies to conduct quantitative analysis of spatial relationships.
Keywords: Income inequality, nighttime light, population density, Thailand, urbanization.
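A minimal sketch (not the paper's estimation) of the fixed-effects specification described above: inequality regressed on urbanization, its square and log income, with province dummies. The synthetic panel, variable names and coefficients are invented purely to make the example runnable.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic province-year panel standing in for the LFS / nighttime-light data.
rng = np.random.default_rng(0)
panel = pd.DataFrame({
    "province": np.repeat([f"P{i}" for i in range(10)], 19),
    "urban": rng.uniform(0.1, 0.9, 190),          # urbanization share
    "log_income": rng.normal(10, 0.5, 190),
})
# A U-shaped relationship is built into the toy data for illustration.
panel["gini"] = (0.6 - 0.8 * panel["urban"] + 0.7 * panel["urban"] ** 2
                 - 0.02 * panel["log_income"] + rng.normal(0, 0.01, 190))

# Fixed effects via province dummies; negative linear and positive squared term
# reproduce the U-shape reported in the abstract.
model = smf.ols("gini ~ urban + I(urban**2) + log_income + C(province)",
                data=panel).fit()
print(model.params[~model.params.index.str.startswith("C(")])
```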
5743 Impact of Grade Sensitivity on Learning Motivation and Academic Performance
Authors: Salwa Aftab, Sehrish Riaz
Abstract:
The objective of this study was to examine the impact of grade sensitivity on the learning motivation and academic performance of students, to determine the degree of difference that exists among students regarding the causes of their learning motivation, and to gain knowledge about this matter since it has not been adequately researched. Data collection was primarily done through the academic sector of Pakistan and relied solely upon the responses given by students. A sample of 208 university students was selected. Both paper and online surveys were used to collect data from respondents. The results of the study revealed that grade sensitivity has a positive relationship with the learning motivation of students and their academic performance. These findings were obtained through systematic correlation and regression analysis.
Keywords: Academic performance, correlation, grade sensitivity, learning motivation, regression.
5742 3D Star Skeleton for Fast Human Posture Representation
Authors: Sungkuk Chun, Kwangjin Hong, Keechul Jung
Abstract:
In this paper, we propose an improved 3D star skeleton technique, a skeletonization suitable for human posture representation that reflects the 3D information of the posture. Moreover, the proposed technique is simple and can therefore be performed in real time. Existing skeleton construction techniques, such as distance transformation, Voronoi diagrams, and thinning, focus on the precision of skeleton information. Therefore, those techniques are not applicable to real-time posture recognition, since they are computationally expensive and highly susceptible to boundary noise. Although a 2D star skeleton was proposed to overcome these problems, it also has limitations in describing the 3D information of the posture. To represent human posture effectively, the constructed skeleton should take the 3D information of the posture into account. The proposed 3D star skeleton contains 3D data of the human body and focuses on human action and posture recognition. Our 3D star skeleton uses eight projection maps, which contain 2D silhouette information and depth data of the human surface. The extremal points can then be extracted as the features of the 3D star skeleton without searching the whole boundary of the object. Therefore, in terms of execution time, our 3D star skeleton is faster than the “greedy” 3D star skeleton that uses all the boundary points on the surface. Moreover, our method can offer a more accurate skeleton of the posture than the existing star skeleton, since the 3D data of the object are taken into account. Additionally, we build a codebook, a collection of representative 3D star skeletons for seven postures, to recognize the posture of a constructed skeleton.
Keywords: Computer vision, gesture recognition, skeletonization, human posture representation.
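A minimal sketch (not the paper's full method) of the star-skeleton building block applied to one projection map: extremal points are local maxima of the smoothed distance from the silhouette boundary to its centroid. The boundary data and smoothing window are illustrative.

```python
import numpy as np

def star_skeleton_extremes(boundary, smooth=5):
    """Extremal points of a silhouette boundary relative to its centroid.

    boundary: (n, 2) array of ordered boundary points of one projection map.
    Returns indices of local maxima of the smoothed centroid-distance signal,
    i.e. candidate skeleton end points (head, hands, feet in a body mask).
    """
    boundary = np.asarray(boundary, dtype=float)
    centroid = boundary.mean(axis=0)
    dist = np.linalg.norm(boundary - centroid, axis=1)
    kernel = np.ones(smooth) / smooth
    padded = np.r_[dist[-smooth:], dist, dist[:smooth]]          # circular padding
    dist_s = np.convolve(padded, kernel, mode="same")[smooth:-smooth]
    prev = np.roll(dist_s, 1)
    nxt = np.roll(dist_s, -1)
    return np.where((dist_s > prev) & (dist_s >= nxt))[0]

# Toy star-shaped boundary with five lobes around a centre.
t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r = 1.0 + 0.4 * np.cos(5 * t)
boundary = np.c_[r * np.cos(t), r * np.sin(t)]
print("extremal points found:", len(star_skeleton_extremes(boundary)))
```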
5741 Polymeric Sustained Biodegradable Patch Formulation for Wound Healing
Authors: Abhay Asthana, Gyati Shilakari Asthana
Abstract:
Patient compliance and stability, in combination with controlled drug delivery and biocompatibility, form the core features of the present research and development of a sustained biodegradable patch formulation intended for wound healing. The aim was to impart sustained degradation, a sterile formulation, significant folding endurance, elasticity, biodegradability, bio-acceptability and strength. The optimized formulation comprised the polymers hydroxypropyl methyl cellulose, ethyl cellulose and gelatin, citric acid-PEG-citric acid (CPEGC) triblock dendrimers, and active curcumin. The polymeric mixture was dissolved in geometric order in a suitable medium through continuous stirring under ambient conditions. With continued stirring, curcumin was added with the aid of DCM and methanol in an optimized ratio to obtain a homogeneous dispersion. The dispersion was sonicated at an optimum frequency for a given time and later cast into patch form. All steps were carried out under strict aseptic conditions. The formulations obtained in the acceptable working range were selected based on thickness, uniformity of drug content, smooth texture, flexibility and brittleness. The patch, kept on stability using butter paper in a sterile pack, displayed a folding endurance in the range of 20 to 23 folds without any evidence of cracking in the optimized formulation at room temperature (RT) (24 ± 2°C). The patch displayed acceptable parameters after a stability study conducted under refrigerated conditions (8 ± 0.2°C) and at RT (24 ± 2°C) for up to 90 days. Further, no significant changes were observed in critical parameters such as elasticity, biodegradability, drug release and drug content during the stability study conducted at RT (24 ± 2°C) for 45 and 90 days. The drug content was in the range of 95 to 102%, the moisture content did not exceed 19.2%, and the patch passed the content uniformity test. The percentage cumulative drug release was found to be 80% in 12 h and matched the biodegradation rate, with a correlation factor R2 > 0.9. The biodegradable patch-based formulation developed shows promising results in terms of stability and release profiles.
Keywords: Sustained biodegradation, wound healing, polymeric patch, stability.
5740 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective
Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou
Abstract:
The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated to area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that aim for homogeneity in the number of population and households. Compared to the outcomes of traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and the township unit on mortality data and examines the spatial characteristics of their outcomes. For the mortality data between January 1st, 2008 and December 31st, 2010 for Taitung County, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures, and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
Keywords: Mortality map, spatial patterns, statistical area, variation.
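A minimal sketch (not from the paper) of the direct age-standardization behind the ASDR values quoted above; the age groups, counts and standard population are invented, and the reference standard actually used by the paper is not stated here.

```python
import numpy as np

def asdr(deaths, population, standard_pop, per=100_000):
    """Directly age-standardized death rate for one statistical area.

    deaths, population: counts per age group in the area.
    standard_pop: reference population per age group (e.g. a national or WHO
    standard); only its age structure (weights) matters.
    """
    deaths = np.asarray(deaths, dtype=float)
    population = np.asarray(population, dtype=float)
    weights = np.asarray(standard_pop, dtype=float)
    weights = weights / weights.sum()                  # standard age structure
    age_specific = np.divide(deaths, population,
                             out=np.zeros_like(deaths), where=population > 0)
    return per * float((age_specific * weights).sum())

# Illustrative counts for three age groups in one dissemination area.
print(round(asdr(deaths=[2, 10, 40], population=[1200, 900, 400],
                 standard_pop=[30_000, 45_000, 25_000]), 1))
```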
5739 A New Approach for Image Segmentation using Pillar-Kmeans Algorithm
Authors: Ali Ridho Barakbah, Yasushi Kiyoki
Abstract:
This paper presents a new approach to image segmentation by applying the Pillar-Kmeans algorithm. The segmentation process includes a new mechanism for clustering the elements of high-resolution images in order to improve precision and reduce computation time. The system applies K-means clustering to the image segmentation after optimization by the Pillar algorithm. The Pillar algorithm considers the pillars' placement, which should be located as far as possible from each other to withstand the pressure distribution of a roof, as analogous to the placement of centroids within the data distribution. This algorithm is able to optimize the K-means clustering for image segmentation in terms of precision and computation time. It designates the initial centroids' positions by calculating the accumulated distance metric between each data point and all previous centroids, and then selects the data points that have the maximum distance as new initial centroids. In this way, all initial centroids are distributed according to the maximum accumulated distance metric. This paper evaluates the proposed approach for image segmentation by comparing it with the K-means and Gaussian Mixture Model algorithms, using the RGB, HSV, HSL and CIELAB color spaces. The experimental results demonstrate the effectiveness of our approach in improving segmentation quality in terms of precision and computation time.
Keywords: Image segmentation, K-means clustering, Pillar algorithm, color spaces.
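A minimal sketch of the seeding step described above: each new initial centroid is the point with the maximum accumulated distance to all previously chosen centroids. The choice of the first pillar and the toy pixel data are assumptions for illustration; refinements of the full Pillar algorithm (e.g. outlier handling) are omitted.

```python
import numpy as np

def pillar_initial_centroids(X, k):
    """Pillar-style seeding: spread initial centroids by accumulated distance."""
    X = np.asarray(X, dtype=float)
    # First pillar (assumption): point farthest from the grand mean of the data.
    first = np.argmax(np.linalg.norm(X - X.mean(axis=0), axis=1))
    centroids = [X[first]]
    accumulated = np.zeros(len(X))
    while len(centroids) < k:
        accumulated += np.linalg.norm(X - centroids[-1], axis=1)  # add distance to newest centroid
        centroids.append(X[np.argmax(accumulated)])               # farthest accumulated point
    return np.array(centroids)

# Toy pixel features (e.g. RGB values flattened from an image).
rng = np.random.default_rng(0)
pixels = np.vstack([rng.normal(c, 5, (200, 3)) for c in (30, 120, 220)])
seeds = pillar_initial_centroids(pixels, k=3)
print(seeds.round(1))          # these seeds then initialise standard K-means
```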
5738 The Relations between Seismic Results and Groundwater near the Gokpinar Damp Area, Denizli, Turkey
Authors: Mahmud Gungor, Ali Aydin, Erdal Akyol, Suat Tasdelen
Abstract:
Understanding the geotechnical characteristics of near-surface material and the effects of groundwater is a very important problem in site studies. To show the relations between seismic data and groundwater, we selected an area of about 25 km2 as the study area, and a detailed study of the seismic data and groundwater depths of the Gokpinar Damp area is presented. Seismic wave velocities (Vp and Vs) are very important parameters indicating soil properties. The seismic records were acquired using the multichannel analysis of surface waves (MASW) method in the area of the Gokpinar Damp. Sixty sites in this area have been investigated with survey lines about 60 m in length, and the MASW method has been used to generate one-dimensional shear wave velocity profiles at these locations. These shear wave velocities are used to estimate the equivalent shear wave velocity in the study area at every 2 and 5 m interval up to a depth of 45 m. The levels of equivalent shear wave velocity of the soil are then used to classify the study area. The results of the study should be considered as components of urban planning and building design in the Gokpinar Damp area, Denizli, and their application and use should be required and enforced by municipal authorities.
Keywords: Seismic data, Gokpinar Damp, urban planning, Denizli.
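A minimal sketch (not from the paper) of how an equivalent shear-wave velocity over a layered 1-D profile is commonly computed, assuming the usual time-averaged definition Vs_eq = Σd_i / Σ(d_i/Vs_i); the layer thicknesses and velocities below are illustrative.

```python
def equivalent_vs(thicknesses, velocities):
    """Time-averaged (equivalent) shear-wave velocity over a layered profile."""
    total_depth = sum(thicknesses)
    travel_time = sum(d / v for d, v in zip(thicknesses, velocities))  # vertical travel time
    return total_depth / travel_time

# Layers from a single MASW-derived 1-D profile (thickness m, Vs m/s), illustrative.
layers = [(2, 180), (3, 240), (5, 320), (10, 450), (25, 600)]
print(round(equivalent_vs([d for d, _ in layers], [v for _, v in layers]), 1), "m/s")
```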
5737 Human Absorbed Dose Estimation of a New In-111 Imaging Agent Based on Rat Data
Authors: H. Yousefnia, S. Zolghadri
Abstract:
The measurement of organ radiation exposure dose is one of the most important initial steps in developing a new radiopharmaceutical. In this study, dosimetric studies of a novel agent for SPECT imaging of bone metastasis, the 111In-1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraethylene phosphonic acid (111In-DOTMP) complex, have been carried out to estimate the dose in human organs based on data derived from rats. The radiolabeled complex was prepared with high radiochemical purity under optimal conditions. Biodistribution studies of the complex were carried out in male Syrian rats at selected times after injection (2, 4, 24 and 48 h). The human absorbed dose estimation of the complex was made based on the data derived from the rats using the radiation absorbed dose assessment resource (RADAR) method. The 111In-DOTMP complex was prepared with a high radiochemical purity of >99% (ITLC). The total-body effective absorbed dose for 111In-DOTMP was 0.061 mSv/MBq. This value is comparable to that of other clinically used 111In complexes. The results show that the dose to the critical organs is satisfactory, within the acceptable range for diagnostic nuclear medicine procedures. Generally, 111In-DOTMP has interesting characteristics and can be considered a viable agent for SPECT imaging of bone metastasis in the near future.
Keywords: In-111, DOTMP, Internal Dosimetry, RADAR.
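A minimal sketch (not the paper's calculation) of the RADAR-style dose estimate D = N x DF for a single source organ, where N is the cumulated activity from the time-activity curve and DF is a published dose factor. The activity values, dose factor and decay-tail assumption below are purely illustrative.

```python
import numpy as np

def absorbed_dose(times_h, activity_mbq, dose_factor, half_life_h=67.3):
    """RADAR-style estimate for one source organ: D = cumulated activity x DF.

    times_h / activity_mbq: organ time-activity samples (the 2, 4, 24 and 48 h
    points mentioned in the abstract, with made-up activities); dose_factor is
    a dose factor in mGy per MBq.h taken from published RADAR tables (the value
    used in the example is not the one for 111In-DOTMP).
    """
    t = np.asarray(times_h, dtype=float)
    a = np.asarray(activity_mbq, dtype=float)
    # Trapezoidal area under the time-activity curve, plus a physical-decay
    # tail after the last sample (In-111 half-life about 67.3 h).
    area = float(np.sum((a[1:] + a[:-1]) / 2.0 * np.diff(t)))
    tail = a[-1] / (np.log(2) / half_life_h)
    return (area + tail) * dose_factor

print(round(absorbed_dose([2, 4, 24, 48], [5.0, 4.2, 2.1, 0.9], dose_factor=2.0e-3), 3), "mGy")
```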
5736 Novel GPU Approach in Predicting the Directional Trend of the S&P 500
Authors: A. J. Regan, F. J. Lidgey, M. Betteridge, P. Georgiou, C. Toumazou, K. Hayatleh, J. R. Dibble
Abstract:
Our goal is the development of an algorithm capable of predicting the directional trend of the Standard and Poor's 500 index (S&P 500). Extensive research has been published attempting to predict different financial markets using historical data, testing on an in-sample and trend basis, with many authors employing excessively complex mathematical techniques. In reviewing and evaluating these in-sample methodologies, it became evident that this approach was unable to achieve sufficiently reliable prediction performance for commercial exploitation. For these reasons, we moved to an out-of-sample strategy based on linear regression analysis of an extensive set of financial data correlated with historical closing prices of the S&P 500. We are pleased to report a directional trend accuracy of greater than 55% for tomorrow (t+1) in predicting the S&P 500.
Keywords: Financial algorithm, GPU, S&P 500, stock market prediction.
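A minimal sketch (not the paper's algorithm) of an out-of-sample, walk-forward linear regression with a t+1 directional hit rate. The predictor set, window length and synthetic data are assumptions; the GPU implementation is not shown.

```python
import numpy as np

def directional_accuracy(features, target, train_window=250):
    """Walk-forward OLS forecast and directional hit rate for the next day.

    features: (T, k) predictor matrix (e.g. lagged returns of related markets);
    target: (T,) next-day index return to be predicted. Each forecast uses only
    data strictly before the day being predicted.
    """
    hits, total = 0, 0
    for t in range(train_window, len(target)):
        X = np.c_[np.ones(train_window), features[t - train_window:t]]
        y = target[t - train_window:t]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # OLS fit on past data only
        pred = np.r_[1.0, features[t]] @ beta             # forecast for day t
        hits += int(np.sign(pred) == np.sign(target[t]))
        total += 1
    return hits / total

# Synthetic data with a weak predictable component, for illustration only.
rng = np.random.default_rng(0)
drivers = rng.normal(size=(1500, 3))
ret = 0.1 * drivers[:, 0] + rng.normal(scale=1.0, size=1500)
print(f"directional accuracy: {directional_accuracy(drivers, ret):.3f}")
```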
5735 The Performance of Predictive Classification Using Empirical Bayes
Authors: N. Deetae, S. Sukparungsee, Y. Areepong, K. Jampachaisri
Abstract:
This research aims to compare the percentage of correct classification of the Empirical Bayes method (EB) with that of the Classical method when data are constructed as near normal, short-tailed and long-tailed symmetric, and short-tailed and long-tailed asymmetric. The study is performed using a conjugate prior and a normal distribution with known mean and unknown variance. The estimated hyper-parameters obtained from the EB method are substituted into the posterior predictive probability and used to predict new observations. Data are generated, consisting of a training set and a test set, with sample sizes of 100, 200 and 500 for binary classification. The results show that the EB method exhibited an improved performance over the Classical method in all situations under study.
Keywords: Classification, Empirical Bayes, Posterior predictive probability.
5734 Improved C-Fuzzy Decision Tree for Intrusion Detection
Authors: Krishnamoorthi Makkithaya, N. V. Subba Reddy, U. Dinesh Acharya
Abstract:
As the number of networked computers grows, intrusion detection is an essential component in keeping networks secure. Various approaches to intrusion detection are currently in use, each with its own merits and demerits. This paper presents our work to test and improve the performance of a new class of decision tree, the c-fuzzy decision tree, for detecting intrusion. The work also includes identifying the best candidate feature subset for building an efficient c-fuzzy decision tree based Intrusion Detection System (IDS). We investigated the usefulness of the c-fuzzy decision tree for developing an IDS with a data partition based on horizontal fragmentation. Empirical results indicate the usefulness of our approach in developing an efficient IDS.
Keywords: Data mining, Decision tree, Feature selection, Fuzzy c-means clustering, Intrusion detection.
5733 A Web Pages Automatic Filtering System
Authors: O. Nouali, A. Saidi, H. Chahrat, A. Krinah, B. Toursel
Abstract:
This article describes an automatic Web page filtering system. It is an open and dynamic system based on a multi-agent architecture. The system is built from a set of agents, each having a precise filtering task to carry out (the filtering process is broken up into several elementary treatments, each producing a partial solution). New criteria can be added to the system without stopping its execution or modifying its environment. We want to show the applicability and adaptability of the multi-agent approach to the automatic filtering of network information. In practice, most existing filtering systems are based on modular design approaches, which are limited to centralized applications whose role is to resolve static data flow problems. Web page filtering systems, in contrast, are characterized by a data flow that varies dynamically.
Keywords: Agent, Distributed Artificial Intelligence, Multi-agent systems, Web pages filtering.
5732 Reconstruction of the Most Energetic Modes in a Fully Developed Turbulent Channel Flow with Density Variation
Authors: Elteyeb Eljack, Takashi Ohta
Abstract:
Proper orthogonal decomposition (POD) is used to reconstruct spatio-temporal data of a fully developed turbulent channel flow with density variation at a Reynolds number of 150, based on the friction velocity and the channel half-width, and a Prandtl number of 0.71. To apply POD to the fully developed turbulent channel flow with density variation, the flow field (velocities, density, and temperature) is scaled by the corresponding root mean square (rms) values so that the flow field becomes dimensionless. A five-vector POD problem is then solved numerically. The second-order moments of velocity, temperature, and density reconstructed from the POD eigenfunctions compare favorably with the original Direct Numerical Simulation (DNS) data.
Keywords: Pattern Recognition, POD, Coherent Structures, Low dimensional modelling.
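A minimal sketch (not the paper's solver) of snapshot POD via the SVD, including the rms scaling mentioned in the abstract and a low-order reconstruction from the most energetic modes. The snapshot matrix below is random stand-in data, not DNS output.

```python
import numpy as np

def pod_reconstruct(snapshots, rms_scale, n_modes):
    """Snapshot POD via SVD and reconstruction from the leading modes.

    snapshots: (n_points, n_snapshots) matrix whose rows stack the flow
    variables (velocities, temperature, density); rms_scale: per-row rms values
    used to make the field dimensionless before decomposition.
    """
    scaled = snapshots / rms_scale[:, None]
    mean = scaled.mean(axis=1, keepdims=True)
    fluct = scaled - mean                              # POD acts on fluctuations
    U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
    modes = U[:, :n_modes]                             # most energetic spatial modes
    coeffs = np.diag(s[:n_modes]) @ Vt[:n_modes]       # temporal coefficients
    recon = (mean + modes @ coeffs) * rms_scale[:, None]
    energy = s[:n_modes] ** 2 / (s ** 2).sum()         # energy captured per mode
    return recon, energy

# Toy snapshot matrix standing in for the DNS data.
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 64))
rms = data.std(axis=1) + 1e-12
recon, energy = pod_reconstruct(data, rms, n_modes=5)
print("captured energy fraction:", round(float(energy.sum()), 3))
```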
5731 Strategic Leadership and Sustainable Project Management in Enugu, Nigeria
Authors: Nnadi Ezekiel Ejiofor
Abstract:
The study investigates the connection between strategic leadership and project management sustainability, with an emphasis on building projects in Nigeria. The study set out to accomplish two specific goals: first, to establish a link between creative project management and resource efficiency in construction projects in Nigeria; and second, to establish a link between innovative thinking and waste minimization in those same projects. A structured questionnaire was used to collect primary data from 45 registered construction enterprises in the study area as part of the study's descriptive research approach. Due to the nonparametric nature of the data, Spearman Rank Order Correlation was used to evaluate the acquired data. The findings demonstrate that creative project management had a significant positive impact on resource efficiency in construction projects carried out by project management firms (r = .849; p < .001), and that innovative thinking had a significant impact on waste reduction in those same projects (r = .849; p < .001). It was determined that strategic leadership had a significant impact on the sustainability of project management, and it was thus advised that project managers should foresee, prepare for, and effectively communicate present and future developments to project staff in order to ensure that the objective of sustainable initiatives, such as recycling and reuse, is implemented in construction projects.
Keywords: Construction, project management, strategic leadership, sustainability, waste reduction.
5730 A Renovated Cook's Distance Based on the Buckley-James Estimate in Censored Regression
Authors: Nazrina Aziz, Dong Q. Wang
Abstract:
Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's distance RD*_i has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimates when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i in which information from all p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
Keywords: Buckley-James estimators, censored regression, censored data, diagnostic analysis, product-limit estimator, renovated Cook's Distance.
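A minimal sketch (not from the paper) of the uncensored baseline that RD*_i generalizes: the classical Cook's distance for ordinary least squares. The censored, Buckley-James version itself is not reproduced, and the toy data are illustrative.

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance D_i = r_i^2 h_ii / (p s^2 (1 - h_ii)^2) for OLS."""
    X = np.c_[np.ones(len(y)), np.asarray(X, dtype=float)]     # add intercept column
    y = np.asarray(y, dtype=float)
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T                       # hat matrix
    h = np.diag(H)
    resid = y - H @ y
    s2 = resid @ resid / (n - p)                                # residual variance estimate
    return (resid ** 2) * h / (p * s2 * (1 - h) ** 2)

# Toy data with one deliberately influential case.
rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = 2 * x + rng.normal(scale=0.5, size=30)
x[0], y[0] = 4.0, -8.0
print("most influential case:", int(np.argmax(cooks_distance(x.reshape(-1, 1), y))))
```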
5729 Quality of Service in Multioperator GPON Access Networks with Triple-Play Services
Authors: Germán Santos-Boada, Jordi Domingo-Pascual
Abstract:
Recently, in some places, optical-fibre access networks using GPON technology have been deployed by organizations (in most cases public bodies) that act as neutral operators. These operators simultaneously provide network services to various telecommunications operators that offer integrated voice, data and television services. This situation creates new problems related to quality of service, since the interests of the users are intermingled with the interests of the operators. In this paper, we analyse this problem and consider solutions that make it possible to provide guaranteed quality of service for voice over IP, data services and interactive digital television.
Keywords: GPON networks, multioperator, quality of service, triple-play services.
5728 Gesture Recognition by Data Fusion of Time-of-Flight and Color Cameras
Authors: Piercarlo Dondi, Luca Lombardi, Marco Porta
Abstract:
In recent years, numerous applications of Human-Computer Interaction have exploited the capabilities of Time-of-Flight cameras to achieve more comfortable and precise interactions. In particular, gesture recognition is one of the most active fields. This work presents a new method for interacting with a virtual object in a 3D space. Our approach is based on the fusion of depth data, supplied by a ToF camera, with color information, supplied by an HD webcam. The hand detection procedure does not require any learning phase and is able to concurrently manage gestures of two hands. The system is robust to the presence of other objects or people in the scene, thanks to the use of the Kalman filter for maintaining the tracking of the hands.
Keywords: Gesture recognition, human-computer interaction, Time-of-Flight camera.
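A minimal sketch (not the paper's implementation) of the tracking role the Kalman filter plays above: a constant-velocity filter smoothing one hand's (x, y) position. The measurements are assumed to come from the upstream depth/color hand detection, and the noise parameters and frame rate are illustrative.

```python
import numpy as np

class HandTracker:
    """Constant-velocity Kalman filter tracking one hand's pixel position."""

    def __init__(self, dt=1 / 30, q=1e-2, r=4.0):
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)   # motion model
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)   # we observe position only
        self.Q = q * np.eye(4)                                         # process noise
        self.R = r * np.eye(2)                                         # measurement noise (pixels^2)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 100.0

    def update(self, z):
        # Predict with the motion model, then correct with the detected hand centre z.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)                       # Kalman gain
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                                              # smoothed position

tracker = HandTracker()
for z in [(320, 240), (324, 238), (330, 236)]:
    print(tracker.update(z).round(1))
```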
5727 System Identification Based on Stepwise Regression for Dynamic Market Representation
Authors: Alexander Efremov
Abstract:
A system for market identification (SMI) is presented. The resulting representations are multivariable dynamic demand models. The specifics of the market are analyzed, and appropriate models and identification techniques are chosen. Multivariate static and dynamic models are used to represent the market behavior. The steps of the first stage of SMI, named data preprocessing, are outlined. Next, the second stage, the model estimation, is considered in more detail. Stepwise linear regression (SWR) is used to determine the significant cross-effects and the orders of the model polynomials. The estimates of the model parameters are obtained by a numerically stable estimator. Real market data are used to analyze SMI performance. The main conclusion relates to the applicability of multivariate dynamic models for the representation of market systems.
Keywords: Market identification, dynamic models, stepwise regression.
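A minimal sketch (not the paper's SMI) of forward stepwise linear regression for picking significant regressors such as cross-effects and lag terms. The column names, p-value rule and threshold are assumptions for illustration, not the paper's exact settings.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X, y, threshold=0.05):
    """Add, one at a time, the candidate regressor with the smallest p-value,
    stopping when no remaining candidate is significant at `threshold`."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = model.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= threshold:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy demand data: own price, competitor price, lagged sales, and a noise column.
rng = np.random.default_rng(0)
X = pd.DataFrame({"price": rng.normal(size=300), "comp_price": rng.normal(size=300),
                  "sales_lag1": rng.normal(size=300), "noise": rng.normal(size=300)})
y = 5 - 2 * X["price"] + 1.5 * X["comp_price"] + 0.8 * X["sales_lag1"] + rng.normal(size=300)
print(forward_stepwise(X, y))       # expected to keep the three informative columns
```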
5726 A Hybrid Multi-Criteria Hotel Recommender System Using Explicit and Implicit Feedbacks
Authors: Ashkan Ebadi, Adam Krzyzak
Abstract:
Recommender systems, also known as recommender engines, have become an important research area and are now being applied in various fields. In addition, the techniques behind recommender systems have improved over time. In general, such systems help users to find their required products or services (e.g. books, music) by analyzing and aggregating other users' activities and behavior, mainly in the form of reviews, and making the best recommendations. The recommendations can facilitate the user's decision-making process. Despite the wide literature on the topic, using multiple data sources of different types as the input has not been widely studied. Recommender systems can benefit from the high availability of digital data to collect input data of different types, which implicitly or explicitly help the system to improve its accuracy. Moreover, most of the existing research in this area is based on single rating measures, in which a single rating is used to link users to items. This paper proposes a highly accurate hotel recommender system, implemented in various layers. Using a multi-aspect rating system and benefitting from large-scale data of different types, the recommender system suggests hotels that are personalized and tailored for the given user. The system employs natural language processing and topic modelling techniques to assess the sentiment of the users' reviews and extract implicit features. The entire recommender engine contains multiple sub-systems, namely user clustering, a matrix factorization module, and a hybrid recommender system. Each sub-system contributes to the final composite set of recommendations by covering a specific aspect of the problem. The accuracy of the proposed recommender system has been tested intensively, and the results confirm the high performance of the system.
Keywords: Tourism, hotel recommender system, hybrid, implicit features.
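A minimal sketch (not the paper's engine) of just one sub-system named above, the matrix factorization module, trained by plain stochastic gradient descent on toy (user, hotel, rating) triples. The review-sentiment features, user clustering and multi-aspect ratings of the hybrid system are omitted, and all data and hyper-parameters are illustrative.

```python
import numpy as np

def matrix_factorization(ratings, k=2, epochs=200, lr=0.01, reg=0.05, seed=0):
    """SGD matrix factorization: learn user and hotel latent factors P, Q."""
    rng = np.random.default_rng(seed)
    users = max(u for u, _, _ in ratings) + 1
    items = max(i for _, i, _ in ratings) + 1
    P = rng.normal(scale=0.1, size=(users, k))          # user latent factors
    Q = rng.normal(scale=0.1, size=(items, k))          # hotel latent factors
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]                        # rating prediction error
            P[u] += lr * (err * Q[i] - reg * P[u])       # regularized gradient steps
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# Tiny toy rating triples (user id, hotel id, rating on a 1-5 scale).
ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 2), (2, 1, 5), (2, 2, 1)]
P, Q = matrix_factorization(ratings)
print("predicted rating of hotel 2 for user 0:", round(float(P[0] @ Q[2]), 2))
```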
5725 Modelling the Occurrence of Defects and Change Requests during User Acceptance Testing
Authors: Kevin McDaid, Simon P. Wilson
Abstract:
Software developed for a specific customer under contract typically undergoes a period of testing by the customer before acceptance. This is known as user acceptance testing, and the process can reveal both defects in the system and requests for changes to the product. This paper uses nonhomogeneous Poisson processes to model a real user acceptance data set from a recently developed system. In particular, a split Poisson process is shown to provide an excellent fit to the data. The paper explains how this model can be used to aid the allocation of resources through the accurate prediction of occurrences both during the acceptance testing phase and before this activity begins.
5724 Applying a Noise Reduction Method to Reveal Chaos in the River Flow Time Series
Authors: Mohammad H. Fattahi
Abstract:
Chaotic analysis has been performed on river flow time series before and after applying wavelet-based de-noising techniques, in order to investigate the effects of noise content on the chaotic nature of the flow series. In this study, 38 years of monthly runoff data from three gauging stations were used. The gauging stations were located in the Ghar-e-Aghaj river basin, Fars province, Iran. The noise level of the time series was estimated with the aid of the Gaussian kernel algorithm. This step was found to be crucial in preventing the removal of vital information, such as memory, correlation and trend, from the time series, in addition to the noise, during the de-noising process.
Keywords: Chaotic behavior, wavelet, noise reduction, river flow.
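A minimal sketch (not the paper's procedure) of wavelet threshold de-noising of a runoff series. The universal soft threshold and the median-absolute-deviation noise estimate used here stand in for the paper's Gaussian-kernel noise-level estimation; the wavelet, level and synthetic data are illustrative.

```python
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="db4", level=3):
    """Soft-threshold wavelet de-noising of a 1-D hydrological series."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise level from finest details
    thresh = sigma * np.sqrt(2 * np.log(len(series)))       # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(series)]

# Synthetic monthly runoff: seasonal cycle plus noise, 38 years of data.
t = np.arange(38 * 12)
runoff = 50 + 30 * np.sin(2 * np.pi * t / 12) + np.random.default_rng(0).normal(0, 5, t.size)
clean = wavelet_denoise(runoff)
print("rms of removed component:", round(float(np.sqrt(np.mean((runoff - clean) ** 2))), 2))
```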