Search results for: low data rate
8571 Investigation of Moisture Management Properties of Cotton and Blended Knitted Fabrics
Authors: N. S. Achour, M. Hamdaoui, S. Ben Nasrallah, A. Perwuelz
Abstract:
This work investigates the effect of knitted fabric characteristics on moisture management properties. The wetting and transport properties of single jersey, Rib 1&1 and English Rib fabrics made of cotton and blended cotton/polyester yarns were studied. The dynamic water sorption of the fabrics was investigated under identical isothermal conditions (20±2°C, 65±4% relative humidity) using the Moisture Management Tester (MMT), which quantitatively measures liquid moisture transfer in a fabric in multiple directions in one step: the absorption rate, the moisture absorbing time of the fabric's inner and outer surfaces, the one-way transport capability, the spreading/drying rate, and the speed of liquid moisture spreading on the fabric's inner and outer surfaces are measured, recorded and discussed. The results show that fabric composition and knit structure have a significant influence on these phenomena.
Keywords: Knitted fabric characteristics, moisture management properties, multidirectional transfer, Moisture Management Tester.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 3151
8570 Contribution of Vitaton (β-Carotene) to the Rearing Factors, Survival Rate and Visual Flesh Color of Rainbow Trout Fish in Comparison with Astaxanthin
Authors: M. Ghotbi, M. Ghotbi, Gh. Azari Takami
Abstract:
In this study, Vitaton (an organic supplement containing fermentative β-carotene) and synthetic astaxanthin (CAROPHYLL® Pink) were evaluated as pro-growth factors in the Rainbow trout diet. An 8-week feeding trial was conducted to determine the effects of Vitaton versus astaxanthin on rearing factors, survival rate and visual flesh color of Rainbow trout (Oncorhynchus mykiss) with an initial weight of 196±5. Four practical diets were formulated to contain 50 and 80 ppm of β-carotene or astaxanthin, and a control diet was prepared without any pigment. Each diet was fed to triplicate groups of fish reared in fresh water. Fish were fed twice daily. The water temperature fluctuated between 12 and 15 °C, and the dissolved oxygen content was between 7 and 7.5 mg/L during the experimental period. At the end of the experiment, growth and food utilization parameters and survival rate were unaffected by dietary treatment (p>0.05). There was also no significant difference in carcass yield between treatments (p>0.05), and no significant difference in visual flesh color (SalmoFan score) among fish fed the Vitaton-containing diets. By contrast, feeding diets containing 50 and 80 ppm of astaxanthin increased the SalmoFan score (flesh astaxanthin concentration) from <20 (<1 mg/kg) to 23.33 (2.03 mg/kg) and 27.67 (5.74 mg/kg), respectively. Ultimately, a significant difference was seen between the flesh carotenoid concentrations of fish fed the astaxanthin-containing diets and the control (p<0.05). Notably, only the raw fillet color of fish in the 80 ppm astaxanthin treatment came close to the color targets (SalmoFan scores) adopted for harvest-size fish.
Keywords: Astaxanthin, flesh color, Rainbow trout, Vitaton, β-carotene.
8569 A Redesigned Pedagogy in Introductory Programming Reduces Failure and Withdrawal Rates by Half
Authors: Said C. Fares, Mary A. Fares
Abstract:
It is well documented that introductory computer programming courses are difficult and that failure rates are high. The aim of this project was to reduce the high failure and withdrawal rates in learning to program. This paper presents a number of changes to module organization and the instructional delivery system in teaching CS1: daily out-of-class help sessions and tutoring services, interactive lectures and laboratories, online resources, and timely feedback. Five years of data on 563 students in 21 sections were collected and analyzed. The primary results show that the failure and withdrawal rates were cut by more than half. Student surveys indicate a positive evaluation of the modified instructional approach, overall satisfaction with the course and, consequently, higher success and retention rates.
Keywords: Failure Rate, Interactive Learning, Student engagement, CS1.
8568 XML Data Management in Compressed Relational Database
Authors: Hongzhi Wang, Jianzhong Li, Hong Gao
Abstract:
XML is an important standard for data exchange and representation. Using a mature relational database system to support XML data offers some advantages, but storing XML in a relational database introduces obvious redundancy that wastes disk space, bandwidth and disk I/O when querying XML data. For efficient storage and querying of XML, it is necessary to use compressed XML data in the relational database. In this paper, a compressed relational database technology supporting XML data is presented. The original relational storage structure is adapted to XPath query processing, and the compression method preserves this feature. Besides traditional relational database techniques, additional query processing technologies for compressed relations and for XML-specific structures are presented, including technologies for XQuery processing in a compressed relational database.
Keywords: XML, compression, query processing.
8567 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data
Authors: P. Kaladevi, N. Giridharan
Abstract:
The system for analyzing and eliciting public grievances serves to receive and process all sorts of complaints from the public and respond to users. As the number of complaints grows, the complaint data become big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of caching is applied in the system to provide immediate response and timely action using big data analytics; the cache-enabled design shortens the system's response time. The unstructured data provided by users are handled efficiently through the MapReduce algorithm, and complaints are processed in the order of the hierarchy of authority. The drawbacks of the traditional database used in the existing system are addressed by using a cache-enabled Hadoop Distributed File System. Because MapReduce jobs can leak sensitive data through the computation process, we propose adding noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed in ample time, it is automatically forwarded to the higher authority, ensuring that processing is assured. A copy of the filed complaint is sent as a digitally signed PDF document to the user's e-mail address, serving as proof. The system's reports serve as essential data when making important decisions based on legislation.
Keywords: Big data, Hadoop, HDFS, caching, MapReduce, web personalization, e-governance.
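The two core mechanisms in the abstract above, map/reduce aggregation of complaint records and noise added to the reduce output, can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the department names, record layout and the choice of Laplace noise are assumptions.

```python
import random
from collections import defaultdict

def map_phase(complaints):
    # Map: emit a (department, 1) pair for each complaint record.
    for dept, text in complaints:
        yield dept, 1

def reduce_phase(pairs, epsilon=1.0):
    # Reduce: aggregate counts per department, then perturb each count
    # with Laplace(0, 1/epsilon) noise so exact totals do not signal
    # the presence of sensitive records (the noise-on-reduce-output idea).
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    noisy = {}
    for key, count in counts.items():
        # Difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon).
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        noisy[key] = count + noise
    return noisy

complaints = [("water", "leaking pipe"), ("roads", "pothole"),
              ("water", "no supply"), ("water", "billing error")]
result = reduce_phase(map_phase(complaints), epsilon=2.0)
```

In a real deployment the map and reduce functions would run as Hadoop tasks over HDFS blocks; the in-process version above only shows the data flow.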
8566 Synthesis of Silk Fibroin Fiber for Indoor Air Particulate Removal
Authors: Janjira Triped, Wipada Sanongraj, Bovornlak Oonkhanond, Sompop Sanongraj
Abstract:
The main objective of this research is to synthesize silk fibroin fiber for indoor air particulate removal. Silk cocoons were de-gummed using 0.5 wt% Na2CO3 alkaline solution at 90 °C for 60 min, washed with distilled water, and dried at 80 °C for 3 h in a vacuum oven. Two sets of experiments were conducted to investigate the impact of the initial particulate matter (PM) concentration and of the air flow rate on the removal efficiency. Rice bran collected from a local rice mill in Ubonratchathani province was used as the indoor air contaminant in this work. The morphology and physical properties of the silk fibroin (SF) fiber were measured, and SEM revealed the deposition of PM on the used fiber. PM removal efficiencies of 72.29±3.03% for PM10 and 39.33±1.99% for PM2.5 were obtained at initial concentrations of 0.040 mg/m3 (PM10) and 0.020 mg/m3 (PM2.5) and an air flow rate of 5 L/min.
Keywords: Indoor air, Particulate matter, Scanning electron microscope (SEM), Silk fibroin fiber.
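The removal efficiencies reported above follow the standard inlet/outlet definition. A minimal sketch of the arithmetic; the outlet concentration used here is a hypothetical value chosen only to reproduce a figure close to the reported PM10 efficiency:

```python
def removal_efficiency(c_in, c_out):
    # Percent removal efficiency from inlet and outlet PM
    # concentrations (both in mg/m3).
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical outlet value at the paper's inlet concentration of 0.040 mg/m3,
# chosen to land near the reported ~72% PM10 efficiency.
eff_pm10 = removal_efficiency(0.040, 0.0111)
```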
8565 A Perceptually Optimized Foveation Based Wavelet Embedded Zero Tree Image Coding
Authors: A. Bajit, M. Nahid, A. Tamtaoui, E. H. Bouyakhf
Abstract:
In this paper, we propose a Perceptually Optimized Foveation-based Embedded ZeroTree Image Coder (POEFIC) that applies perceptual weighting to wavelet coefficients before the SPIHT encoding algorithm, in order to reach a targeted bit rate with improved perceptual quality given a fixation point that determines the region of interest (ROI). The paper also introduces a new objective quality metric based on a psychovisual model integrating properties of the human visual system (HVS), which plays an important role in our POEFIC quality assessment. Our POEFIC coder is based on a vision model that incorporates various masking effects of HVS perception: it weights the wavelet coefficients according to that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) foveation masking, to remove or reduce considerable high frequencies from peripheral regions; 2) luminance and contrast masking; and 3) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting. The new perceptually optimized codec has the same complexity as the original SPIHT technique, yet the experimental results show that our coder achieves very good performance in terms of quality measurement.
Keywords: DWT, linear-phase 9/7 filter, Foveation Filtering, CSF implementation approaches, 9/7 Wavelet JND Thresholds and Wavelet Error Sensitivity WES, Luminance and Contrast masking, standard SPIHT, Objective Quality Measure, Probability Score PS.
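The CSF-based subband weighting mentioned above can be illustrated with one widely used CSF model. The abstract does not specify which CSF formula the coder implements, so the Mannos-Sakrison model and the nominal subband centre frequencies below are assumptions for illustration only:

```python
import math

def csf_mannos_sakrison(f):
    # Mannos-Sakrison contrast sensitivity at spatial frequency f
    # (cycles/degree); sensitivity peaks at mid frequencies (~8 cpd).
    return 2.6 * (0.0192 + 0.114 * f) * math.exp(-((0.114 * f) ** 1.1))

# Relative perceptual weights for a few assumed subband centre frequencies,
# from the coarsest wavelet level to the finest.
freqs = [1.0, 4.0, 8.0, 16.0, 32.0]
weights = [csf_mannos_sakrison(f) for f in freqs]
```

A coder would scale each subband's coefficients by its weight before SPIHT-style bit-plane coding, so bits are spent first where the eye is most sensitive.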
8564 Spatial Mapping of Dengue Incidence: A Case Study in Hulu Langat District, Selangor, Malaysia
Authors: Er, A. C., Rosli, M. H., Asmahani A., Mohamad Naim M. R., Harsuzilawati M.
Abstract:
Dengue is a mosquito-borne infection that has peaked at an alarming rate in recent decades. It is found in tropical and sub-tropical climates. In Malaysia, dengue has been declared a national health threat to the public. This study aimed to map the spatial distribution of dengue cases in the district of Hulu Langat, Selangor via a combination of Geographic Information System (GIS) and spatial statistics tools. Data related to dengue were gathered from various government health agencies, and the locations of dengue cases were geocoded using a handheld Trimble Juno SB GPS. A total of 197 dengue cases occurring in 2003 were used in this study. Those data were then aggregated to the sub-district level and converted into GIS format. The study also used population and demographic data as well as the boundary of Hulu Langat. To assess the spatial distribution of dengue cases, three spatial statistics methods (Moran's I, average nearest neighborhood (ANN) and kernel density estimation) were applied together with spatial analysis in the GIS environment. These three indices were used to analyze the spatial distribution and average distance of dengue incidence and to locate hot spots of dengue cases. The results indicated that the dengue cases were clustered (p < 0.01) when analyzed using Moran's I, with a z-score of 5.03. The ANN analysis yielded an average nearest neighbor ratio of 0.518755, which is less than 1 (p < 0.0001), so the pattern of dengue cases in Hulu Langat district can be expected to be clustered. The z-score for dengue incidence within the district is -13.0525 (p < 0.0001). It was also found that significant spatial autocorrelation of dengue incidence occurs at an average distance of 380.81 meters (p < 0.0001). Several locations, especially residential areas, were also identified as hot spots of dengue cases in the district.
Keywords: Dengue, geographic information system (GIS), spatial analysis, spatial statistics
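The average nearest neighbor ratio used above compares the observed mean nearest-neighbor distance to the mean expected under complete spatial randomness, 0.5/sqrt(n/A) for n points in an area A; a ratio below 1 indicates clustering. A minimal sketch on hypothetical case coordinates (not the study's data):

```python
import math

def average_nearest_neighbour_ratio(points, area):
    # Observed mean distance from each point to its nearest neighbour,
    # divided by the expected mean distance 0.5 / sqrt(n / area) under
    # complete spatial randomness.  Ratio < 1 suggests clustering.
    n = len(points)
    nn_dists = []
    for i, (xi, yi) in enumerate(points):
        best = min(math.hypot(xi - xj, yi - yj)
                   for j, (xj, yj) in enumerate(points) if j != i)
        nn_dists.append(best)
    observed = sum(nn_dists) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# Two tight groups of hypothetical case locations in a 100 x 100 unit area.
cases = [(10, 10), (11, 10), (10, 11), (80, 80), (81, 80), (80, 81)]
ratio = average_nearest_neighbour_ratio(cases, area=100 * 100)
```

GIS packages additionally attach a z-score and p-value to this ratio, as reported in the abstract.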
8563 An Evaluation of Sputum Smear Conversion and Haematological Parameter Alteration in Early Detection Period of New Pulmonary Tuberculosis (PTB) Patients
Authors: Tasnuva Tamanna, Sanjida Halim Topa
Abstract:
Sputum smear conversion after one month of antituberculosis therapy in new smear-positive pulmonary tuberculosis (PTB+) patients is a vital indicator of treatment success. The objective of this study was to determine the rate of sputum smear conversion in new PTB+ patients after one month under treatment at the National Institute of Diseases of the Chest and Hospital (NIDCH). Sputum smear conversion was analyzed by clinical re-examination with a sputum smear microscopic test after one month. Socio-demographic and hematological parameters were evaluated to assess their correlation with disease status. Among all enrolled patients, only 33.33% were available for follow-up diagnosis, and of them only 42.86% turned smear-negative, a consequence probably due to non-adherence to proper disease management. Low haemoglobin and packed cell volume levels were reported in 66.67% and 78.78% of patients respectively, whereas 80% and 93.33% of patients showed elevated platelet counts and erythrocyte sedimentation rates, correspondingly.
Keywords: Followed-up patients, PTB+ patients, sputum smear conversion, sputum smear microscopic test.
8562 Blood Lactate, Heart Rate, and Rating of Perceived Exertion in Collegiate Sprint, Middle Distance, and Long Distance Runners after 400 and 1600 Meter Runs
Authors: Taylor J. Canfield, Kathe A. Gabel
Abstract:
The aim of this study was to investigate the effect of running classification (sprint, middle, and long distance) and two distances on blood lactate (BLa), heart rate (HR), and rating of perceived exertion (RPE, Borg scale) in collegiate athletes. On different days, runners (n = 15) ran 400 m and 1600 m at a five-minute-mile pace, followed by a two-minute 6 mph jog and a two-minute 3 mph walk as part of the cool-down. BLa, HR, and RPE were taken at baseline, post-run, and at 2 and 4 min of recovery. The middle and long distance runners exhibited lower BLa concentrations than sprint runners after two min of recovery post 400 m runs, and immediately after and at the two and four min recovery points post 1600 m runs. Compared to sprint runners, distance runners may have exhibited the ability to clear BLa more quickly, particularly after running 1600 m.
Keywords: Blood lactate, HR, RPE, running.
8561 Evaluation of Top-down and Bottom-up Leadership Development Programs in a Finnish Company
Authors: Kati Skarp, Keijo Varis, Juha Kettunen
Abstract:
The purpose of this paper is to examine and evaluate top-down and bottom-up leadership development programs focused on human capital that improve the performance of a company. This study reports on an external top-down leadership development program supported by a consulting company and on internal participatory action research in the bottom-up program. During the top-down program, the sickness rate and the lost-time incident failure rate decreased and the ideas produced for cost savings improved, leading to increased earnings. The estimated cost-savings potential of the bottom-up program was 3.8 million euros, based on savings in meeting habits, maintenance practices and the way of working in production. The results of this study are useful for those who plan and evaluate leadership development and human capital productivity consultation programs to improve the performance of a company.
Keywords: Leadership, development, human resources, company, indicators, evaluation.
8560 Improved K-Modes for Categorical Clustering Using Weighted Dissimilarity Measure
Authors: S. Aranganayagi, K. Thangavel
Abstract:
K-Modes is an extension of the K-Means clustering algorithm, developed to cluster categorical data, with the mean replaced by the mode. The similarity measure proposed by Huang is simple matching or mismatching. Because the weights of attribute values contribute much to clustering, in this paper we propose a new weighted dissimilarity measure for K-Modes, based on the ratio of the frequency of attribute values in the cluster to that in the data set. The new weighted measure is evaluated on data sets obtained from the UCI data repository. The results are compared with K-Modes and K-representative, and show that the new measure generates clusters with high purity.
Keywords: Clustering, categorical data, K-Modes, weighted dissimilarity measure
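The frequency-ratio idea above can be sketched concretely. The abstract does not give the exact formula, so the matched-attribute cost below, 1 - p_c / (p_c + p_d) where p_c and p_d are the value's relative frequencies in the cluster and in the whole data set, is an assumed instantiation: a value concentrated in the cluster makes the match cheap, while a value no more common in the cluster than overall costs 0.5.

```python
def weighted_dissimilarity(record, mode, cluster, dataset):
    # Mismatched attribute values cost 1 (Huang's simple matching).
    # Matched values cost 1 - p_c / (p_c + p_d), an assumed form of
    # the cluster-vs-dataset frequency-ratio weighting.
    cost = 0.0
    for j, (x, m) in enumerate(zip(record, mode)):
        if x != m:
            cost += 1.0
        else:
            p_c = sum(r[j] == x for r in cluster) / len(cluster)
            p_d = sum(r[j] == x for r in dataset) / len(dataset)
            cost += 1.0 - p_c / (p_c + p_d)
    return cost

data = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
cluster = data[:2]  # hypothetical current cluster
d = weighted_dissimilarity(("a", "x"), ("a", "x"), cluster, data)
```

Here the first attribute ("a" fills the cluster but only half the data set) costs 1/3, while the second ("x" is equally common in both) costs 1/2, so an exact-mode match is no longer free as in plain K-Modes.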
8559 Experimental Analysis and Optimization of Process Parameters in Plasma Arc Cutting Machine of EN-45A Material Using Taguchi and ANOVA Method
Authors: Sahil Sharma, Mukesh Gupta, Raj Kumar, N. S. Bindra
Abstract:
This paper presents an experimental investigation of the optimization and the effect of cutting parameters on Material Removal Rate (MRR) in Plasma Arc Cutting (PAC) of EN-45A material using the Taguchi L16 orthogonal array method. Four process variables, viz. cutting speed, current, stand-off distance and plasma gas pressure, were considered in this experimental work. Analysis of variance (ANOVA) was performed to obtain the percentage contribution of each process parameter to the response variable, i.e. MRR. Based on the ANOVA, the cutting speed, current and plasma gas pressure are the major influencing factors that affect the response variable. A confirmation test based on the optimal setting shows good agreement with the predicted values.
Keywords: Analysis of variance, material removal rate, plasma arc cutting, Taguchi method.
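In Taguchi analysis of a larger-the-better response such as MRR, each trial's replications are condensed into a signal-to-noise ratio before ranking factor levels. A minimal sketch; the MRR readings below are hypothetical, not the paper's measurements:

```python
import math

def sn_larger_the_better(values):
    # Taguchi S/N ratio for a larger-the-better response:
    # SN = -10 * log10( mean(1 / y^2) ).  Higher is better.
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in values) / len(values))

# Hypothetical MRR readings (e.g. g/min) for one L16 trial's replications.
sn = sn_larger_the_better([2.1, 2.4, 2.2])
```

Averaging these S/N values per factor level gives the response table from which the optimal setting is read off, and ANOVA then apportions the variation among the factors.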
8558 Mobile Phone as a Tool for Data Collection in Field Research
Authors: Sandro Mourão, Karla Okada
Abstract:
The need for accurate and timely field data is shared among organizations engaged in fundamentally different activities, from public services to commercial operations. There are three major components in the process of qualitative research: data collection, interpretation and organization of data, and the analytic process. Representative technological advancements have been made in mobile devices (mobile phones, PDAs, tablets, laptops, etc.), resources that can potentially be applied to the data collection activity of field research in order to improve the process. This paper presents and discusses the main features of a mobile phone based solution for field data collection, composed of three modules: a survey editor, a server web application and a client mobile application. The data gathering process begins with the survey creation module, which enables the production of tailored questionnaires. The field workforce receives the questionnaires on their mobile phones, collects the interview responses and sends them back to a server for immediate analysis.
Keywords: Data gathering, field research, mobile phone, survey.
8557 Hybrid Association Control Scheme and Load Balancing in Wireless LANs
Authors: Chutima Prommak, Airisa Jantaweetip
Abstract:
This paper presents a hybrid association control scheme that maintains load balancing among access points in wireless LANs while satisfying the quality-of-service requirements of multimedia traffic applications. The proposed model is described mathematically as a linear programming model. A simulation study and analysis were conducted to demonstrate the performance of the proposed hybrid load balancing and association control scheme. The simulation results show that the proposed scheme outperforms the other schemes in terms of the blocking percentage and the quality of the data transfer rate provided to multimedia and real-time applications.
Keywords: Association control, load balancing, wireless LANs.
8556 The Relationship between Fatigue Crack Growth and Residual Stress in Rails
Authors: F. Husem, M. E. Turan, Y. Sun, H. Ahlatci, I. Tozlu
Abstract:
Residual stress and fatigue crack growth rates are important in determining the mechanical behavior of rails. This study aims to relate residual stress to fatigue crack growth in rails. For this purpose, three R260-grade rails (0.6-0.8% C, 0.6-1.25% Mn) were chosen. The residual stress of the samples was measured by the cutting method described in the railway standard. The samples were then machined for fatigue crack growth testing, and the analysis was completed according to the ASTM E647 standard, which specifies the test parameters for rails. Microstructures were characterized by light optical microscopy (LOM). The results showed that residual stress changes with fatigue crack growth rate: the sample with the highest residual stress exhibits the highest crack growth rate, and a pearlitic structure is clearly seen in the microstructure of all samples.
Keywords: Residual stress, fatigue crack growth, R260, LOM, ASTM E647.
8555 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis
Authors: N. R. N. Idris, S. Baharom
Abstract:
A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels; in this situation, both should be utilised to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the level of data on overall meta-analysis estimates based on IPD only, AD only, and the combination of IPD and AD (mixed data, MD) under different study scenarios. The percentage relative bias (PRB), root mean square error (RMSE) and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD makes no significant difference to the accuracy of the estimates. Additionally, combining IPD and AD moderates the bias of the treatment-effect estimates, as the IPD tends to overestimate the treatment effects while the AD tends to produce underestimates. These results may provide some guidance in deciding whether significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
Keywords: Aggregate data, combined-level data, individual patient data, meta-analysis.
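The two accuracy measures named above have standard definitions that are easy to state in code. A minimal sketch; the estimates and the true effect of 0.5 are hypothetical values, not the study's simulation output:

```python
import math

def percentage_relative_bias(estimates, true_value):
    # PRB: mean estimation error as a percentage of the true value.
    mean_est = sum(estimates) / len(estimates)
    return 100.0 * (mean_est - true_value) / true_value

def rmse(estimates, true_value):
    # Root mean square error of the estimates around the true value.
    return math.sqrt(sum((e - true_value) ** 2 for e in estimates)
                     / len(estimates))

# Hypothetical pooled treatment-effect estimates from simulated
# meta-analyses, against a known true effect of 0.5.
ests = [0.52, 0.47, 0.55, 0.50, 0.46]
prb = percentage_relative_bias(ests, 0.5)
err = rmse(ests, 0.5)
```

In a simulation study these are computed over many replications per scenario (IPD-only, AD-only, MD) and compared across scenarios.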
8554 Fuzzy Relatives of the CLARANS Algorithm With Application to Text Clustering
Authors: Mohamed A. Mahfouz, M. A. Ismail
Abstract:
This paper introduces new algorithms, a fuzzy relative of the CLARANS algorithm (FCLARANS) and fuzzy c-medoids based on randomized search (FCMRANS), for fuzzy clustering of relational data. Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized in each iteration by recomputing new medoids given the current memberships, FCLARANS minimizes the same objective function by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise, because outliers may join the computation of the medoids, while the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS, the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization procedure used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results on different samples of the Reuters-21578 and Newsgroups (20NG) corpora and on generated data sets with noise show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient than RFCMdd, and their outputs are almost the same as those of RFCMdd in terms of classification rate.
Keywords: Data mining, fuzzy clustering, relational clustering, medoid-based clustering, cluster analysis, unsupervised learning.
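The CLARANS-style randomized search at the heart of the algorithms above can be sketched in its crisp (non-fuzzy) form: repeatedly propose swapping one current medoid for a random non-medoid and keep the swap if it lowers the clustering cost. This is an illustrative sketch of the randomized-search step only; FCLARANS additionally maintains fuzzy memberships, which are omitted here.

```python
import random

def cost(medoids, points, dist):
    # Total distance from each point to its closest medoid.
    return sum(min(dist(p, m) for m in medoids) for p in points)

def randomized_medoid_search(points, k, dist, max_neighbours=50, seed=0):
    # CLARANS-style local search: accept a random medoid/non-medoid
    # swap whenever it strictly decreases the clustering cost.
    rng = random.Random(seed)
    medoids = rng.sample(points, k)
    best = cost(medoids, points, dist)
    for _ in range(max_neighbours):
        i = rng.randrange(k)
        candidate = rng.choice([p for p in points if p not in medoids])
        trial = medoids[:i] + [candidate] + medoids[i + 1:]
        c = cost(trial, points, dist)
        if c < best:
            medoids, best = trial, c
    return medoids, best

# Two well-separated toy groups; the search should place one medoid in each.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
euclid = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
meds, total = randomized_medoid_search(pts, 2, euclid)
```

Because medoids are always existing data points chosen by where most of a cluster lies, an outlier cannot drag the cluster centre the way averaging can.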
8553 Multivariate Assessment of Mathematics Test Scores of Students in Qatar
Authors: Ali Rashash Alzahrani, Elizabeth Stojanovski
Abstract:
Data on various aspects of education are collected regularly at the institutional and government levels. In Australia, for example, students at various levels of schooling undertake examinations in numeracy and literacy as part of NAPLAN testing, enabling longitudinal assessment of the data as well as comparisons between schools and states within Australia. Another source of educational data collected internationally is the PISA study, which gathers data from several countries when students are approximately 15 years of age and enables comparison and ranking of countries based on performance in standardised tests of science, mathematics and English. As well as student and school outcomes based on the tests taken as part of the PISA study, a wealth of other data is collected, including parental demographics and data related to the teaching strategies used by educators. Overall, an abundance of educational data is available that has the potential to be used to improve educational attainment and the teaching of content in order to improve learning outcomes. A multivariate assessment of such data enables multiple variables to be considered simultaneously and is used in the present study to help develop profiles of students based on performance in mathematics, using data obtained from the PISA study.
Keywords: Cluster analysis, education, mathematics, profiles.
8552 DIVAD: A Dynamic and Interactive Visual Analytical Dashboard for Exploring and Analyzing Transport Data
Authors: Tin Seong Kam, Ketan Barshikar, Shaun Tan
Abstract:
Advances in location-based data collection technologies such as GPS and RFID, together with the rapid reduction of their costs, provide us with a huge and continuously increasing amount of data about the movement of vehicles, people and goods in urban areas. This explosive growth of geospatially referenced data has far outpaced planners' ability to utilize and transform the data into insightful information, creating an adverse impact on the return on the investment made to collect and manage the data. Addressing this pressing need, we designed and developed DIVAD, a dynamic and interactive visual analytics dashboard that allows city planners to explore and analyze a city's transportation data to gain valuable insights about its traffic flow and transportation requirements. We demonstrate the potential of DIVAD through the use of interactive choropleth and hexagon-binning maps to explore and analyze large taxi-transportation data of Singapore for different geographic and time zones.
Keywords: Geographic Information System (GIS), movement data, geovisual analytics, urban planning.
8551 Gene Expression Data Classification Using Discriminatively Regularized Sparse Subspace Learning
Authors: Chunming Xu
Abstract:
Sparse representation, which can represent high-dimensional data effectively, has been used successfully in computer vision and pattern recognition problems. However, it does not consider the label information of the data samples. To overcome this limitation, we develop a novel dimensionality reduction algorithm, discriminatively regularized sparse subspace learning (DR-SSL). The proposed DR-SSL algorithm not only makes use of the sparse representation to model the data, but also effectively employs the label information to guide the dimensionality reduction procedure. In addition, the presented algorithm can effectively handle the out-of-sample problem. Experiments on gene expression data sets show that the proposed algorithm is an effective tool for dimensionality reduction and gene expression data classification.
Keywords: Sparse representation, dimensionality reduction, label information, sparse subspace learning, gene expression data classification.
8550 Determining Cluster Boundaries Using Particle Swarm Optimization
Authors: Anurag Sharma, Christian W. Omlin
Abstract:
The self-organizing map (SOM) is a well-known data reduction technique used in data mining. Data visualization can reveal structure in data sets that is otherwise hard to detect from raw data alone; however, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters among the code vectors found by SOMs, but they generally do not take into account the distribution of the code vectors, which may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of a generic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOMs. The application of our method to unlabeled call data from a mobile phone operator demonstrates its feasibility. The PSO algorithm utilizes the U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method correspond well to boundary detection through visual inspection of the code vectors and to the k-means algorithm.
Keywords: Particle swarm optimization, self-organizing maps, clustering, data mining.
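The generic PSO engine referred to above follows a standard pattern: each particle remembers its own best position, the swarm tracks a global best, and velocities blend inertia with cognitive and social pulls. A self-contained sketch of that engine, validated here on the sphere function rather than a U-matrix objective (the inertia and acceleration constants are conventional choices, not values from the paper):

```python
import random

def pso_minimize(f, dim, bounds, n_particles=20, iters=100, seed=1):
    # Standard global-best PSO minimizing f over [lo, hi]^dim.
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sanity check on the sphere function, whose minimum is 0 at the origin.
best, best_val = pso_minimize(lambda x: sum(v * v for v in x), dim=2, bounds=(-5, 5))
```

For boundary detection, f would instead score candidate boundaries against the SOM's U-matrix, so particles settle on the high-dissimilarity ridges separating clusters.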
8549 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when dealing with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, big data cannot be analyzed efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.
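The core operation of CART that the authors extend is an exhaustive search for the impurity-minimizing split. A minimal single-feature sketch using Gini impurity (the serial step whose parallelization the paper targets):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(xs, ys):
    """CART-style exhaustive split search on one numeric feature:
    return the threshold minimizing the weighted Gini impurity."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = [0, 0, 0, 1, 1, 1]
t, score = best_split(xs, ys)
print(t, score)   # splitting at 3.0 separates the classes perfectly
```

Because every candidate threshold is scored independently, this inner loop is a natural target for the distribution of computation that the abstract identifies as missing from classical CART.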
8548 Production Planning for Animal Food Industry under Demand Uncertainty
Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut
Abstract:
This research investigates the distribution of demand for animal food and the optimum amount of food production at minimum cost. The data consist of customer purchase orders for laying-hen food, the price of laying-hen food, the cost per unit of food inventory, and the costs incurred when the food is out of stock, such as fines, overtime, and urgent purchases of material. They were collected from January 1990 to December 2013 from a factory in Nakhonratchasima province. The collected data are analyzed to explore the distribution of the monthly food demand for laying hens and the rate of inventory per unit. The results are used in a stochastic linear programming model for aggregate planning, from which the optimum production, or minimum cost, is obtained. Programming algorithms in MATLAB and its linprog tool are used to get the solution. The distribution of the food demand for laying hens and random numbers are used in the model. The study shows that the monthly food demand for laying hens follows a normal distribution and yields the monthly average production amounts (unit: 30 kg) for January through December. The minimum average total cost for 12 months is 62,329,181.77 Baht; the production plan can therefore reduce the cost by 14.64% compared with the actual cost.
Keywords: Animal food, Stochastic linear programming, Production planning, Demand Uncertainty.
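The cost trade-off such a model captures can be illustrated with a small scenario-based (newsvendor-style) sketch: leftover stock incurs a holding cost, unmet demand incurs a shortage cost (fines, urgent purchases), and the production level is chosen to minimize expected cost. The demand scenarios and cost figures below are hypothetical, not the paper's data:

```python
def expected_cost(produce, scenarios, hold_cost, short_cost):
    """Expected one-period cost of a production level under demand
    scenarios given as (demand, probability) pairs."""
    total = 0.0
    for demand, prob in scenarios:
        leftover = max(produce - demand, 0)   # inventory carried over
        shortage = max(demand - produce, 0)   # fines / urgent purchases
        total += prob * (hold_cost * leftover + short_cost * shortage)
    return total

# Hypothetical discretized monthly demand (in 30 kg units).
scenarios = [(80, 0.2), (100, 0.5), (120, 0.3)]
hold_cost, short_cost = 1.0, 4.0     # running short costs more than storing
best = min(range(60, 141),
           key=lambda q: expected_cost(q, scenarios, hold_cost, short_cost))
print(best, expected_cost(best, scenarios, hold_cost, short_cost))
```

Because shortages here cost four times as much as holding, the optimum covers the high-demand scenario; a full stochastic linear program generalizes this search to twelve coupled months with capacity constraints.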
8547 Enriching Egg Yolk with Carotenoids and Phenols
Authors: Amar Benakmoum, Rosa Larid, Sofiane Zidani
Abstract:
Dried tomato peel (DTP) was tested in vivo (n=10) in 42-week-old laying hens at rates of 0, 40, 70, 100 and 130 g/kg DM feed. Laying hens were fed in groups at 120 g DM/day/animal for 26 days. After 21 days, feed intake was not affected by DTP incorporation (97% of the offered feed in the five groups). The laying rate after DTP incorporation at 4 and 10% was not significantly different from the control group. Egg yolk resulting from DTP-enriched diets contained lower amounts of cholesterol (14 to 17 mg/g) and triglyceride (188 mg/g) compared to the control group (22 and 241 mg/g, respectively) (P<0.0001). After DTP-enriched diets, the total phenol content was 2.0- to 3.6-fold higher, β-carotene was 1.7- to 2.7-fold higher, and lycopene increased between 26.5 and 42.8 μg/g compared to the control (P<0.0001). The optimal incorporation rate was 7% DTP.
Keywords: Carotenoid, dried tomato peel, lycopene, laying hens, phenols.
8546 In vitro Studies of Mucoadhesiveness and Release of Nicotinamide Oral Gels Prepared from Bioadhesive Polymers
Authors: Sarunyoo Songkro, Naranut Rajatasereekul, Nipapat Cheewasrirungrueng
Abstract:
The aim of the present study was to evaluate the mucoadhesion and the release of nicotinamide gel formulations using in vitro methods. An agar plate technique was used to investigate the adhesiveness of the gels, whereas a diffusion apparatus was employed to determine the release of nicotinamide from the gels. In this respect, 10% w/w nicotinamide gels containing bioadhesive polymers: Carbopol 934P (0.5-2% w/w), hydroxypropylmethyl cellulose (HPMC) (4-10% w/w), sodium carboxymethyl cellulose (SCMC) (4-6% w/w) and methylcellulose 4000 (MC) (3-5% w/w) were prepared. The gel formulations had pH values in the range of 7.14 - 8.17, which were considered appropriate for oral mucosa application. In general, the rank order of pH values appeared to be SCMC > MC 4000 > HPMC > Carbopol 934P. The types and concentrations of polymers used somewhat affected the adhesiveness. It was found that the anionic polymers (Carbopol 934P and SCMC) adhered more firmly to the agar plate than the neutral polymers (HPMC and MC 4000). The formulation containing 0.5% Carbopol 934P (F1) showed the highest release rate. With the exception of formulation F1, the neutral polymers tended to give higher release rates than the anionic polymers. For oral tissue treatment, the optimum has to balance the residence time (adhesiveness) of the formulations against the release rate of the drug. The formulations containing the anionic polymers Carbopol 934P or SCMC possessed suitable physical properties (appearance, pH and viscosity). In addition, for the anionic polymer formulations, justifiable mucoadhesive properties and reasonable release rates of nicotinamide were achieved. Accordingly, these gel formulations may be applied for the treatment of oral mucosal lesions.
Keywords: Nicotinamide, bioadhesive polymer, mucoadhesiveness, release rate, gel.
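Release profiles of the kind measured here are often summarized with the Higuchi model, Q = k·√t. A minimal least-squares fit of the rate constant on hypothetical data (the abstract does not state which release model, if any, the authors used):

```python
import math

def higuchi_k(times, released):
    """Least-squares fit of Q = k*sqrt(t) through the origin.
    Minimizing sum (Q_i - k*sqrt(t_i))^2 gives
    k = sum(Q_i * sqrt(t_i)) / sum(t_i)."""
    num = sum(q * math.sqrt(t) for t, q in zip(times, released))
    return num / sum(times)

# Hypothetical cumulative release: % nicotinamide released at t minutes.
times = [15, 30, 60, 120, 240]
released = [7.7, 11.0, 15.5, 21.9, 31.0]    # roughly follows 2*sqrt(t)
k = higuchi_k(times, released)
print(round(k, 2))
```

A larger fitted k corresponds to a faster-releasing gel, giving one number per formulation with which release rates (e.g. F1 versus the SCMC gels) could be compared.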
8545 An Efficient Separation for Convolutive Mixtures
Authors: Salah Al-Din I. Badran, Samad Ahmadi, Dylan Menzies, Ismail Shahin
Abstract:
This paper describes a new, efficient blind source separation method that uses a non-uniform filter bank and a new structure with different sub-bands. The method provides reduced permutation and increased convergence speed compared to the full-band algorithm. Recently, several structures have been suggested to deal with two problems: reducing permutation and increasing the convergence speed of the adaptive algorithm for correlated input signals. The permutation problem is avoided by using adaptive filters of orders lower than the full-band adaptive filter, which operate at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each sub-band than the input signal at full band, which can promote better rates of convergence.
Keywords: Blind source separation (BSS), estimates, full-band, mixtures, sub-band.
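The simplest illustration of the analysis/synthesis filter-bank idea described above is a one-stage Haar split, which halves the sampling rate of each sub-band yet reconstructs the signal perfectly; the paper's non-uniform bank is more elaborate, but the principle is the same:

```python
import numpy as np

def haar_analysis(x):
    """One-stage Haar analysis bank: split x into a lowpass and a
    highpass sub-band, each at half the input sampling rate."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = x[:-1]                          # even length for pairing
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def haar_synthesis(low, high):
    """Perfect-reconstruction synthesis bank matching haar_analysis."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

x = np.sin(np.linspace(0, 8 * np.pi, 64))
low, high = haar_analysis(x)
assert np.allclose(haar_synthesis(low, high), x)   # lossless split
```

Each sub-band is shorter and spectrally narrower than the full-band signal, which is why the lower-order adaptive filters mentioned in the abstract can run on them at a reduced sampling rate.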
8544 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain
Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami
Abstract:
To promote Industry 4.0, Society 5.0 and the like, it is important to connect and share data so that every member can trust it. Blockchain (BC) technology is currently attracting attention as the most advanced tool for this and has been used in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies on the supply chain of the manufacturing industry, which handle sensitive data such as product quality, manufacturing conditions, etc. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first is that manufacturing information is top secret and a source of profit for companies; it is difficult to disclose data even between companies with transactions in the supply chain, and a blockchain mechanism such as Bitcoin, which uses Public Key Infrastructure (PKI), requires plaintext to be shared between companies in order to verify the identity of the company that sent the data. The second is that the merits (scenarios) of data collaboration between companies are not concretely specified in the industrial supply chain. To address these problems, this paper proposes a Business-to-Business (B2B) collaboration system using homomorphic encryption and BC technology. Using the proposed system, each company on the supply chain can exchange confidential information as encrypted data and utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which have been difficult to share because they are top secret. In this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can track changes in quality while hiding the numerical quality values.
Keywords: Business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption.
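The comparison-on-encrypted-data idea rests on additive homomorphism. A toy Paillier sketch, with tiny primes and for illustration only (the paper's actual protocol and parameters are not specified here): multiplying two ciphertexts yields a ciphertext of the sum, so a partner can aggregate or compare quality values without ever seeing them.

```python
import math, random

# Toy Paillier keypair (insecure, illustrative key size).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                                            # standard generator choice
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)    # lcm(p-1, q-1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)          # L(g^lam mod n^2)^-1 mod n

def encrypt(m, rnd=random.Random(7)):
    """Encrypt m < n with fresh randomness r coprime to n."""
    r = rnd.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = rnd.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Decrypt via L(c^lam mod n^2) * mu mod n, where L(x) = (x-1)/n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

a, b = 1200, 345                       # two confidential quality readings
c = (encrypt(a) * encrypt(b)) % n2     # homomorphic addition on ciphertexts
print(decrypt(c))                      # a + b = 1545
```

Comparing encrypted values, as in the paper's quality-change protocol, requires additional machinery on top of this additive property, but the homomorphism is the enabling primitive.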
8543 Emerging Wireless Standards - WiFi, ZigBee and WiMAX
Authors: Bhavneet Sidhu, Hardeep Singh, Amit Chhabra
Abstract:
The world of wireless telecommunications is rapidly evolving. Technologies under research and development promise to deliver more services to more users in less time. This paper presents the emerging technologies helping wireless systems grow from where we are today into our visions of the future. It covers the applications and characteristics of emerging wireless technologies: Wireless Local Area Networks (WiFi, 802.11n), Wireless Personal Area Networks (ZigBee) and Wireless Metropolitan Area Networks (WiMAX). The purpose of this paper is to explain the impending 802.11n standard and how it will enable WLANs to support emerging media-rich applications. The paper also details how 802.11n compares with existing WLAN standards and offers strategies for users considering higher-bandwidth alternatives. The emerging IEEE 802.15.4 (ZigBee) standard aims to provide low data rate wireless communications with high-precision ranging and localization, by employing UWB technologies for a low-power and low-cost solution. WiMAX (Worldwide Interoperability for Microwave Access) is a standard for wireless data transmission covering a range similar to cellular phone towers. With high performance in both distance and throughput, WiMAX technology could be a boon to current Internet providers seeking to lead next-generation wireless Internet access. This paper also explores how these emerging technologies differ from one another.
Keywords: MIMO technology, WiFi, WiMAX, ZigBee.
8542 Improvement in Performance and Emission Characteristics of a Single Cylinder S.I. Engine Operated on Blends of CNG and Hydrogen
Authors: Sarbjot Singh Sandhu
Abstract:
This paper presents the experimental results of a single-cylinder Enfield engine using an electronically controlled fuel injection system, which was developed to carry out exhaustive tests using neat CNG and mixtures of hydrogen in compressed natural gas (HCNG) at 0, 5, 10, 15 and 20% by energy. Experiments were performed at 2000 and 2400 rpm with wide-open throttle and varying equivalence ratio. Hydrogen, which has a fast burning rate, enhances the flame propagation rate of compressed natural gas when added to it. The emissions of HC and CO decreased with an increasing percentage of hydrogen, but NOx was found to increase. The results indicated a marked improvement in brake thermal efficiency with the increase in the percentage of hydrogen added. The improvement in thermal efficiency was clearly greater in the lean region than in the rich region. This study is expected to reduce vehicular emissions along with an increase in thermal efficiency, and thus help limit further environmental degradation.
Keywords: Hydrogen, CNG, HCNG, Emissions.
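Because the HCNG blends above are specified by energy share, relating them to the volume shares often quoted for HCNG requires the volumetric heating values of the two gases. A sketch using approximate lower heating values (roughly 10.8 MJ/Nm³ for H2 and 35.8 MJ/Nm³ for methane; CNG is treated as pure methane, and the paper's exact blend basis may differ):

```python
def h2_volume_fraction(energy_frac, lhv_h2=10.8, lhv_ch4=35.8):
    """Volume fraction v of H2 in an HCNG blend specified by energy
    fraction e, from e = v*lhv_h2 / (v*lhv_h2 + (1-v)*lhv_ch4),
    solved for v. LHVs are volumetric (MJ/Nm^3), approximate."""
    e = energy_frac
    return e * lhv_ch4 / (e * lhv_ch4 + (1 - e) * lhv_h2)

for e in (0.05, 0.10, 0.20):
    print(f"{e:.0%} energy -> {h2_volume_fraction(e):.1%} by volume")
```

Since hydrogen carries far less energy per unit volume than methane, a 20% energy share corresponds to roughly 45% hydrogen by volume under these assumptions, which is why energy-based and volume-based HCNG figures in the literature look so different.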