Search results for: very large scale integrated circuits.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4231

1201 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials

Authors: Sajjad Farashi

Abstract:

Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting the spikes in the recorded signal. In this paper, a method for the spike sorting problem is proposed that is based on nonlinear modeling of spikes using an exponential autoregressive model. A genetic algorithm is utilized for model parameter estimation, and selected model coefficients are used as features for sorting. For optimal selection of the model coefficients, a self-organizing feature map is used. The results show that modeling spikes with the nonlinear autoregressive model outperforms its linear counterpart. Moreover, features extracted from the coefficients of the exponential autoregressive model are better than wavelet-based features and yield more compact and better-separated clusters. For spikes that differ only in small-scale structure, where principal component analysis fails to produce separated clouds in the feature space, the proposed method obtains well-separated clusters, removing the need for complex classifiers.
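
The model fit described above can be sketched as follows. This is a minimal illustration only: the ExpAR order, the coefficient bounds, and the use of SciPy's differential evolution as a stand-in for the paper's genetic algorithm are assumptions, and the self-organizing-map feature selection step is omitted.

```python
# Minimal sketch: fit an exponential autoregressive (ExpAR) model to one spike
# waveform with an evolutionary optimizer.  Model order, bounds, and the use of
# differential evolution (instead of the paper's GA) are illustrative assumptions.
import numpy as np
from scipy.optimize import differential_evolution

def expar_residuals(theta, x, p):
    """Squared one-step prediction error of an order-p ExpAR model.

    x_t = sum_i (phi_i + pi_i * exp(-gamma * x_{t-1}^2)) * x_{t-i} + e_t
    theta = [phi_1..phi_p, pi_1..pi_p, gamma]
    """
    phi, pi_, gamma = theta[:p], theta[p:2 * p], theta[-1]
    err = 0.0
    for t in range(p, len(x)):
        weights = phi + pi_ * np.exp(-gamma * x[t - 1] ** 2)
        pred = np.dot(weights, x[t - p:t][::-1])   # x_{t-1}, ..., x_{t-p}
        err += (x[t] - pred) ** 2
    return err

def fit_expar(spike, p=4, seed=0):
    """Estimate ExpAR coefficients for one detected spike waveform."""
    bounds = [(-2, 2)] * (2 * p) + [(0.01, 50.0)]   # assumed phi, pi, gamma ranges
    result = differential_evolution(expar_residuals, bounds, args=(spike, p),
                                    seed=seed, maxiter=200)
    return result.x                                 # selected entries become features

if __name__ == "__main__":
    t = np.linspace(0, 1, 64)
    spike = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.5 * np.exp(-((t - 0.45) / 0.1) ** 2)
    coeffs = fit_expar(spike)
    print("ExpAR feature vector:", np.round(coeffs, 3))
```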

Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.

1200 Ultra-Light Overhead Conveyor Systems for Logistics Applications

Authors: Batin Latif Aylak, Bernd Noche

Abstract:

Overhead conveyor systems are attractive because of their simple construction, wide application range, and full compatibility with other manufacturing systems, and they are designed according to international standards. Ultra-light overhead conveyor systems are rope-based conveying systems with individually driven vehicles. The vehicles move automatically along the rope, which also supplies them with power and control signals, and crossings are realized by switches. Overhead conveyor systems are used particularly in the automotive industry but also at post offices. They must always be integrated into a logistics process in a way that minimizes the cost of material flow and guarantees precise and fast workflows. With their help, transport can take place without wasting ground and floor space, without excessive company capacity, lost or damaged products, erroneous deliveries or endless travel, and without wasting time. Ultra-light overhead conveyor systems provide an optimal material flow, which produces profit and saves time. This article illustrates the structural advantages of ultra-light overhead conveyor systems in logistics applications and explains the steps of their system design. After an illustration of these steps, systems currently available on the market are presented by means of their technical characteristics. Finally, the demands placed on an ultra-light overhead conveyor system, owing to its simple construction, are outlined.

Keywords: Logistics, material flow, overhead conveyor.

1199 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform

Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee

Abstract:

This research presents a multi-modal simulation for the reconstruction of the past and the construction of the present in digital cultural heritage on a mobile platform. To portray present-day life, the virtual environment is generated through a proposed scheme for the rapid and efficient construction of a 360° panoramic view, into which an acoustical heritage model and a crowd model are incorporated. For the reconstruction of past life, a crowd is simulated and rendered in an old trading port. The keystone of this research is a virtual walkthrough that shows virtual present life in 2D and virtual past life in 3D, both within virtual heritage sites in George Town, experienced through a mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on the mobile platform; it portrays present life in the 360° panoramic view of the virtual heritage environment based on an extension of Newtonian laws. Secondly, the 2D crowd is animated and rendered into 3D with improved variety and incorporated into the virtual past life using the Unity3D game engine, with the behaviours of the 3D models simulated based on an enhancement of the classical Boid algorithm. Finally, a demonstration system is developed that integrates the models, techniques and algorithms of this research. The virtual walkthrough was demonstrated to a group of respondents and evaluated through user-centred evaluation, with respondents navigating the demonstration system. The questionnaire results show that the presented virtual walkthrough was successfully deployed as a multi-modal simulation and that such a walkthrough would be particularly useful in virtual tour and virtual museum applications.
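
One update step of a classical Boid-style crowd model, in the spirit of the enhanced Boid simulation mentioned above, can be sketched as follows; all weights, radii and the speed clamp are illustrative assumptions rather than the paper's tuned parameters.

```python
# Minimal sketch of one Boid-style crowd step (cohesion, separation, alignment).
# All numeric parameters are illustrative assumptions.
import numpy as np

def boids_step(pos, vel, dt=0.1, radius=2.0, w_coh=0.01, w_sep=0.05, w_ali=0.05, v_max=1.0):
    """Advance N agents one time step; pos and vel are (N, 2) arrays."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]
        dist = np.linalg.norm(offsets, axis=1)
        mask = (dist > 0) & (dist < radius)                 # neighbours of agent i
        if mask.any():
            cohesion = offsets[mask].mean(axis=0)           # steer toward local centre
            separation = -(offsets[mask] / dist[mask, None] ** 2).sum(axis=0)  # push apart
            alignment = vel[mask].mean(axis=0) - vel[i]     # match neighbours' heading
            new_vel[i] += w_coh * cohesion + w_sep * separation + w_ali * alignment
        speed = np.linalg.norm(new_vel[i])
        if speed > v_max:                                   # clamp speed for stability
            new_vel[i] *= v_max / speed
    return pos + new_vel * dt, new_vel

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pos, vel = rng.uniform(0, 10, (30, 2)), rng.uniform(-0.5, 0.5, (30, 2))
    for _ in range(100):
        pos, vel = boids_step(pos, vel)
    print("mean position after 100 steps:", pos.mean(axis=0))
```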

Keywords: Boid algorithm, crowd simulation, mobile platform, Newtonian laws, virtual heritage.

1198 A Prediction-Based Reversible Watermarking for MRI Images

Authors: Nuha Omran Abokhdair, Azizah Bt Abdul Manaf

Abstract:

Reversible watermarking is a special branch of image watermarking in which the original image can be recovered after the watermark has been extracted. In this paper, an adaptive prediction-based reversible watermarking scheme is presented to increase the payload capacity of MRI medical images. The scheme divides the image into two parts, the Region of Interest (ROI) and the Region of Non-Interest (RONI). Two bits are embedded in each embeddable pixel of the RONI and one bit is embedded in each embeddable pixel of the ROI. The experimental results demonstrate that the proposed scheme achieves high embedding capacity, mainly for two reasons. First, pixels that would otherwise be excluded from data embedding due to overflow/underflow are still used for embedding. Second, the large location map that would need to be added to the watermark data as overhead is eliminated, so the embedding capacity is not reduced. Moreover, the scheme provides good visual quality in the watermarked image.
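
The generic mechanism behind prediction-based reversible watermarking, prediction-error expansion, can be sketched as follows. The neighbour-average predictor and raster embedding order are assumptions for illustration; the paper's adaptive ROI/RONI split, overflow/underflow handling and capacity control are omitted.

```python
# Minimal prediction-error expansion (PEE) sketch; predictor and scan order are
# illustrative assumptions, and overflow/underflow handling is omitted.
import numpy as np

def predict(img, i, j):
    """Predict pixel (i, j) from its left and top neighbours (simple average)."""
    return (int(img[i, j - 1]) + int(img[i - 1, j])) // 2

def pee_embed(img, bits):
    """Embed one bit per visited pixel: e' = 2*e + b, marked pixel = pred + e'.
    Predictions use the original (unmodified) neighbours so extraction can
    rebuild them while restoring pixels in the same scan order."""
    orig = img.astype(np.int64)
    out, k = orig.copy(), 0
    for i in range(1, orig.shape[0]):
        for j in range(1, orig.shape[1]):
            if k >= len(bits):
                return out, k
            p = predict(orig, i, j)
            e = int(orig[i, j]) - p
            out[i, j] = p + 2 * e + bits[k]
            k += 1
    return out, k

def pee_extract(marked, n_bits):
    """Recover the bits and restore the original pixels (inverse of pee_embed)."""
    rec, bits = marked.astype(np.int64).copy(), []
    for i in range(1, rec.shape[0]):
        for j in range(1, rec.shape[1]):
            if len(bits) >= n_bits:
                return rec, bits
            p = predict(rec, i, j)              # these neighbours are already restored
            e_marked = int(rec[i, j]) - p
            bits.append(e_marked & 1)
            rec[i, j] = p + (e_marked >> 1)     # undo the expansion
    return rec, bits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(100, 140, (8, 8))        # toy 8x8 "image"
    payload = [1, 0, 1, 1, 0, 0, 1, 0]
    marked, n = pee_embed(img, payload)
    restored, recovered = pee_extract(marked, n)
    print("bits recovered:", recovered == payload,
          "| image restored:", np.array_equal(restored, img))
```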

Keywords: Medical image watermarking, reversible watermarking, Difference Expansion, Prediction-Error Expansion.

1197 Resveratrol Incorporated Liposomes Prepared from Pegylated Phospholipids and Cholesterol

Authors: Mont Kumpugdee-Vollrath, Khaled Abdallah

Abstract:

Liposomes and pegylated liposomes have long been used as drug delivery systems in the pharmaceutical field. Conventionally, however, polyethylene glycol (PEG) was attached to the phospholipid only after the liposomes had been prepared. In this paper, we study the possibility of using phospholipids that are already conjugated with PEG to prepare the liposomes. Resveratrol was used as the model drug because it can be applied against different diseases, and cholesterol was added to stabilize the liposome membrane. The thin-film technique at laboratory scale was used as the preparation method. The liposomes were then characterized by nanoparticle tracking analysis (NTA), photon correlation spectroscopy (PCS) and light microscopy. Stable liposomes could be produced, and the particle sizes after filtration were in the nanometer range. The 2- and 3-chain PEG-phospholipids (PL) resulted in smaller particle sizes than the 4-chain PEG-PL. Liposomes made from PL 90G and cholesterol were stable over 56 days of storage at 8 °C, as the particle sizes measured by PCS remained almost unchanged. There was almost no leakage of resveratrol from the PL 90G/cholesterol liposomes after a 28-day diffusion test in a dialysis tube. All liposomes showed sustained release over the 270-min measurement period. The maximum release of 16-20% was detected for liposomes made from the 2- and 3-chain PEG-PL, whereas the other liposomes released only about 10% of the resveratrol. The release kinetics can be described by the Korsmeyer-Peppas equation.

Keywords: Liposome, NTA, resveratrol, pegylation, cholesterol.

1196 Identification of Promising Infant Clusters to Obtain Improved Block Layout Designs

Authors: Mustahsan Mir, Ahmed Hassanin, Mohammed A. Al-Saleh

Abstract:

The layout optimization of building blocks of unequal areas has applications in many disciplines including VLSI floorplanning, macrocell placement, unequal-area facilities layout optimization, and plant or machine layout design. A number of heuristics and some analytical and hybrid techniques have been published to solve this problem. This paper presents an efficient high-quality building-block layout design technique especially suited for solving large-size problems. The higher efficiency and improved quality of optimized solutions are made possible by introducing the concept of Promising Infant Clusters in a constructive placement procedure. The results presented in the paper demonstrate the improved performance of the presented technique for benchmark problems in comparison with published heuristic, analytic, and hybrid techniques.

Keywords: Block layout problem, building-block layout design, CAD, optimization, search techniques.

1195 Compositional and Morphological Characteristics of the Tissues of Three Common Dates Grown in Algeria

Authors: H. Amellal-Chibane, Y. Noui, A. Djouab, S. Benamara

Abstract:

Mech-Degla, Degla-Beida and Frezza are common date (Phoenix dactylifera L.) varieties with more or less good availability and low trade value. Some morphological and physicochemical factors were determined. The results show that the whole-date weight differs significantly (P = 95%) between Mech-Degla and Degla-Beida, which are more widely commercialized than Frezza, whereas the pulp mass proportion relative to the whole fruit is highest for Frezza (88.28%). Moreover, there is large variability in the weights and densities of the constitutive tissues of each variety. The white tissue is dominant in Mech-Degla, in contrast to the other two varieties. The analysis of variance showed that the difference in weight between the brown and white tissues is significant (P = 95%) for all studied varieties. Some other morphological and chemical properties of the whole pulps and of their two constitutive parts (brown or pigmented, and white) are also investigated. The predominance of phenolics in the brown part of the Mech-Degla (4.01 g/100 g, w.b.) and Frezza (4.96 g/100 g, w.b.) pulps is the main result revealed by this study.

Keywords: Common dates, phenolics, sugars, tissues.

1194 Metaheuristic Algorithms for Decoding Binary Linear Codes

Authors: Hassan Berbia, Faissal Elbouanani, Rahal Romadi, Mostafa Belkasmi

Abstract:

This paper introduces two decoders for binary linear codes based on metaheuristics. The first uses a genetic algorithm, and the second is based on a combination of a genetic algorithm with a feed-forward neural network. The decoder based on the genetic algorithm (DAG), applied to BCH and convolutional codes, gives good performance compared to the Chase-2 and Viterbi algorithms, respectively, and reaches the performance of OSD-3 for some Quadratic Residue (QR) codes. This algorithm is less complex for linear block codes of large block length; furthermore, its performance can be improved by tuning the decoder's parameters, in particular the population size and the number of generations. In the second algorithm, the search space, in contrast to that of the DAG, which was limited to the codeword space, covers the whole binary vector space. It avoids a large number of encoding operations by using a neural network, which greatly reduces the complexity of the decoder while maintaining comparable performance.
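
A genetic-algorithm decoder for a small binary linear block code can be sketched as below, using the (7,4) Hamming code over a noisy channel. The GA operators, rates and population size are illustrative assumptions; neither the paper's DAG settings for BCH/convolutional codes nor the neural-network variant is reproduced.

```python
# Minimal sketch of GA decoding for the (7,4) Hamming code; operators and
# parameters are illustrative assumptions.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],      # generator matrix of the (7,4) Hamming code
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def encode(info):
    return info @ G % 2

def fitness(pop, received):
    """Negative squared distance between candidate codewords (in +/-1 form)
    and the received soft values: larger is better."""
    codewords = pop @ G % 2
    return -np.sum((received - (1 - 2 * codewords)) ** 2, axis=1)

def ga_decode(received, pop_size=40, generations=60, p_mut=0.05, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, (pop_size, 4))
    for _ in range(generations):
        fit = fitness(pop, received)
        order = np.argsort(fit)[::-1]
        parents = pop[order[:pop_size // 2]]                   # truncation selection
        cut = rng.integers(1, 4, pop_size // 2)                # one-point crossover
        kids = np.array([np.concatenate((parents[i][:c], parents[(i + 1) % len(parents)][c:]))
                         for i, c in enumerate(cut)])
        kids ^= (rng.random(kids.shape) < p_mut).astype(int)   # bit-flip mutation
        pop = np.vstack((parents, kids))
    best = pop[np.argmax(fitness(pop, received))]
    return best, encode(best)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    info = np.array([1, 0, 1, 1])
    tx = 1 - 2 * encode(info)                      # BPSK mapping 0 -> +1, 1 -> -1
    rx = tx + rng.normal(0, 0.6, 7)                # noisy channel
    decoded_info, decoded_cw = ga_decode(rx)
    print("decoded info bits:", decoded_info, "| correct:", np.array_equal(decoded_info, info))
```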

Keywords: Block code, decoding, metaheuristic, genetic algorithm, neural network.

1193 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis

Authors: Sidi Yang, Haiyi Zhang

Abstract:

Twitter is a microblogging platform where millions of users share their attitudes, views, and opinions daily. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method to reveal the emotions and sentiments found in each tweet and an efficient way to summarize the results in a clearly understandable manner. The primary goal of this paper is to explore text mining and to extract and analyze useful information from unstructured text using two approaches, LDA topic modelling and sentiment analysis, applied to Twitter plain-text data in English. These two methods allow people to mine data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insights in business and scientific fields.
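
The two-step pipeline described above can be sketched with scikit-learn as follows: a small LDA topic model plus a crude lexicon-based sentiment score. The toy tweets, the tiny lexicon and the hyperparameters are illustrative assumptions; a real study would use a full sentiment lexicon or a trained classifier.

```python
# Minimal sketch: LDA topics plus lexicon-based sentiment on toy tweets.
# The tweets, lexicon and parameters are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "love the new phone camera great battery",
    "terrible service my flight was delayed again",
    "great game last night what a win",
    "battery drains fast phone update is awful",
    "the team played well amazing goal",
    "worst airline ever lost my luggage",
]

POSITIVE = {"love", "great", "win", "well", "amazing"}
NEGATIVE = {"terrible", "delayed", "awful", "worst", "lost", "drains"}

def sentiment(text):
    """Crude polarity: (#positive words - #negative words) per tweet."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Bag-of-words counts, then a 3-topic LDA model.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(counts)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {k}: {top}")

for text, dist in zip(tweets, doc_topics):
    print(f"topic={dist.argmax()}  sentiment={sentiment(text):+d}  {text}")
```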

Keywords: Text mining, Twitter, topic model, sentiment analysis.

1192 Adaptive Fourier Decomposition Based Signal Instantaneous Frequency Computation Approach

Authors: Liming Zhang

Abstract:

There have been different approaches to computing the analytic instantaneous frequency, with a variety of background reasoning, practical applicability, and restrictions. This paper presents an instantaneous frequency computation approach based on adaptive Fourier decomposition and α-counting. The adaptive Fourier decomposition is a recently proposed signal decomposition approach; the instantaneous frequency can be computed through the so-called mono-components it produces. Due to its fast energy convergence, the adaptive Fourier decomposition discards the highest-frequency content of the signal, which in most situations represents noise. A new instantaneous frequency definition for a large class of so-called simple waves is also proposed in this paper. The class of simple waves contains a wide range of signals for which the concept of instantaneous frequency has a clear physical meaning. The α-counting instantaneous frequency can be used to compute the highest frequency of a signal. By combining these two approaches, one can obtain the instantaneous frequencies of the whole signal. An experiment demonstrates the computation procedure, with promising results.

Keywords: Adaptive Fourier decomposition, Fourier series, signal processing, instantaneous frequency

1191 Implementation of Neural Network Based Electricity Load Forecasting

Authors: Myint Myint Yi, Khin Sandar Linn, Marlar Kyaw

Abstract:

This paper proposes a novel model for short-term load forecasting (STLF) in the electricity market. The prior electricity demand data are treated as a time series. The model is composed of several neural networks whose input data are processed using a wavelet technique, and it is implemented as a simulation program written in MATLAB. The load data are decomposed into several wavelet coefficient series using the wavelet transform technique known as the Non-decimated Wavelet Transform (NWT); this technique is chosen for its ability to extract hidden patterns from the time series data. The wavelet coefficient series are used to train the neural networks (NNs) and serve as the NN inputs for electricity load prediction. The Scaled Conjugate Gradient (SCG) algorithm is used as the learning algorithm for the NNs. To obtain the final forecast, the NN outputs are recombined using the same wavelet technique. The model was evaluated with the electricity load data of the Electronic Engineering Department at Mandalay Technological University in Myanmar. The simulation results showed that the model was capable of producing reasonable forecasting accuracy in STLF.
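
The decompose-then-learn idea can be sketched with PyWavelets and scikit-learn as follows. As a simplification, a single MLP predicts the next load value directly from lagged wavelet coefficients rather than forecasting each coefficient series and recombining them with the inverse transform; the synthetic load data, wavelet, lag length and network size are illustrative assumptions.

```python
# Minimal sketch: stationary (non-decimated) wavelet decomposition + MLP forecast.
# Synthetic data and all parameters are illustrative assumptions.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(512)
load = (100 + 20 * np.sin(2 * np.pi * t / 24)        # daily cycle
        + 5 * np.sin(2 * np.pi * t / 168)            # weekly cycle
        + rng.normal(0, 2, t.size))                  # noise

# Non-decimated wavelet transform: every sub-series keeps the original length.
coeffs = pywt.swt(load, "db4", level=3)              # list of (approx, detail) pairs
bands = np.vstack([c for pair in coeffs for c in pair])   # shape (2*level, len(load))

lag = 24                                             # one day of history per input
X = np.array([bands[:, i:i + lag].ravel() for i in range(len(load) - lag)])
y = load[lag:]

split = len(y) - 32                                  # hold out the last 32 hours
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
mape = np.mean(np.abs((y[split:] - pred) / y[split:])) * 100
print(f"held-out MAPE: {mape:.2f}%")
```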

Keywords: Neural network, load forecast, time series, wavelet transform.

1190 On Speeding Up Support Vector Machines: Proximity Graphs Versus Random Sampling for Pre-Selection Condensation

Authors: Xiaohua Liu, Juan F. Beltran, Nishant Mohanchandra, Godfried T. Toussaint

Abstract:

Support vector machines (SVMs) are considered to be the best machine learning algorithms for minimizing the predictive probability of misclassification. However, their drawback is that for large data sets the computation of the optimal decision boundary is a time consuming function of the size of the training set. Hence several methods have been proposed to speed up the SVM algorithm. Here three methods used to speed up the computation of the SVM classifiers are compared experimentally using a musical genre classification problem. The simplest method pre-selects a random sample of the data before the application of the SVM algorithm. Two additional methods use proximity graphs to pre-select data that are near the decision boundary. One uses k-Nearest Neighbor graphs and the other Relative Neighborhood Graphs to accomplish the task.
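
The condensation idea can be sketched on synthetic data as follows: random pre-sampling versus keeping only the training points whose k nearest neighbours contain both classes, used here as a simple stand-in for the paper's proximity-graph (k-NN and relative-neighbourhood graph) selection. The dataset, k and the sampling fraction are illustrative assumptions, and the musical-genre data are not reproduced.

```python
# Minimal sketch: random versus k-NN-boundary pre-selection before SVM training.
# Data, k and the 25% sampling fraction are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

X, y = make_classification(n_samples=4000, n_features=10, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

def accuracy(X_sub, y_sub):
    return SVC(kernel="rbf", gamma="scale").fit(X_sub, y_sub).score(X_te, y_te)

# 1) Random pre-selection of 25% of the training set.
rng = np.random.default_rng(0)
idx = rng.choice(len(X_tr), size=len(X_tr) // 4, replace=False)

# 2) Boundary pre-selection: keep points whose k nearest neighbours are mixed-class.
k = 10
nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X_tr)
_, nn = nbrs.kneighbors(X_tr)                    # nn[:, 0] is the point itself
mixed = np.array([len(set(y_tr[nn[i, 1:]])) > 1 for i in range(len(X_tr))])

print(f"full set      : {accuracy(X_tr, y_tr):.3f} ({len(X_tr)} points)")
print(f"random 25%    : {accuracy(X_tr[idx], y_tr[idx]):.3f} ({len(idx)} points)")
print(f"k-NN boundary : {accuracy(X_tr[mixed], y_tr[mixed]):.3f} ({mixed.sum()} points)")
```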

Keywords: Machine learning, data mining, support vector machines, proximity graphs, relative-neighborhood graphs, k-nearest-neighbor graphs, random sampling, training data condensation.

1189 Polishing Machine Based on High-Pressure Water Jet

Authors: Mohammad A. Khasawneh

Abstract:

The design and fabrication of high-pressure water-jet-based polishing equipment conducted in this study are reported herein, together with preliminary test results assessing its applicability to HMA surface polishing. This study also provides preliminary findings concerning the test variables that were experimentally investigated, namely the rotational speed, the water-jet pressure, the abrasive agent used, and the impact angle. The preliminary findings, based on four trial tests (two on large slab specimens and two on small gyratory-compacted specimens), indicate that both friction and texture values tend to increase with polishing duration for the two combinations of pressure and rotation speed of the rotary deck. It appears that the more polishing action the specimen is subjected to, the more aggregate edges are exposed, such that surface texture values increase with an accompanying increase in friction values. It may be of interest (though outside the scope of this study) to investigate whether a similar trend exists for HMA prepared with a sand-and-gravel aggregate source.

Keywords: High-pressure, water jet, friction, texture, polishing, statistical analysis.

1188 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry

Authors: Nadia Belu, Laurentiu M. Ionescu, Agnieszka Misztal

Abstract:

In this paper, we propose a computer-aided solution based on genetic algorithms to reduce the effort of drafting the FMEA analysis and Control Plan reports required for a manufactured product launch, and to improve the knowledge of the development teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using genetic algorithms to find the optimum between the RPN risk factor and the cost of production; a feature of genetic algorithms is that they can find solutions to multi-criteria optimization problems. In our case, the three specific FMEA risk factors are considered alongside the reduction of the production cost. The analysis tool generates final reports for all FMEA processes, and the data obtained in the FMEA reports are automatically integrated, together with the other entered parameters, into the Control Plan. The solution is implemented as an application running on an intranet on two servers: one containing the analysis and plan-generation engine, and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting, and bending processes used to manufacture bus chassis. Its advantages are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The proposed solution, implemented with open-source tools, is a cheap alternative to other solutions on the market.

Keywords: Automotive industry, control plan, FMEA.

1187 Combining an Optimized Closed Principal Curve-Based Method and Evolutionary Neural Network for Ultrasound Prostate Segmentation

Authors: Tao Peng, Jing Zhao, Yanqing Xu, Jing Cai

Abstract:

Ultrasound prostate segmentation is challenging due to missing or ambiguous boundaries between the prostate and neighboring structures, the presence of shadow artifacts, and the large variability in prostate shapes. To handle these issues, this paper develops a hybrid method for ultrasound prostate segmentation that combines an optimized closed principal curve-based method with an evolutionary neural network; the former can fit curves with great curvature and generate a contour composed of line segments connected by sorted vertices, and the latter is used to express an appropriate map function (represented by the parameters of the evolutionary neural network) that generates a smooth prostate contour matching the ground-truth contour. Both qualitative and quantitative experimental results showed that our proposed method achieves accurate and robust performance.

Keywords: Ultrasound prostate segmentation, optimized closed polygonal segment method, evolutionary neural network, smooth mathematical model.

1186 Determinants of R&D Outsourcing at Japanese Firms: Transaction Cost and Strategic Management Perspectives

Authors: Dai Miyamoto

Abstract:

This paper examines the factors that have determined R&D outsourcing behaviour at Japanese firms since the latter half of the 1990s, from the viewpoints of transaction cost and strategic management. The study uses empirical analysis based on large-sample data. The principal findings are as follows. Firms that belong to a wider corporate group are more active in executing R&D outsourcing activities. Diversification strategies, such as the expansion of product and sales markets, have a positive effect on the R&D outsourcing behaviour of firms. Moreover, while quantitative R&D resources have a positive influence on R&D outsourcing, qualitative indices have no effect. These facts suggest that the R&D outsourcing behaviour of Japanese firms is consistent with both the transaction cost and strategic management perspectives. Specifically, the conventional corporate group network plays an important role in R&D outsourcing behaviour: firms that execute R&D outsourcing leverage 'old' networks to construct 'new' networks and use both networks appropriately.

Keywords: Corporate Group Networks, R&D Outsourcing, Strategic Management Perspective, Transaction Cost Perspective.

1185 A Particle Swarm Optimization Approach for the Earliness-Tardiness No-Wait Flowshop Scheduling Problem

Authors: Sedighe Arabameri, Nasser Salmasi

Abstract:

In this research, a particle swarm optimization (PSO) algorithm is proposed for the no-wait flowshop scheduling problem with sequence-dependent setup times and weighted earliness-tardiness penalties as the criterion. The smallest position value (SPV) rule is applied to convert the continuous position vectors of the particles in PSO into job permutations. A timing algorithm is developed to find the optimal schedule and calculate the objective function value of a given sequence within the PSO algorithm. Two different neighborhood structures are applied to improve the solution quality of the PSO algorithm: the first is based on variable neighborhood search (VNS), and the second is a simple one with an invariable structure. To compare the performance of the two neighborhood structures, random test problems are generated and solved by both approaches. Computational results show that the VNS-based algorithm performs better than the other, especially for large-sized problems.
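
The SPV rule inside a basic PSO loop can be sketched as follows. The objective here is the weighted earliness-tardiness of a toy single-machine sequence, standing in for the paper's no-wait flowshop timing algorithm with sequence-dependent setups; the processing times, due dates, weights and PSO parameters are illustrative assumptions, and the VNS neighborhood search is omitted.

```python
# Minimal sketch of the SPV rule inside PSO; the toy single-machine objective and
# all parameters are illustrative assumptions.
import numpy as np

proc = np.array([4, 2, 6, 3, 5])              # processing times
due = np.array([6, 5, 20, 10, 14])            # due dates
w_e, w_t = np.full(5, 1.0), np.full(5, 2.0)   # earliness / tardiness weights

def spv(position):
    """SPV rule: the job with the smallest position value is scheduled first."""
    return np.argsort(position)

def cost(position):
    seq = spv(position)
    finish = np.cumsum(proc[seq])             # back-to-back timing, no inserted idle time
    early = np.maximum(due[seq] - finish, 0)
    tardy = np.maximum(finish - due[seq], 0)
    return np.sum(w_e[seq] * early + w_t[seq] * tardy)

def pso(n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, len(proc)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return spv(gbest), cost(gbest)

if __name__ == "__main__":
    seq, obj = pso()
    print("best job sequence:", seq, "| weighted E/T cost:", obj)
```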

Keywords: Minimization of the sum of weighted earliness and tardiness, no-wait flowshop scheduling, particle swarm optimization, sequence-dependent setup times.

1184 Floating Offshore Wind: A Review of Installation Vessel Requirements

Authors: A. P. Crowle

Abstract:

Floating offshore wind farms may in the future provide large quantities of renewable energy. One of the challenges to their development is the provision of installation vessels for the offshore installation of floating wind turbines. This paper examines the current fleet of vessels that can be used for inshore construction; separate vessels are required for the ocean tow-out and for the offshore installation. Information is provided on what new vessels might be required to improve the efficiency and reduce the cost of installing floating wind turbines. Specialized cargo vessels are required for the initial mobilization. Anchor handling vessels are required to tow the floating wind turbine offshore and to install and connect the moorings. Subsea work vessels are required to install the dynamic cables, whilst cable-lay vessels are required for the export power cable. This paper reviews the existing and future installation vessel requirements for floating wind. Dedicated ports are also required for the vertical integration of the substructure with the tower, nacelle and blades.

Keywords: Floating wind, naval architecture, offshore installation vessels, ports for renewable energy.

1183 Antecedent Factors of Ethical Ideologies in Moral Judgment: Evidence from the Mixed Method Study

Authors: N. Mustamil, M. Quaddus

Abstract:

This research investigates the factors that influence moral judgments when dealing with ethical dilemmas in the organizational context, as well as the antecedents of individual ethical ideology (idealism and relativism). A mixed-method study, combining qualitative (field study) and quantitative (survey) approaches, was used. An initial model was developed first and then fine-tuned based on the field studies. Data were collected from managers in large Malaysian organizations. The results reveal that in-group collectivism culture, power distance culture, parental values, and religiosity were significant antecedents of ethical ideology. However, the direct effects of these variables on moral judgment were not significant. Furthermore, the results confirm the significant effects of ethical ideology on moral judgment. This study provides valuable insight for evaluating the validity of existing theory as proposed in the literature and offers significant practical implications.

Keywords: Antecedent factors, ethical ideology, mixed method, moral judgment.

1182 Hairy Beggarticks (Bidens pilosa L. - Asteraceae) Control in Sunflower Fields Using Pre-Emergence Herbicides

Authors: Alexandre M. Brighenti

Abstract:

One of the most damaging weed species in sunflower crops in Brazil is hairy beggarticks (Bidens pilosa L.). The large number of seeds, the several vegetative cycles during the year, the staggered germination, and the scarcity of selective and effective herbicides for this weed in sunflower are some of the attributes that hinder effective control of hairy beggarticks populations. The experiment was carried out to evaluate the control of hairy beggarticks plants in sunflower crops and to assess sunflower tolerance to residual herbicides. The treatments were as follows: S-metolachlor (1,200 and 2,400 g ai ha-1), flumioxazin (60 and 120 g ai ha-1), sulfentrazone (150 and 300 g ai ha-1), and two controls (weedy and weed-free check). Phytotoxicity to sunflower plants, percentage of control and density of hairy beggarticks plants, sunflower stand and plant height, head diameter, oil content, and sunflower yield were evaluated. The herbicides flumioxazin and sulfentrazone were the most efficient in hairy beggarticks control, while S-metolachlor provided acceptable control levels. S-metolachlor (1,200 g ha-1), flumioxazin (60 g ha-1) and sulfentrazone (150 g ha-1) were the most selective doses for the sunflower crop.

Keywords: Flumioxazin, Helianthus annuus, S-metolachlor, sulfentrazone, weeds.

1181 Deep Web Content Mining

Authors: Shohreh Ajoudanian, Mohammad Davarpanah Jazi

Abstract:

The rapid expansion of the web is causing constant growth of information, leading to several problems such as the increased difficulty of extracting potentially useful knowledge. Web content mining confronts this problem by gathering explicit information from different web sites for access and knowledge discovery. Query interfaces of web databases share common building blocks. After extracting information with a parsing approach, we use a new data mining algorithm to match a large number of database schemas at a time. Using this algorithm increases the speed of information matching and, instead of simple 1:1 matching, it performs complex (m:n) matching between query interfaces. In this paper, we present a novel correlation mining algorithm that matches correlated attributes at lower cost. The algorithm uses the Jaccard measure to distinguish positively and negatively correlated attributes. The system then matches the user query against the different query interfaces of a given domain and finally chooses the query interface nearest to the user query to answer it.
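
The Jaccard-based correlation step can be sketched on a toy set of query interfaces as follows; the interfaces, the threshold, and the reading of frequently co-occurring attributes as grouping candidates and never co-occurring ones as likely synonyms are illustrative assumptions about the algorithm described above.

```python
# Minimal sketch: Jaccard correlation of attributes across deep-web query
# interfaces.  The toy interfaces and the 0.5 threshold are assumptions.
from itertools import combinations

# Each query interface is the set of attributes it exposes (e.g. book-search forms).
interfaces = [
    {"title", "author", "isbn"},
    {"title", "author", "publisher"},
    {"title", "isbn", "price"},
    {"first name", "last name", "title"},
    {"first name", "last name", "publisher"},
]

def jaccard(a, b):
    """|interfaces containing both| / |interfaces containing either|."""
    has_a = {i for i, s in enumerate(interfaces) if a in s}
    has_b = {i for i, s in enumerate(interfaces) if b in s}
    return len(has_a & has_b) / len(has_a | has_b)

attributes = sorted(set().union(*interfaces))
for a, b in combinations(attributes, 2):
    j = jaccard(a, b)
    if j >= 0.5:
        print(f"positively correlated (grouping candidates): {a!r} ~ {b!r}  (J={j:.2f})")
    elif j == 0.0:
        print(f"negatively correlated (possible synonyms):   {a!r} vs {b!r}")
```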

Keywords: Content mining, complex matching, correlation mining, information extraction.

1180 Investigating the Road Maintenance Performance in Developing Countries

Authors: Jamaa Salih, Francis Edum-Fotwe, Andrew Price

Abstract:

One of the most critical aspects of the management of road infrastructure is the type and scale of the maintenance systems adopted and the consequences of their inadequacy. The performance of road maintenance systems can be assessed by a number of important indicators, such as cost, safety, environmental impact, and the level of complaints by users. A review of practice reveals that an insufficient level of expenditure or poor management of the road network often has serious consequences for the economic and social life of a country in terms of vehicle operating costs (VOC), travel time costs, accident costs, and environmental impact. Despite an increase in the attention paid by global road agencies to the environment and to road users’ satisfaction, the overwhelming evidence from the available literature points to a lack of similar levels of attention to these two factors in many developing countries. While many sources agree that the road maintenance backlog is caused by a shortage of expenditure, a lack of proper management, or both, it appears that managing the available assets, particularly in developing countries, is the main issue. To address this subject, this paper concentrates on exposing the various issues related to this field.

Keywords: Environmental impact, performance indicators, road maintenance, users’ satisfaction.

1179 Preservation of Coconut Toddy Sediments as a Leavening Agent for Bakery Products

Authors: B. R. Madushan, S. B. Navaratne, I. Wickramasinghe

Abstract:

Toddy sediment (TS) was cultured in a PDA medium to determine its initial yeast load, and it was subjected to sun, shade, solar, dehumidified cold air (DCA), and hot-air oven (at 40, 50 and 60 °C) drying with a view to preserving the viability of the yeast. Thereafter, the study followed a two-factor factorial design to determine the best preservation method. The dried TS from the best drying method was divided into two portions: one portion was mixed with rice flour at a 3:7 TS-to-flour ratio, and the mixture was divided into two again, with one part kept under in-house conditions and the other refrigerated. The same procedure was followed for the remaining portion of TS, but with corn flour at the same ratio. All treatments were vacuum-packed in triple-laminate pouches, and the best preservation method was determined in terms of the leavening index (LI). The TS obtained from the best preservation method was used to make foods (bread and hopper), and their organoleptic properties were evaluated against those of ordinary foods by a sensory panel using a five-point hedonic scale. The results revealed that the yeast load of fresh TS was 58×10⁶ CFU/g. The best drying method for preserving yeast viability was DCA, because the LI of this treatment (96%) was higher than that of the other three treatments. The organoleptic properties of foods prepared with the best preservation method were the same as those of ordinary foods according to the duo-trio test.

Keywords: Biological leavening agent, coconut toddy, fermentation, yeast.

1178 Perception of the Frequency and Importance of Peer Social Support by Students with Special Educational Needs in Inclusive Education

Authors: Lucia Hrebeňárová, Jarmila Žolnová, Veronika Palková

Abstract:

Inclusive education of students with special educational needs has been on the increase in the Slovak Republic, facing many challenges. Preparedness of teachers for inclusive education is one of the most frequent issues; teachers lack skills when it comes to the use of effective instruction depending on the individual needs of students, improvement of classroom management and social skills, and support of inclusion within the classroom. Social support is crucial for the school success of students within inclusive settings. The aim of the paper is to analyse perception of the frequency and importance of peer social support by students with special educational needs in inclusive education. The data collection tool used was the Child and Adolescent Social Support Scale (CASSS). The research sample consisted of 953 fourth grade students – 141 students with special educational needs educated in an inclusive setting and 812 students of the standard population. No significant differences were found between the students with special educational needs and the students without special educational needs in an inclusive setting when it comes to the perception of frequency and importance of social support of schoolmates and friends. However, the perception of frequency and importance of a friend’s social support was higher than the perception of frequency and importance of a classmate’s social support in both groups of students.

Keywords: Inclusive education, peer social support, peer, student with special educational needs.

1177 Consumer Perception of 3D Body Scanning While Online Shopping for Clothing

Authors: A. Grilec, S. Petrak, M. Mahnic Naglic

Abstract:

Technological development and the globalization of clothing production and sales over the last decade have significantly changed the consumer relationship with industrially manufactured apparel and the way clothing is purchased. Internet sales of clothing are increasing constantly and significantly in the global market, but the possibilities offered by modern computing technologies in the customization segment are not yet fully exploited, especially with respect to individual customer requirements and body sizes. Considering the growing trend of online shopping, the main goal of this paper is to investigate differences in customer perceptions of online apparel shopping and, in particular, to discover the main differences in perception between customers of three different body sizes. To meet this research goal, a quantitative study of a sample of 85 Croatian consumers was conducted in 2017 in Zagreb, Croatia. Respondents were asked to indicate their level of agreement on a five-point Likert scale ranging from strongly disagree (1) to strongly agree (5), and simple descriptive statistics were used to analyze their attitudes. The main findings highlight differences in respondents' perception of 3D body scanning, their use of 3D body scanning in Internet shopping, and their online apparel shopping habits with regard to body size.

Keywords: Consumer behavior, online shopping, 3D body scanning.

1176 An Improved Tie Force Method for Progressive Collapse Resistance of Precast Concrete Cross Wall Structures

Authors: M. Tohidi, J. Yang, C. Baniotopoulos

Abstract:

Progressive collapse of buildings typically occurs when abnormal loading conditions cause local damage, which leads to a chain reaction of failures and, ultimately, catastrophic collapse. The tie force (TF) method is one of the main design approaches for progressive collapse. As the TF method is a simplified method, further investigation of its reliability is necessary. This study aims to develop an improved TF method for designing cross wall structures against progressive collapse. To this end, the pullout behavior of strands in grout was first analyzed; then, by considering the tie force-slip relationship in the friction stage together with the catenary action mechanism, a comprehensive analytical method was developed. The reliability of this approach is verified by the experimental results of concrete block pullout tests and full-scale floor-to-floor joint tests undertaken by the Portland Cement Association (PCA). Discrepancies in the tie force between the analytical results and the codified specifications suggest a deficiency of the TF method; hence, an improved model based on the analytical results is proposed to address this concern.

Keywords: Cross wall, progressive collapse, ties force method, catenary, analytical.

1175 Routing Capability and Blocking Analysis of Dynamic ROADM Optical Networks (Category - II) for Dynamic Traffic

Authors: Indumathi T. S., T. Srinivas, B. Siva Kumar

Abstract:

Reconfigurable optical add/drop multiplexers (ROADMs) can be classified into three categories based on their underlying switching technologies: category I consists of a single large optical switch; category II is composed of a number of small optical switches aligned in parallel; and category III has a single optical switch with only one wavelength being added/dropped. In this paper, to evaluate the wavelength-routing capability of category-II ROADMs in dynamic optical networks, dynamic traffic models are designed based on Bernoulli and Poisson distributions for smooth and regular types of traffic. Through analytical and simulation results, the routing power of category-II ROADM networks is determined for the two traffic models.

Keywords: Fully-Reconfigurable Optical Add-Drop Multiplexers (FROADMs), Limited Tunability in Reconfigurable Optical Add-Drop Multiplexers (LROADM), Multiplexer/De-Multiplexer (MUX/DEMUX), Reconfigurable Optical Add-Drop Multiplexers (ROADMs), Wavelength Division Multiplexing (WDM).

1174 A Survey of Sentiment Analysis Based on Deep Learning

Authors: Pingping Lin, Xudong Luo, Yifan Fan

Abstract:

Sentiment analysis is a very active research topic. Every day, Facebook, Twitter, Weibo, other social media, and major e-commerce websites generate a massive number of comments, which can be used to analyse people's opinions or emotions. Existing methods for sentiment analysis are based mainly on sentiment dictionaries, machine learning, and deep learning. The first two kinds of methods rely heavily on sentiment dictionaries or large amounts of labelled data; the third overcomes these two problems, so this paper focuses on it. Specifically, we survey sentiment analysis methods based on convolutional neural networks, recurrent neural networks, long short-term memory, deep neural networks, deep belief networks, and memory networks. We compare their features, advantages, and disadvantages, and point out the main problems of these methods, which may be worthy of careful study in the future. Finally, we also examine the application of deep learning to multimodal sentiment analysis and aspect-level sentiment analysis.

Keywords: Natural language processing, sentiment analysis, document analysis, multimodal sentiment analysis, deep learning.

1173 Investigation of Various PWM Techniques for Shunt Active Filter

Authors: J. Chelladurai, G. Saravana Ilango, C. Nagamani, S. Senthil Kumar

Abstract:

Pulse width modulation (PWM) techniques have been the subject of intensive research for different industrial and power-sector applications, and a large variety of methods, differing in concept and performance, have been developed. This paper analyzes the comparative merits of the Sinusoidal Pulse Width Modulation (SPWM) and Space Vector Pulse Width Modulation (SVPWM) techniques and their suitability for a Shunt Active Filter (SAF). The objective is to select the scheme that offers effective utilization of the DC bus voltage together with harmonic reduction at the input side. The effectiveness of the PWM techniques is tested in the SAF configuration with a nonlinear load, and the performance of the SAF with the SPWM and SVPWM techniques is compared with respect to the THD of the source current. The study reveals that, in the context of closed-loop SAF control, the SVPWM technique yields only a minor improvement in THD. The improvement in DC bus utilization with SVPWM is also not significant compared to SPWM, because of the non-sinusoidal modulating signal produced by the controller in the SAF configuration.
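
The SPWM side of the comparison can be sketched as follows: generate a two-level SPWM waveform by comparing a sinusoidal reference with a triangular carrier, then estimate THD from its FFT. The carrier ratio, modulation index and simple THD definition are illustrative assumptions; the SAF closed-loop control and the SVPWM counterpart are not reproduced.

```python
# Minimal sketch: naturally sampled SPWM and an FFT-based THD estimate.
# Carrier frequency, modulation index and THD definition are assumptions.
import numpy as np

f_ref, f_carrier, fs = 50.0, 2000.0, 200_000.0        # Hz
m = 0.9                                               # modulation index
t = np.arange(0, 0.2, 1 / fs)                         # 10 fundamental cycles

reference = m * np.sin(2 * np.pi * f_ref * t)
carrier = 2 * np.abs(2 * ((t * f_carrier) % 1) - 1) - 1   # triangular carrier in [-1, 1]
pwm = np.where(reference >= carrier, 1.0, -1.0)       # two-level inverter leg voltage

spectrum = np.abs(np.fft.rfft(pwm)) / len(pwm)
freqs = np.fft.rfftfreq(len(pwm), 1 / fs)
fund_bin = np.argmin(np.abs(freqs - f_ref))
fundamental = spectrum[fund_bin]
harmonic_bins = [np.argmin(np.abs(freqs - k * f_ref)) for k in range(2, 50)]
thd = np.sqrt(np.sum(spectrum[harmonic_bins] ** 2)) / fundamental

print(f"fundamental amplitude: {2 * fundamental:.3f} (ideal ~ {m:.1f})")
print(f"THD up to the 49th harmonic: {100 * thd:.1f}%")
```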

Keywords: Voltage source inverter, Shunt active filter, SPWM, SVPWM, Matlab/SIMULINK.

1172 Scale Development for Measuring E-Service Quality in Banking

Authors: Vivek Agrawal, Vikas Tripathi, Nitin Seth

Abstract:

This study examines several critical dimensions of e-service quality overlooked in the existing literature and proposes a model and instrument framework for measuring customer-perceived e-service quality in the banking sector. The initial design was derived, by content analysis, from a pool of instrument dimensions and items identified in the existing literature, and nine dimensions were extracted based on a focused group discussion. An exploratory factor analysis approach was applied to data from a survey of 323 respondents. The instrument was designed specifically for the banking sector, with research data collected from bank customers who use electronic banking in a developing economy. A nine-factor instrument is proposed to measure e-service quality and has been checked for reliability. The validity check and the sampling location limit the applicability of the instrument across economies and service categories, and future research must be conducted to confirm its validity. This instrument can help bankers in developing economies like India to measure e-service quality and make improvements. The present study offers a systematic procedure that provides insights into the conceptual and empirical comprehension of customer-perceived e-service quality and its constituents.

Keywords: Testing, instrument, e-service quality, factor analysis.
