Search results for: Hybrid minimal repair
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1068


138 Applications of High Intensity Ultrasound to Modify Millet Protein Concentrate Functionality

Authors: B. Nazari, M. A. Mohammadifar, S. Shojaee-Aliabadi, L. Mirmoghtadaie

Abstract:

Millet, as a new source of plant protein, has not been used in food applications due to its poor functional properties. In this study, the effect of high intensity ultrasound (US; frequency: 20 kHz, continuous flow) at 100% amplitude for varying times (5, 12.5, and 20 min) on the solubility, emulsifying activity index (EAI), emulsion stability (ES), foaming capacity (FC), and foaming stability (FS) of millet protein concentrate (MPC) was evaluated. In addition, the structural properties of the best treatments, such as molecular weight and surface charge, were compared with the control sample to verify the US effect. The US treatments significantly (P<0.05) increased the solubility of the native MPC (65.8±0.6%) at all sonication times, with the maximum solubility recorded at the 12.5 min treatment (96.9±0.82%). The FC of MPC was also significantly affected by the US treatment. Increasing the sonication time up to 12.5 min significantly increased the FC of the native MPC (271.03±4.51 ml), but a further increase reduced it significantly. Minimal improvements were observed in the FS of all sonicated MPC compared to the native MPC. A sonication time of 12.5 min affected the EAI and ES of the native MPC more markedly than 5 or 20 min, which may be attributed to a greater increase in the proteins' tendency to adsorb at the oil-water interface after the US treatment at this time. SDS-PAGE analysis showed changes in the molecular weight of MPC attributed to the shearing forces created by the cavitation phenomenon. This phenomenon also increased the exposure of negatively charged amino acids on the surface of the US-treated MPC, as demonstrated by Zetasizer data. High intensity ultrasound, as a green technology, can significantly improve the functional properties of MPC and make it usable for food applications.

Keywords: Millet protein concentrate, Functional properties, Structural properties, High intensity ultrasound.

137 An Exploratory Study of the Student’s Learning Experience by Applying Different Tools for e-Learning and e-Teaching

Authors: Angel Daniel Muñoz Guzmán

Abstract:

E-learning is becoming more common every day. For online, hybrid or traditional face-to-face programs, there are e-teaching platforms such as Google Classroom, Blackboard, Moodle and Canvas, and there are platforms for full e-learning such as Coursera, edX or Udemy. These tools are changing the way students acquire knowledge at schools; however, in today's changing world that is not enough. As students' needs and skills change and become more complex, new tools will need to be added to keep them engaged and to enhance their learning. This is especially important in the current global situation that is changing everything: the Covid-19 pandemic. Due to Covid-19, education had to make an unexpected switch from face-to-face courses to digital courses. In this study, the students' learning experience is analyzed by applying different e-tools and following the Tec21 Model and a flexible and digital model, both developed by the Tecnologico de Monterrey University. The evaluation of the students' learning experience has been made with the quantitative PrEmo method of emotions. Findings suggest that the number of e-tools used during a course does not affect the students' learning experience as much as how a teacher links every available tool and makes them work as one in order to keep the student engaged and motivated.

Keywords: Student, experience, e-learning, e-teaching, e-tools, technology, education.

136 Comparison of the Effects of Three Different Types of Probiotics on the Sucrase Activities of the Small Intestine Mucosa of Broiler Chicks

Authors: Fazlollah Moosavinasab, Zhila Motamedi

Abstract:

An experiment was conducted to study the effects of different types of probiotic on sucrase enzyme activity of the small intestine mucosa in male broilers. The experiment was arranged as a randomized complete block design with a 4 × 2 factorial arrangement of treatments. 180 male broilers of the Ross 308 commercial hybrid were assigned to 4 groups, with three replicates of 15 birds per treatment. The control group (diet containing no probiotic) was fed a base diet formulated according to the NRC, and the three treatment groups were fed the same diet plus three different types of probiotics. Birds were slaughtered after 21 and 42 days, and different segments of the small intestine (at 1, 10, 30, 50, 70 and 90% of the total length of the small intestine) were taken from each replicate (n=2); sucrase enzyme activities were measured and recorded. The data were analyzed using SPSS (significance level P<0.05). In the three treatment groups, the probiotics had no significant effect on sucrase activity at either age or in any segment of the small intestine (P>0.05). These data suggest that probiotic administration had no significant effect on the treatments compared to the control group.

Keywords: Broiler, Chicks, Probiotics, Small Intestine, Sucrase

135 Adsorptive Waste Heat Based Air-Conditioning Control Strategy for Automotives

Authors: Indrasen Raghupatruni, Michael Glora, Ralf Diekmann, Thomas Demmer

Abstract:

While the trend in automotive technology is moving fast towards hybridization and electrification to curb emissions as well as to improve fuel efficiency, air-conditioning systems in passenger cars have not caught up with this trend and still remain among the major energy consumers. Adsorption-based air-conditioning systems, e.g. with a silica gel-water pair, which are already in use for residential and commercial applications, are now being considered as a technology leap once proven feasible for passenger cars. In this paper we discuss the methodology, challenges and feasibility of implementing an adsorption-based air-conditioning system in a passenger car utilizing exhaust waste heat. We also propose an optimized control strategy with interfaces to the engine control unit of the vehicle for operating this system with reasonable efficiency, supported by our simulation and validation results in a prototype vehicle and compared to existing implementations, both simulation-based and experimental. Finally, we discuss the influence of start-stop and hybrid systems on the operation strategy of the adsorption air-conditioning system.

Keywords: Adsorption air-conditioning, feasibility study, optimized control strategy, prototype vehicle.

134 A Monte Carlo Method to Data Stream Analysis

Authors: Kittisak Kerdprasop, Nittaya Kerdprasop, Pairote Sattayatham

Abstract:

Data stream analysis is the process of computing various summaries and derived values from large amounts of data which are continuously generated at a rapid rate. The nature of a stream does not allow a revisit of each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms to balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is both statistically transformed to a smaller size and computationally approximated in its characteristics. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is evaluated by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
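
The abstract does not detail the EMR sampling step, so as an illustration of the Monte Carlo flavour of the approach, the following is a minimal Python sketch of single-pass reservoir sampling, a standard way to keep a uniform random summary of a stream that cannot be revisited; the stream and sample size are hypothetical.

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Keep a uniform Monte Carlo sample of size k from a stream seen only once."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)              # fill the reservoir first
        elif (j := rng.randint(0, i)) < k:
            sample[j] = item                 # replace with probability k/(i+1)
    return sample

# Summarize a large stream with a 100-point sample, then estimate its mean
# from the sample alone (a Monte Carlo estimate of a stream characteristic).
stream = (0.5 * x for x in range(1_000_000))
sample = reservoir_sample(stream, k=100)
print(sum(sample) / len(sample))
```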

Keywords: Data Stream, Monte Carlo, Sampling, Density Estimation.

133 Evaluating Alternative Fuel Vehicles from Technical, Environmental and Economic Perspectives: Case of Light-Duty Vehicles in Iran

Authors: Vahid Aryanpur , Ehsan Shafiei

Abstract:

This paper presents an environmental and techno-economic evaluation of light-duty vehicles in Iran. A comprehensive well-to-wheel (WTW) analysis is applied to compare different automotive fuel chains, conventional internal combustion engines and innovative vehicle powertrains. The study examines the competitiveness of 15 pathways in terms of energy efficiency, GHG emissions, and the levelized cost of different energy carriers. The results indicate that electric vehicles, including battery electric vehicles (BEV), fuel cell vehicles (FCV) and plug-in hybrid electric vehicles (PHEV), increase the WTW energy efficiency by 54%, 51% and 46%, respectively, compared to common internal combustion engines powered by gasoline. On the other hand, greenhouse gas (GHG) emissions per kilometer of FCV and BEV would be 48% lower than those of gasoline engines. It is concluded that BEV has the lowest total cost of energy consumption and external cost of emissions, followed by internal combustion engines (ICE) fueled by CNG. Conventional internal combustion engines fueled by gasoline, on the other hand, would have the highest costs.
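
To make the relative figures concrete, the short calculation below converts them into absolute values under an assumed gasoline baseline (15% WTW efficiency, 250 g CO2-eq/km); these baseline numbers are illustrative placeholders, not values reported in the paper.

```python
# Assumed gasoline ICE baseline (illustrative only, not from the paper).
baseline_eff = 0.15        # well-to-wheel energy efficiency
baseline_ghg = 250.0       # g CO2-eq per km

efficiency_gain = {"BEV": 0.54, "FCV": 0.51, "PHEV": 0.46}   # reported improvements
ghg_cut = {"FCV": 0.48, "BEV": 0.48}                         # reported reductions

for tech, gain in efficiency_gain.items():
    print(f"{tech}: implied WTW efficiency ~ {baseline_eff * (1 + gain):.3f}")
for tech, cut in ghg_cut.items():
    print(f"{tech}: implied emissions ~ {baseline_ghg * (1 - cut):.0f} g CO2-eq/km")
```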

Keywords: Well-to-Wheel analysis, Energy Efficiency, GHG emissions, Levelized cost of energy, Alternative fuel vehicles.

132 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive, slow and have many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation, logistics and, especially, information system Enterprise Resource Planning (ERP) perspective. By applying the normalized systems concept/theory to segments of the supply chain, we believe these combinatorial effects can be kept minimal, especially at the time of launching an organization-wide global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP software for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of the business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP). Both methods start with the customer requirements, then the blueprinting of the business processes, and finally the mapping of those processes into a software system. Since those requirements and processes are the starting point of the implementation process, normalizing those processes will result in normalized software.

Keywords: Blueprint, ERP, SDLC, Modular.

131 Multiple Peaks Tracking Algorithm using Particle Swarm Optimization Incorporated with Artificial Neural Network

Authors: Mei Shan Ngan, Chee Wei Tan

Abstract:

Due to the non-linear characteristics of a photovoltaic (PV) array, PV systems are typically equipped with a maximum power point tracking (MPPT) feature. Moreover, when the PV array is under partially shaded conditions, hotspot problems can occur which could damage the PV cells. Partial shading causes multiple peaks in the P-V characteristic curves. This paper presents a hybrid Particle Swarm Optimization (PSO) and Artificial Neural Network (ANN) MPPT algorithm for the detection of the global peak among the multiple peaks in order to extract the true maximum energy from the PV panel. The PV system consists of a PV array, a dc-dc boost converter controlled by the proposed MPPT algorithm, and a resistive load. The system was simulated using the MATLAB/Simulink package. The simulation results show that the proposed algorithm performs well in detecting the true global peak power. The results of the simulations are analyzed and discussed.
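
The abstract does not give the swarm settings or the ANN structure, so the following Python sketch shows only the PSO half of the idea: a small swarm searching a synthetic multi-peak P-V curve for the global maximum power point. The curve, bounds and PSO parameters are assumptions for illustration; how the ANN is combined with the PSO is not specified in the abstract and is therefore not modelled here.

```python
import numpy as np

rng = np.random.default_rng(1)

def pv_power(v):
    """Synthetic multi-peak P-V curve of a partially shaded array
    (illustrative only, not a physical PV model)."""
    return (40 * np.exp(-((v - 12) / 4) ** 2) +
            65 * np.exp(-((v - 27) / 5) ** 2) +
            30 * np.exp(-((v - 38) / 3) ** 2))

n, iters, w, c1, c2 = 10, 40, 0.6, 1.6, 1.6
pos = rng.uniform(0, 45, n)                 # candidate operating voltages
vel = np.zeros(n)
pbest, pbest_val = pos.copy(), pv_power(pos)
gbest = pbest[np.argmax(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 45)
    val = pv_power(pos)
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmax(pbest_val)]

print(f"global MPP ~ {gbest:.2f} V, {pv_power(gbest):.2f} W")
```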

Keywords: Photovoltaic (PV), Partial Shading, Maximum Power Point Tracking (MPPT), Particle Swarm Optimization (PSO) and Artificial Neural Network (ANN)

130 Exploration of Autistic Children using Case Based Reasoning System with Cognitive Map

Authors: Ebtehal Alawi Alsaggaf, Shehab A. Gamalel-Din

Abstract:

Assessing an autistic child in elementary school is a difficult task that must be fully thought out, and teachers should be aware of the many challenges families face raising their child, especially the behavioral problems of autistic children. Hence, there is a need to develop contemporary Artificial Intelligence (AI) techniques to help diagnose and identify autistic children. In this research, we propose an expert system architecture that combines Cognitive Maps (CM) with the Case-Based Reasoning (CBR) technique in order to reduce the time and cost of the traditional diagnosis process for the early detection of autistic children. The teacher enters the child's information, which is analyzed by the CM module. Then, the reasoning processor translates the output into a case so that the current problem can be solved by the CBR module. We will implement a prototype of the model as a proof of concept using Java and MySQL. This provides a new hybrid approach that will achieve new synergies and improve problem-solving capabilities in AI. We predict that it will reduce time, costs and the number of human errors, and make expertise available to more people who want to serve autistic children and their families.
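
The prototype itself is described as Java/MySQL, but no code is given; purely to illustrate the CBR retrieval step that follows the CM analysis, here is a small Python sketch using hypothetical behavioural features and a simple nearest-neighbour similarity.

```python
# Minimal sketch of the CBR retrieval step (assumption: the CM module
# outputs a numeric feature vector per child; feature names are hypothetical).
case_base = [
    {"features": {"eye_contact": 1, "speech_delay": 4, "repetitive": 5}, "guidance": "plan A"},
    {"features": {"eye_contact": 4, "speech_delay": 1, "repetitive": 2}, "guidance": "plan B"},
    {"features": {"eye_contact": 2, "speech_delay": 5, "repetitive": 4}, "guidance": "plan C"},
]

def similarity(a, b):
    """Inverse Manhattan distance over shared features (simple global similarity)."""
    dist = sum(abs(a[k] - b[k]) for k in a)
    return 1.0 / (1.0 + dist)

def retrieve(query):
    """Return the stored case most similar to the new (CM-analyzed) case."""
    return max(case_base, key=lambda c: similarity(query, c["features"]))

new_case = {"eye_contact": 2, "speech_delay": 4, "repetitive": 5}
best = retrieve(new_case)
print(best["guidance"])   # reuse step: adapt the retrieved solution
```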

Keywords: Autism, Cognitive Maps (CM), Case Based Reasoning technique (CBR).

129 Fe3O4 and Fe3O4@Au Nanoparticles: Synthesis and Functionalisation for Biomolecular Attachment

Authors: Hendriëtte van der Walt, Lesley Chown, Richard Harris, Ndabenhle Sosibo, Robert Tshikhudo

Abstract:

The use of magnetic and magnetic/gold core/shell nanoparticles in biotechnology or medicine has shown good promise due to their hybrid nature, which possesses superior magnetic and optical properties. Some of these potential applications include hyperthermia treatment, bio-separations, diagnostics, drug delivery and toxin removal. Synthesis refinement to control geometric and magnetic/optical properties, and finding functional surfactants for biomolecular attachment, are requirements to meet application specifics. Various high-temperature preparative methods were used for the synthesis of iron oxide and gold-coated iron oxide nanoparticles. Different surface functionalities, such as 11-aminoundecanoic and 11-mercaptoundecanoic acid, were introduced on the surface of the particles to facilitate further attachment of biomolecular functionality and drug-like molecules. Nanoparticle thermal stability, composition, state of aggregation, size and morphology were investigated, and the results from techniques such as Fourier transform infrared spectroscopy (FT-IR), ultraviolet-visible spectroscopy (UV-vis), transmission electron microscopy (TEM) and thermal analysis are discussed.

Keywords: Core/shell, Iron oxide, Gold coating, Nanoparticles.

128 Context, Challenges, Constraints and Strategies of Non-Profit Organisations in Responding to the Needs of Asylum Seekers and Refugees in Cape Town, South Africa

Authors: C. O’Brien, Chloe Reiss

Abstract:

While South Africa has been the chosen host country for over 1.2 million asylum seekers/refugees, it has, at the same time, been struggling to address the needs of its own people, who are still trapped in poverty with little prospect of employment. This limited exploratory, qualitative study was undertaken in Cape Town with a purposive sample of 21 key personnel from various NPOs providing a service to asylum seekers/refugees. Individual in-depth, face-to-face interviews were carried out, and the main findings were: some of the officials at the Department of Home Affairs, health personnel, landlords, school principals, employers, bank officials and police officers were prejudicial in their practices towards asylum seekers/refugees. The major constraints experienced by NPOs in this study were linked to a lack of funding and minimal government support, a strained relationship with the Department of Home Affairs, and difficulties in accessing refugees. Finally, the strategies adopted by these NPOs included networking with other service providers, engaging in advocacy, raising community awareness and liaising with government. Thus, more focused intervention strategies are needed to build social cohesion, address the prejudices which fuel xenophobic attacks, and raise awareness of and educate various sectors about refugee rights. Given this burgeoning global problem, social work education and training should include curriculum content on migrant issues. Furthermore, larger studies using mixed methodology approaches would yield more nuanced data and provide for more strategic interventions.

Keywords: Refugees and asylum seekers, non-profit organisations, refugee challenges, constraints of service delivery.

127 A New Approach to Design an Efficient CIC Decimator Using Signed Digit Arithmetic

Authors: Vishal Awasthi, Krishna Raj

Abstract:

Any digital processing performed on a signal with a larger Nyquist interval requires more computation than signal processing performed on a smaller Nyquist interval. Sampling rate alteration generates unwanted effects in the system, such as spectral aliasing and spectral imaging, during signal processing. A multirate, multistage implementation of a digital filter can yield significant computational savings compared to a single-rate filter designed for sample rate conversion. In this paper, we present an efficient cascaded integrator-comb (CIC) decimation filter that performs fast downsampling using a signed digit adder algorithm, with compensation of the frequency droop that arises due to the aliasing effect during the decimation process. The proposed compensated CIC decimation filter structure with a hybrid signed digit (HSD) fast adder provides improved performance: downsampling speed is improved by 65.15% compared to a ripple carry adder (RCA), and area and power are reduced by 57.5% and 0.01%, respectively, compared to signed digit (SD) adder algorithms.
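
The signed-digit adders and the droop-compensation filter of the proposed design are not reproduced here; the sketch below only shows the underlying N-stage CIC decimation in plain Python (integrators at the input rate, decimation by R, combs at the output rate), with illustrative parameters.

```python
def cic_decimate(x, R, N=3, M=1):
    """Basic N-stage CIC decimator: integrate at the input rate,
    downsample by R, then comb (differentiate) at the output rate.
    Signed-digit adders and droop compensation from the paper are omitted."""
    # N cascaded integrators at the high rate
    for _ in range(N):
        acc, y = 0, []
        for v in x:
            acc += v
            y.append(acc)
        x = y
    # decimation by R
    x = x[::R]
    # N cascaded combs (differential delay M) at the low rate
    for _ in range(N):
        x = [v - (x[i - M] if i >= M else 0) for i, v in enumerate(x)]
    return x

# Example: the DC gain of the filter is (R*M)**N, so a constant input of 1
# settles to 64 for R=4, M=1, N=3.
print(cic_decimate([1] * 64, R=4)[-1])   # -> 64
```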

Keywords: Sampling rate conversion, Multirate Filtering, Compensation Theory, Decimation filter, CIC filter, Redundant signed digit arithmetic, Fast adders.

126 Daylightophil Approach towards High-Performance Architecture for Hybrid-Optimization of Visual Comfort and Daylight Factor in BSk

Authors: Mohammadjavad Mahdavinejad, Hadi Yazdi

Abstract:

Most of the influence we receive from the world is shaped through visual form; thus, light is an inseparable element in human life. The use of daylight in visual perception and environment readability is an important issue for users. With regard to the hazards of greenhouse gas emissions from fossil fuels, and in line with attitudes towards the reduction of energy consumption, the correct use of daylight results in lower levels of energy consumed by artificial lighting, heating and cooling systems. Windows are usually the starting points for analysis and simulations to achieve visual comfort and energy optimization; therefore, attention should be paid to the orientation of buildings to minimize electrical energy and maximize the use of daylight. In this paper, using the DesignBuilder software, the effect of the orientation of an 18 m² (3 m × 6 m) room with a 3 m height in the city of Tehran has been investigated, considering the design constraint limitations. In these simulations, the orientation of the building has been changed in one-degree steps, and the window is located on the smaller face (3 m × 3 m) of the building with an 80% window-to-wall ratio. The results indicate that the orientation of a building has a great deal to do with energy efficiency in meeting high-performance architecture and planning goals and objectives.

Keywords: Daylight, window, orientation, energy consumption, design builder.

125 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties, such as irregularity and poor locality, that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.

Keywords: Graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization.

124 Modular Hybrid Robots for Safe Human-Robot Interaction

Authors: J. Radojicic, D. Surdilovic, G. Schreck

Abstract:

The paper considers a novel modular and intrinsically safe redundant robotic system with biologically inspired actuators (pneumatic artificial muscles and rubber bellows actuators). Similarly to biological systems, the stiffness of the internal parallel modules, representing 2-DOF joints in the serial robotic chains, is controlled by co-activation of opposing redundant actuator groups in the null space of the module Jacobian, without influencing the actual robot position. The decoupled position/stiffness control allows the realization of variable joint stiffness according to different force-displacement relationships. The variable joint stiffness, as well as the limited force capability of the pneumatic muscles/bellows, ensures internal system safety, which is crucial for the development of human-friendly robots intended for human-robot collaboration. The initial experiments with the system prototype demonstrate the capability of independently and simultaneously controlling both joint (Cartesian) motion and joint stiffness. The paper also presents possible industrial applications of snake-like robots built using the new modules.
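
As a purely illustrative sketch of the null-space co-activation idea (the module's actual Jacobian and actuator forces are not published in the abstract), the NumPy snippet below adds a force change lying in the null space of a made-up moment-arm matrix: the joint torques, and hence the position, stay the same while the co-activation level, and thus the stiffness, changes.

```python
import numpy as np

# Illustrative moment-arm matrix mapping 4 redundant actuator forces to
# the 2 joint torques of one parallel module (values are made up).
A = np.array([[ 0.03, -0.03,  0.02, -0.02],
              [ 0.02,  0.02, -0.03, -0.03]])

f = np.array([120.0, 100.0, 110.0, 90.0])   # current actuator forces (N)

# Basis of the null space of A via SVD: force changes in this subspace
# do not alter the joint torques, only the co-activation level (stiffness).
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[len(s):]                    # rows spanning null(A)

delta = 30.0 * null_basis[0]                # co-activation step
f_stiff = f + delta

print(np.allclose(A @ f, A @ f_stiff))      # True: same joint torques
print(f, f_stiff)                           # different force levels -> different stiffness
```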

Keywords: bellows actuator, human-robot interaction, hyper redundant robot, pneumatic muscle.

123 Detecting Interactions between Behavioral Requirements with OWL and SWRL

Authors: Haibo Hu, Dan Yang, Chunxiao Ye, Chunlei Fu, Ren Li

Abstract:

High quality requirements analysis is one of the most crucial activities to ensure the success of a software project, so requirements verification for software systems is becoming more and more important in Requirements Engineering (RE) and is one of the most helpful strategies for improving the quality of software systems. Related work shows that requirements elicitation and analysis can be facilitated by ontological approaches and semantic web technologies. In this paper, we propose a hybrid method which aims to verify requirements with structural and formal semantics to detect interactions. The proposed method is twofold: one part models requirements with the semantic web language OWL to construct a semantic context; the other is a set of interaction detection rules which are derived from scenario-based analysis and represented with the Semantic Web Rule Language (SWRL). The SWRL-based rules work with rule engines like Jess to reason in the semantic context for requirements and thus detect interactions. The benefits of the proposed method lie in three aspects: the method (i) provides systematic steps for modeling requirements with an ontological approach, (ii) offers a synergy of requirements elicitation and domain engineering for knowledge sharing, and (iii) provides rules that can systematically assist in requirements interaction detection.

Keywords: Requirements Engineering, Semantic Web, OWL, Requirements Interaction Detection, SWRL.

122 Development of UiTM Robotic Prosthetic Hand

Authors: M. Amlie A. Kasim, Ahsana Aqilah, Ahmed Jaffar, Cheng Yee Low, Roseleena Jaafar, M. Saiful Bahari, Armansyah

Abstract:

The study of human hand morphology reveals that developing an artificial hand with the capabilities of the human hand is an extremely challenging task. This paper presents the development of a robotic prosthetic hand focusing on the improvement of a tendon-driven mechanism towards a biomimetic prosthetic hand. The design of this prosthetic hand is geared towards achieving a high level of dexterity and anthropomorphism by means of a new hybrid mechanism that integrates a miniature motor-driven actuation mechanism, a Shape Memory Alloy actuated mechanism and a passive mechanical linkage. The synergy of these actuators enables the flexion-extension movement at each of the finger joints within limited size, shape and weight constraints. Tactile sensors are integrated on the finger tips and the finger phalanges. The prosthetic hand is developed with an exact size ratio that mimics a biological hand. Its behavior resembles the human counterpart in terms of working envelope, speed and torque, and thus reproduces both the key physical features and the grasping functionality of an adult hand.

Keywords: Prosthetic hand, Biomimetic actuation, Shape Memory Alloy, Tactile sensing.

121 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography

Authors: Nicole M. Martino

Abstract:

Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck, however this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck – the location where damage originates from.  In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a total understanding of the deck’s deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages and their condition varied from brand new, to in need of replacement. The goals of this work were to first verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see if combining the results of both methods would provide a higher confidence than if the condition assessment was completed using only one method. The results from each method were presented as plan view color contour plots. The results from one of the decks assessed as a part of this research, including these plan view plots, are presented in this paper. Furthermore, in order to answer the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.

Keywords: Bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks.

120 Distributional Semantics Approach to Thai Word Sense Disambiguation

Authors: Sunee Pongpinigpinyo, Wanchai Rivepiboon

Abstract:

Word sense disambiguation is one of the most important open problems in natural language processing applications such as information retrieval and machine translation. Many strategies can be employed to resolve word ambiguity with a reasonable degree of accuracy; these strategies are knowledge-based, corpus-based, and hybrid. This paper focuses on the corpus-based strategy, employing an unsupervised learning method for disambiguation. We report our investigation of Latent Semantic Indexing (LSI), an information retrieval and unsupervised learning technique, applied to the task of Thai noun and verb word sense disambiguation. Latent Semantic Indexing has been shown to be efficient and effective for information retrieval. For the purposes of this research, we report experiments on two Thai polysemous words, namely /hua4/ and /kep1/, which are used as representatives of Thai nouns and verbs, respectively. The results of these experiments demonstrate the effectiveness and indicate the potential of applying vector-based distributional information measures to semantic disambiguation.
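
A minimal NumPy sketch of the LSI step is given below: a toy term-by-context count matrix is reduced with a truncated SVD, and an unseen context of the ambiguous word is assigned to the nearer sense centroid by cosine similarity. The toy counts and the two-sense reading are illustrative assumptions, not data from the Thai corpus.

```python
import numpy as np

# Toy term-by-context counts (rows = co-occurring terms, columns = contexts
# of the ambiguous word); values are illustrative, not from the Thai corpus.
terms = ["hair", "face", "leader", "chief", "body"]
X = np.array([[3, 2, 0, 0, 2],
              [2, 3, 0, 1, 1],
              [0, 0, 3, 2, 0],
              [0, 1, 2, 3, 0],
              [1, 0, 0, 0, 2]], dtype=float)

# LSI step: truncated SVD keeps k latent dimensions.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
contexts = (np.diag(s[:k]) @ Vt[:k]).T        # each context as a k-dim vector

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Contexts 0-1 exemplify sense A, contexts 2-3 sense B; context 4 is new.
sense_a = contexts[:2].mean(axis=0)
sense_b = contexts[2:4].mean(axis=0)
new = contexts[4]
print("sense A" if cos(new, sense_a) > cos(new, sense_b) else "sense B")
```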

Keywords: Distributional semantics, Latent Semantic Indexing, natural language processing, Polysemous words, unsupervised learning, Word Sense Disambiguation.

119 Robot Map Building from Sonar and Laser Information using DSmT with Discounting Theory

Authors: Xinde Li, Xinhan Huang, Min Wang

Abstract:

In this paper, a new method of information fusion, DSmT (Dezert-Smarandache Theory), is introduced and applied to managing and dealing with the uncertain information arising in robot map building. Here we build a grid map from sonar sensors and a laser range finder (LRF). The uncertainty mainly comes from the sonar sensors and the LRF. To address the uncertainty in a static environment, we propose the classic DSm (DSmC) model for the sonar sensors and the laser range finder, and construct the general basic belief assignment function (gbbaf) for each. Generally speaking, evidence sources are unreliable in a physical system, so discounting theory must be considered before applying DSmT. Finally, a Pioneer II mobile robot serves as the simulation and experimental platform. We build a 3D grid map of the belief layout and then compare the effect of building the map using DSmT and DST. The simulation experiment shows that DSmT is very successful and valid, especially in dealing with highly conflicting information. In short, this study not only provides a new method for building maps in a static environment, but also supplies a theoretical foundation for further applying hybrid DSmT (DSmH) to dynamic unknown environments and cooperative multi-robot map building.
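
The gbbaf construction and the DSmT combination used in the paper are not reproduced here; as a point of reference for the DST side of the comparison, the sketch below combines two discounted mass functions for a single grid cell (frame {occupied, empty}) with Dempster's rule. The sensor masses and discount factors are illustrative.

```python
from itertools import product

# Frame of discernment for one grid cell.
O, E = frozenset({"occupied"}), frozenset({"empty"})
THETA = O | E

def discount(m, alpha):
    """Shafer discounting: scale masses by alpha, move the remainder to ignorance."""
    d = {A: alpha * v for A, v in m.items()}
    d[THETA] = d.get(THETA, 0.0) + (1.0 - alpha)
    return d

def dempster(m1, m2):
    """Dempster's rule of combination (DST) for two mass functions."""
    fused, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            fused[inter] = fused.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {A: v / (1.0 - conflict) for A, v in fused.items()}

# Illustrative masses from a sonar reading and an LRF reading for one cell.
m_sonar = discount({O: 0.6, E: 0.3, THETA: 0.1}, alpha=0.8)
m_lrf   = discount({O: 0.2, E: 0.7, THETA: 0.1}, alpha=0.95)
print(dempster(m_sonar, m_lrf))
```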

Keywords: Map building, DSmT, DST, uncertainty, information fusion.

118 Multi Switched Split Vector Quantizer

Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha

Abstract:

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), which is a hybrid of two product-code vector quantization techniques, namely the multi-stage vector quantization technique and the switched split vector quantization technique. Multi Switched Split Vector Quantization quantizes the linear predictive coefficients in terms of line spectral frequencies. The results show that Multi Switched Split Vector Quantization provides a better trade-off between bit rate and spectral distortion performance, computational complexity and memory requirements when compared to switched split vector quantization, multi-stage vector quantization, and split vector quantization techniques. By employing the switching technique at each stage of the vector quantizer, the spectral distortion, computational complexity and memory requirements were greatly reduced. Spectral distortion was measured in dB, computational complexity was measured in floating point operations (flops), and memory requirements were measured in floats.
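
As a sketch of the split-VQ building block underneath the proposed quantizer, the NumPy snippet below splits a 10-dimensional LSF vector into 3+3+4 sub-vectors and quantizes each with its own nearest-neighbour codebook; the codebooks are random placeholders, and the multistage and switching layers of the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_vq(lsf, codebooks, splits):
    """Quantize each sub-vector of `lsf` with its own codebook (split VQ).
    Returns the chosen indices and the reconstructed vector."""
    indices, rec = [], []
    start = 0
    for size, cb in zip(splits, codebooks):
        sub = lsf[start:start + size]
        idx = np.argmin(np.sum((cb - sub) ** 2, axis=1))   # nearest codevector
        indices.append(idx)
        rec.append(cb[idx])
        start += size
    return indices, np.concatenate(rec)

# 10-dim LSF vector split as 3 + 3 + 4; random placeholder codebooks of
# 16 entries each (4 bits per part -> 12 bits total).
splits = (3, 3, 4)
codebooks = [rng.random((16, s)) for s in splits]
lsf = rng.random(10)

idx, lsf_hat = split_vq(lsf, codebooks, splits)
print(idx, np.round(lsf_hat - lsf, 3))   # indices and quantization error
```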

Keywords: Unconstrained vector quantization, Linear Predictive Coding, Split vector quantization, Multi stage vector quantization, Switched Split vector quantization, Line Spectral Frequencies.

117 Nonlinear Fuzzy Tracking Real-time-based Control of Drying Parameters

Authors: Marco Soares dos Santos, Camila Nicola Boeri, Jorge Augusto Ferreira, Fernando Neto da Silva

Abstract:

The highly nonlinear characteristics of drying processes have prompted researchers to seek new nonlinear control solutions. However, the relation between implementation complexity, on-line processing complexity, reliability of the control structure and controller performance is not well established. The present paper proposes high-performance nonlinear fuzzy controllers for the real-time operation of a drying machine, developed with a consistent balance among those issues. A PCI-6025E data acquisition device from National Instruments® was used, and the control system was fully designed in the MATLAB®/SIMULINK language. The drying parameters, namely relative humidity and temperature, were controlled through MIMO hybrid bang-bang+PI (BPI) and four-dimensional fuzzy logic (FLC) real-time controllers to perform drying tests on biological materials. The performance of the drying strategies was compared through several criteria, which are reported without controller retuning. The performance analysis showed much better performance for the FLC than for the BPI controller: the absolute errors were lower than 8.85% for the fuzzy logic controller, about three times lower than the experimental results with BPI control.

Keywords: Drying control, Fuzzy logic control, Intelligent temperature-humidity control.

116 A High Time Resolution Digital Pulse Width Modulator Based on Field Programmable Gate Array’s Phase Locked Loop Megafunction

Authors: Jun Wang, Tingcun Wei

Abstract:

The digital pulse width modulator (DPWM) is the crucial building block for a digitally-controlled DC-DC switching converter; it converts the digital duty ratio signal into its analog counterpart to switch the power MOSFET transistors on or off. With the increase of the switching frequency of digitally-controlled DC-DC converters, a DPWM with higher time resolution is required. In this paper, a 15-bit DPWM with a three-level hybrid structure is presented; the first level is composed of a 7-bit counter and a comparator, the second is a 5-bit delay line, and the third is a 3-bit digital dither. The presented DPWM is designed and implemented using the PLL megafunction of an FPGA (Field Programmable Gate Array), and the required clock frequency is 128 times the switching frequency. The simulation results show that, for a switching frequency of 2 MHz, a DPWM with a time resolution of 15 ps is achieved using a maximum clock frequency of 256 MHz. The DPWM designed in this paper is especially useful for high-frequency digitally-controlled DC-DC switching converters.
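
The following short calculation shows one way the three levels may combine to the reported 15 ps figure: the 256 MHz clock sets the counter step, the 5-bit delay line subdivides one clock period into 32 phases, and the 3-bit dither improves the average resolution by a further factor of 8. The exact mechanism is an assumption based on the structure described.

```python
f_sw = 2e6                         # switching frequency (Hz)
f_clk = 128 * f_sw                 # clock = 128 x switching frequency -> 256 MHz

counter_step = 1 / f_clk           # 7-bit counter resolution: ~3.906 ns
delay_step = counter_step / 2**5   # 5-bit delay line subdivides one clock: ~122 ps
dither_step = delay_step / 2**3    # 3-bit dither averages over 8 cycles: ~15.3 ps

print(f"counter step: {counter_step * 1e9:.3f} ns")
print(f"delay-line step: {delay_step * 1e12:.1f} ps")
print(f"effective (dithered) step: {dither_step * 1e12:.1f} ps")
```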

Keywords: DPWM, PLL megafunction, FPGA, time resolution, digitally-controlled DC-DC switching converter.

115 Hybrid of Hunting Search and Modified Simplex Methods for Grease Position Parameter Design Optimisation

Authors: P. Luangpaiboon, S. Boonhao

Abstract:

This study proposes a multi-response surface optimization problem (MRSOP) for determining the proper choices of a process parameter design (PPD) decision problem in the noisy environment of a grease position process in the electronics industry. The proposed model attempts to maximize dual process responses, the mean parts between failures on the left and right processes. The conventional modified simplex method and its hybridization with the stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performances. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and its advantages are discussed. Numerical results demonstrate that the hybridization is superior to the conventional method: in this study, the mean parts between failures on the left and right lines improve by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
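
The grease-position response surfaces and the hunting-search operator are not published in the abstract; the SciPy sketch below therefore only illustrates the core idea, collapsing two synthetic responses into a single desirability-style objective and refining it with a simplex (Nelder-Mead) search.

```python
import numpy as np
from scipy.optimize import minimize

def mtbf_left(x):
    """Synthetic stand-in for the left-line mean-parts-between-failure surface."""
    return 100 - (x[0] - 2.0) ** 2 - 0.5 * (x[1] - 3.0) ** 2

def mtbf_right(x):
    """Synthetic stand-in for the right-line response surface."""
    return 95 - 0.8 * (x[0] - 2.5) ** 2 - (x[1] - 2.5) ** 2

def objective(x):
    # Larger-the-better geometric-mean desirability of both responses,
    # negated because scipy minimizes.
    d1 = max(mtbf_left(x), 1e-9) / 100
    d2 = max(mtbf_right(x), 1e-9) / 95
    return -np.sqrt(d1 * d2)

res = minimize(objective, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x, -res.fun)    # proper parameter levels and combined desirability
```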

Keywords: Grease Position Process, Multi-response Surfaces, Modified Simplex Method, Hunting Search Method, Desirability Function Approach.

114 Performance Evaluation of Hybrid Intelligent Controllers in Load Frequency Control of Multi Area Interconnected Power Systems

Authors: Surya Prakash, Sunil Kumar Sinha

Abstract:

This paper deals with the application of the artificial neural network (ANN) and fuzzy-based Adaptive Neuro-Fuzzy Inference System (ANFIS) approaches to Load Frequency Control (LFC) of a multi-area (unequal areas) hydro-thermal interconnected power system. The proposed ANFIS controller combines the advantages of a fuzzy controller with the quick response and adaptability of an ANN. Areas 1 and 2 consist of thermal reheat power plants, whereas areas 3 and 4 consist of hydro power plants with electric governors. Performance evaluation is carried out using intelligent controllers such as ANFIS, ANN and fuzzy controllers, as well as conventional PI and PID control approaches. To enhance the performance of the intelligent and conventional controllers, a sliding surface is included. The performances of the controllers are simulated using the MATLAB/SIMULINK package. A comparison of the ANFIS, ANN, fuzzy, PI and PID based approaches shows the superiority of the proposed ANFIS over the ANN, fuzzy, PI and PID controllers for a 1% step load variation.

Keywords: Load Frequency Control (LFC), ANFIS, ANN & Fuzzy, PI, PID Controllers, Area Control Error (ACE), Tie-line, MATLAB / SIMULINK.

113 Sustainable Energy Production with Closed-Loop Methods: Evaluating the Influence of Power Plant Age on Production Efficiency and Environmental Impact

Authors: Bujar Ismaili, Bahti Ismajli, Venhar Ismaili, Skender Ramadani

Abstract:

In Kosovo, the problem with the electricity supply is huge, and it does not meet the demands of consumers. Most of the energy is produced by older thermal power plants, which are regarded as major environmental polluters. Our experiment is based on the production of electricity using a closed-loop method that avoids environmental pollution by using, as fuel, waste that would otherwise pollute the environment. The experiment was carried out in the village of Godanc, municipality of Shtime, Kosovo. In the experiment, a production line providing electricity and central heating at the same time was designed. The results are the benefit of electricity as well as the release of heat for heating, with minimal expense and with the release of 0% gases into the atmosphere. During this experiment, coal, plastic, waste from wood processing, and agricultural wastes were used as raw materials. The method utilized in the experiment allows for the release of gas through pipes and filters during the top-to-bottom combustion of the raw material in the boiler, followed by filtration of the gas through waste from wood processing (sawdust). During this process, the final product, gas, is obtained. This gas passes through the carburetor, enabling the combustion process that puts the internal combustion engine and the generator into operation and produces electricity without releasing gases into the atmosphere. The results show that the system provides energy stability without environmental pollution from toxic substances and waste, as well as low production costs. The final results show that, in the case of coal fuel, we obtained more electricity and a higher heat release, followed by plastic waste, which also gave good results. The results obtained during these experiments prove that the current problems of lack of electricity and heating can be addressed at a lower cost while maintaining a clean environment and proper waste management.

Keywords: Energy, heating, atmosphere, waste management, gasification.

112 Reducing Variation of Dyeing Process in Textile Manufacturing Industry

Authors: M. Zeydan, G. Toğa

Abstract:

This study deals with a multi-criteria optimization problem which has been transformed into a single-objective optimization problem using a Response Surface Methodology (RSM), Artificial Neural Network (ANN) and Grey Relational Analysis (GRA) approach. Grey-RSM and Grey-ANN are hybrid techniques which can be used for solving multi-criteria optimization problems. There are two main purposes of this research: 1. to determine optimum and robust fiber dyeing process conditions by using RSM and ANN based on GRA; 2. to obtain the most suitable model by comparing models developed with the different methodologies. The design variables for the fiber dyeing process in textiles are temperature, time, softener, anti-static, material quantity, pH, retarder, and dispergator. The quality characteristics to be evaluated are the nominal color consistency of the fiber, the maximum strength of the fiber, and the minimum color of the dyeing solution. The GRA-RSM model with exact level values, the GRA-RSM model with interval level values and the GRA-ANN model were compared with each other based on the GRA output value and the MSE (Mean Square Error) of the outputs. As a result, the GRA-ANN model with interval values seems to be suitable for reducing the variation of the dyeing process in terms of the GRA output value of the model.
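
A NumPy sketch of the grey relational grade that serves as the single objective is given below: each response is normalized according to its goal, deviation sequences are formed, and grey relational coefficients with distinguishing coefficient 0.5 are averaged per run. The response data are made up for illustration.

```python
import numpy as np

# Made-up responses for five runs: color consistency (nominal 1.0 is best),
# fiber strength (larger is better), residual dye-bath color (smaller is better).
y = np.array([[0.96, 31.2, 0.42],
              [1.03, 29.8, 0.35],
              [0.99, 33.1, 0.51],
              [1.07, 30.5, 0.28],
              [1.01, 32.4, 0.33]])

norm = np.empty_like(y)
norm[:, 0] = 1 - np.abs(y[:, 0] - 1.0) / np.abs(y[:, 0] - 1.0).max()   # nominal-the-best
norm[:, 1] = (y[:, 1] - y[:, 1].min()) / np.ptp(y[:, 1])               # larger-the-better
norm[:, 2] = (y[:, 2].max() - y[:, 2]) / np.ptp(y[:, 2])               # smaller-the-better

delta = 1 - norm                       # deviation from the ideal sequence
zeta = 0.5                             # distinguishing coefficient
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = coeff.mean(axis=1)             # grey relational grade per run
print(np.round(grade, 3), "best run:", int(grade.argmax()))
```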

Keywords: Artificial Neural Network, Grey Relational Analysis, Optimization, Response Surface Methodology

111 Advanced Hybrid Particle Swarm Optimization for Congestion and Power Loss Reduction in Distribution Networks with High Distributed Generation Penetration through Network Reconfiguration

Authors: C. Iraklis, G. Evmiridis, A. Iraklis

Abstract:

Renewable energy sources and distributed power generation units already play an important role in electrical power generation. A mixture of different technologies penetrating the electrical grid adds complexity to the management of distribution networks. High penetration of distributed power generation units creates node over-voltages, large power losses, unreliable power management, reverse power flow and congestion. This paper presents an optimization algorithm capable of reducing congestion and power losses, both described through a weighted-sum function. Two factors that describe congestion are proposed. An upgraded selective particle swarm optimization algorithm (SPSO) is used as a solution tool focusing on the technique of network reconfiguration. The upgraded SPSO algorithm is achieved with the addition of a heuristic algorithm specializing in the reduction of power losses, with several scenarios being tested. Results show significant improvement in the minimization of losses and congestion while achieving very small calculation times.

Keywords: Congestion, distribution networks, loss reduction, particle swarm optimization, smart grid.

110 Collective Redress in Consumer Protection in South East Europe: Cross-National Comparisons, Issues of Commonality and Difference

Authors: Veronika Efremova

Abstract:

In recent decades, there have been significant developments in the European Union in the field of collective consumer redress. The South East European (SEE) countries covered by this paper, in line with their EU accession priorities and duties under the Stabilisation and Association Agreements, have to harmonize their national laws with the relevant EU acquis for consumer protection (Chapter 28: Consumer and Health Protection). In these countries, only minimal compliance has been achieved. SEE countries have introduced rudimentary collective redress mechanisms, with modest enforcement of collective redress and case law. This paper is based on comprehensive interdisciplinary research conducted for SEE countries on common principles for injunctive and compensatory collective redress mechanisms, emphasizing cross-national comparisons and underlining issues of commonality and difference, with the aim of developing recommendations for adequate enforcement of collective redress. SEE countries are characterized by a sectoral approach to regulating collective redress, contrary to the majority of EU Member States, which have adopted a horizontal approach to collective redress. In most SEE countries, the laws recognize only injunctive, not compensatory, collective redress in consumer protection. All stakeholders responsible for the implementation of collective redress in SEE countries lack information and awareness of collective redress mechanisms and the way they function in practice. Therefore, specific actions are needed in these countries to make the whole system of collective redress for consumer protection operational and efficient. Taking into consideration the various designated stakeholders in collective redress in each SEE country, their mutual coordination and cooperation are needed in order to develop consumer protection systems and policies. By putting the national collective redress mechanisms into practice, effective access to justice for all consumers and the principle of the rule of law will be secured, and appropriate procedural guarantees to avoid abusive litigation will be ensured.

Keywords: Collective redress mechanism, consumer protection, commonality and difference, South East Europe.

109 Taguchi Robust Design for Optimal Setting of Process Wastes Parameters in an Automotive Parts Manufacturing Company

Authors: Charles Chikwendu Okpala, Christopher Chukwutoo Ihueze

Abstract:

As a technique that reduces variation in a product by lessening the sensitivity of the design to sources of variation, rather than by controlling those sources, Taguchi Robust Design entails the design of ideal goods by developing a product that has minimal variance in its characteristics and also meets the desired target performance. This paper examines the concept of this manufacturing approach and its application to the brake pad product of an automotive parts manufacturing company. Although the firm claimed that defects, excess inventory, and over-production were the only wastes that grossly affect its productivity and profitability, a careful study and analysis of its manufacturing processes with the application of the Single Minute Exchange of Dies (SMED) tool showed that the waste of waiting is a fourth waste that bedevils the firm. The selection of the Taguchi L9 orthogonal array, which is based on the four parameters and three levels of variation for each parameter, revealed that, with a range of 2.17, waiting is the major waste that the company must reduce in order to remain viable. Also, to enhance the company's throughput and profitability, the wastes of over-production, excess inventory, and defects, with ranges of 2.01, 1.46, and 0.82, ranking second, third, and fourth respectively, must also be reduced to the barest minimum. After proposing -33.84 as the highest optimum signal-to-noise ratio to be maintained for the waste of waiting, the paper advocates the adoption of all the tools and techniques of the Lean Production System (LPS) and Continuous Improvement (CI), and concludes by recommending SMED in order to drastically reduce setup time, which leads to unnecessary waiting.
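
The raw waiting-time data behind the reported figures are confidential, so the sketch below only shows the smaller-the-better signal-to-noise ratio Taguchi uses, S/N = -10*log10(mean(y^2)), on made-up observations, together with the range-of-level-means calculation used to rank the wastes (the level means are illustrative; only the resulting ranges match those reported).

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi smaller-the-better signal-to-noise ratio in dB."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y ** 2))

# Made-up waiting times (minutes) observed at three replicates of one
# L9 trial; the paper's raw data are confidential and not reproduced.
waiting = [46.0, 51.5, 49.0]
print(f"S/N = {sn_smaller_is_better(waiting):.2f} dB")

# Ranking wastes by the range of mean S/N across parameter levels
# (level means are illustrative, not the paper's).
level_means = {"waiting": [-35.1, -33.9, -32.93],
               "over-production": [-30.2, -29.1, -28.19]}
for waste, means in level_means.items():
    print(waste, "range =", round(max(means) - min(means), 2))
```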

Keywords: Taguchi Robust Design, signal to noise ratio, Single Minute Exchange of Dies, lean production system, waste.
