Search results for: principal components
1270 Use of Time-Dependent Effects for Mixing and Separation of Two-Phase Flows
Authors: N. B. Fedosenko, A. A. Iatcenko, S. A. Levanov
Abstract:
The paper shows how two-phase flows can be managed by exploiting unsteady effects. In the first case, we consider the conditions under which fragmentation of the interface between the two components intensifies mixing. The problem is solved when the temporal and linear scales are too small for a developed mixing layer to appear. We show that conditions exist for the unsteady flow velocity at the channel surface that lead to the creation and fragmentation of vortices at Reynolds numbers (Re) of order unity. We also show that Re is not a similarity criterion for this type of flow, but that a criterion can be introduced that depends on both Re and the vortex-splitting frequency. A notable feature of this situation is that the streamlines behave stably, while the interface between the components satisfies all the properties of unstable flows. The second problem concerns the behavior of solid impurities in an extensive system of channels. An unsteady periodic flow modeling breathing is simulated, and the behavior of the particles along their trajectories is considered. It is shown that, depending on their mass and diameter, particles can collect in a caustic on the channel walls, stop in a certain place, or fly back. The frequency distribution of particle velocity is also of interest: by choosing the behavior of the velocity field of the carrier gas, one can affect the trajectories of individual particles, including forcing them to fly back.
Keywords: Two-phase, mixing, separating, flow control
1269 A Multi-Level Web-Based Parallel Processing System: A Hierarchical Volunteer Computing Approach
Authors: Abdelrahman Ahmed Mohamed Osman
Abstract:
Over the past few years, a number of efforts have been made to build parallel processing systems that utilize the idle power of the LANs and PCs available in many homes and corporations. The main advantage of these approaches is that they provide cheap parallel processing environments for those who cannot afford the expense of supercomputers and parallel processing hardware. However, most of the solutions provided are not very flexible in their use of available resources and are difficult to install and set up. In this paper, a multi-level web-based parallel processing system (MWPS) is designed. MWPS is based on the idea of volunteer computing; it is very flexible and easy to set up and use. MWPS allows three types of subscribers: simple volunteers (single computers), super volunteers (full networks) and end users. All of these entities are coordinated transparently through a secure web site. Volunteer nodes provide the processing power needed by the system's end users. There is no limit on the number of volunteer nodes, so the system can grow indefinitely. Both volunteers and system users must register and subscribe; once they do, each entity is provided with the appropriate MWPS components, which are very easy to install. Super volunteer nodes receive special components that make it possible to delegate some of the load to their inner nodes, which may in turn delegate load to lower-level inner nodes, and so on. It is the responsibility of the parent super nodes to coordinate the delegation process and deliver the results back to the user. MWPS uses a simple behavior-based scheduler that takes into consideration the current load and previous behavior of processing nodes: nodes that fulfill their contracts within the expected time earn a high degree of trust, while nodes that fail to satisfy their contracts receive a lower degree of trust. MWPS is based on the .NET framework and provides the minimal level of security expected in distributed processing environments. Users and processing nodes are fully authenticated, and communications and messages between nodes are secured. The system has been implemented in C#. MWPS may be used by any group of people or companies to establish a parallel processing or grid environment.
Keywords: Volunteer computing, parallel processing, XML web services, .NET Remoting, tuplespace.
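As an illustration of the behavior-based scheduling described above, here is a minimal Python sketch of the trust idea: nodes that fulfill their contracts gain trust, nodes that fail lose it, and the scheduler prefers trusted, lightly loaded nodes. The class and field names, the update rule and the constant alpha are illustrative assumptions, not MWPS's actual .NET implementation.

```python
# Hypothetical sketch of a behavior-based trust scheduler; names and the
# update rule are assumptions for illustration, not MWPS's real code.
from dataclasses import dataclass

@dataclass
class VolunteerNode:
    name: str
    load: float = 0.0   # current load, 0..1
    trust: float = 0.5  # degree of trust, 0..1

    def report_contract(self, fulfilled: bool, alpha: float = 0.1) -> None:
        """Move trust toward 1 on success, toward 0 on failure."""
        target = 1.0 if fulfilled else 0.0
        self.trust += alpha * (target - self.trust)

def pick_node(nodes):
    """Prefer trusted, lightly loaded nodes, as the scheduler description suggests."""
    return max(nodes, key=lambda n: n.trust * (1.0 - n.load))

nodes = [VolunteerNode("a", load=0.2), VolunteerNode("b", load=0.7)]
nodes[0].report_contract(fulfilled=True)
print(pick_node(nodes).name)  # -> "a"
```

An exponential moving average like this is one simple way to make trust reflect recent behavior more strongly than old behavior.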
1268 Modeling and Analysis of DFIG Based Wind Power System Using Instantaneous Power Components
Authors: Jaimala Gambhir, Tilak Thakur, Puneet Chawla
Abstract:
As per statistical data, the Doubly-Fed Induction Generator (DFIG) based wind turbine with variable speed and variable pitch control is the most common wind turbine in the growing wind market. This machine is usually used in grid-connected wind energy conversion systems to satisfy grid code requirements such as grid stability, Fault Ride-Through (FRT), power quality improvement, grid synchronization and power control. These requirements are not fulfilled directly by the machine; rather, control strategies on both the stator and rotor sides, together with power electronic converters, are used to fulfill them. The grid side converter usually plays the major role in satisfying the grid code requirements, so an intensive study of both machine side and grid side converter control is necessary to improve the operating capacity of the wind turbine under critical situations. In this paper, the DFIG is modeled using power components as variables, and the performance of the DFIG system is analyzed under grid voltage fluctuations. The voltage fluctuations are created by intentionally lowering and raising the voltage values in the utility grid for simulation purposes, keeping in view different grid disturbances.
Keywords: DFIG, dynamic modeling, DPC, sag, swell, voltage fluctuations, FRT.
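The sketch below computes instantaneous active and reactive power from three-phase signals via the Clarke transform (Akagi's p-q theory), one common choice of instantaneous power components; the paper's exact formulation and its DPC control loops may differ.

```python
# A minimal sketch of instantaneous power components (p-q theory). The
# formulation is one standard option and may not match the paper exactly.
import numpy as np

t = np.linspace(0.0, 0.04, 1000)  # two 50 Hz cycles
w = 2 * np.pi * 50.0
abc = lambda A, phi: np.stack([A * np.cos(w * t + phi + k)
                               for k in (0.0, -2 * np.pi / 3, 2 * np.pi / 3)])

v_abc = abc(325.0, 0.0)   # grid voltages
i_abc = abc(10.0, -0.3)   # lagging currents

# Power-invariant Clarke transform, abc -> alpha/beta
C = np.sqrt(2 / 3) * np.array([[1.0, -0.5, -0.5],
                               [0.0, np.sqrt(3) / 2, -np.sqrt(3) / 2]])
v_ab, i_ab = C @ v_abc, C @ i_abc

p = v_ab[0] * i_ab[0] + v_ab[1] * i_ab[1]  # instantaneous real power
q = v_ab[1] * i_ab[0] - v_ab[0] * i_ab[1]  # instantaneous imaginary power
print(p.mean(), q.mean())  # both constant for a balanced sinusoidal system
```

For a balanced system both components are constant, which is what makes them convenient state variables; sags and swells show up as transients in p and q.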
1267 Variance Based Component Analysis for Texture Segmentation
Authors: Zeinab Ghasemi, S. Amirhassan Monadjemi, Abbas Vafaei
Abstract:
This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation towards defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of the eigenpictures, and classifies pixels as defective or normal. Whereas classic PCA uses a clusterer such as k-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that VBCA is 12.46% more accurate and 78.85% faster than classic PCA.
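A rough sketch of the VBCA pipeline as the abstract describes it: PCA over pixel patches, variance-based selection of eigenpictures, and outlier thresholding in place of k-means. The patch size, the variance cut-off and the mean-plus-k-sigma threshold rule are illustrative assumptions, not the authors' exact settings.

```python
# Hedged sketch of a VBCA-like pipeline on a synthetic image; parameters
# (patch size, variance cut-off, threshold rule) are assumptions.
import numpy as np

def vbca_like(image: np.ndarray, patch: int = 8, var_keep: float = 0.95, k: float = 2.0):
    h, w = image.shape
    # Collect non-overlapping patches as feature vectors
    ps = [image[i:i + patch, j:j + patch].ravel()
          for i in range(0, h - patch + 1, patch)
          for j in range(0, w - patch + 1, patch)]
    X = np.asarray(ps, dtype=float)
    Xc = X - X.mean(axis=0)
    # PCA via SVD; keep enough eigenpictures to explain `var_keep` of variance
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s**2 / np.sum(s**2)
    r = int(np.searchsorted(np.cumsum(var), var_keep)) + 1
    scores = Xc @ Vt[:r].T
    # Threshold score energy instead of clustering: outliers are "defective"
    energy = np.linalg.norm(scores, axis=1)
    return energy > energy.mean() + k * energy.std()

rng = np.random.default_rng(0)
img = rng.normal(size=(64, 64))
img[20:28, 20:28] += 4.0  # synthetic "defect"
print(vbca_like(img).sum(), "patches flagged as defective")
```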
1266 Two Scenarios for Ultra-Light Overhead Conveyor System in Logistics Applications
Authors: Batin Latif Aylak, Bernd Noche
Abstract:
Overhead conveyor systems are in use in many installations around the world, meeting the widest possible range of applications. They are particularly popular in the automotive industry, but are also used at post offices. Overhead conveyor systems must always be integrated into a logistical process, finding the cheapest material flow in order to guarantee precise and fast workflows. With their help, transport can take place without wasting floor space, without excess plant capacity, without lost or damaged products, erroneous deliveries or endless travel, and without wasting time. Ultra-light overhead conveyor systems are rope-based conveying systems with individually driven vehicles; the vehicles move automatically along the rope, supplied with energy and signals, and crossings are realized by switches. Ultra-light overhead conveyor systems provide an optimal material flow that produces profit and saves time. This article introduces two new ultra-light overhead conveyor designs for logistics and explains their components. Based on the components' technical characteristics, scenarios are created and visualized with CAD software. Assumptions are then made for the application area and the scenarios are visualized under those assumptions. These scenarios help logistics companies achieve lower development costs as well as quicker market maturity.
Keywords: Logistics, material flow, overhead conveyor.
1265 Evaluation of Biofertilizer and Manure Effects on Quantitative Yield of Nigella sativa L.
Authors: Mohammad Reza Haj Seyed Hadi, Fereshteh Ghanepasand, Mohammad Taghi Darzi
Abstract:
The main objective of this study was to determine the effects of nitrogen-fixing bacteria and manure application on seed yield and yield components in black cumin (Nigella sativa L.). The experiment was carried out at the RAN Research Station in Firouzkouh in 2012 as a 4x4 factorial experiment arranged in a randomized complete block design with three replications. Nitrogen-fixing bacteria at four levels (control, Azotobacter, Azospirillum and Azotobacter + Azospirillum) and manure at four levels (0, 2.5, 5 and 7.5 t ha-1) were used in this investigation. The results show that the greatest height, 1000-seed weight, seed number per follicle, follicle yield, seed yield and harvest index were obtained after using Azotobacter and Azospirillum simultaneously. Manure application affected only follicle yield, with the highest follicle yield obtained at 5 t ha-1. Maximum seed yield was obtained when Azotobacter + Azospirillum was inoculated onto black cumin seeds and 5 t ha-1 of manure was applied. According to these results, integrated management of Azotobacter and Azospirillum with manure application is the best treatment for maximizing the quantitative characteristics of black cumin.
Keywords: Azotobacter, Azospirillum, black cumin, yield, yield components.
1264 Automatic Generation of OWL Ontologies from UML Class Diagrams Based on Meta-Modelling and Graph Grammars
Authors: Aissam Belghiat, Mustapha Bourahla
Abstract:
The modeling paradigm places models at the center of the development process. These models are represented by languages such as UML, the language standardized by the OMG, which has become essential to development. Likewise, the ontology engineering paradigm places ontologies at the center of the development process; there, OWL is the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task, but UML and OWL converge in several regards, such as classes and associations. In this paper, we profit from this convergence to propose an approach, based on meta-modelling and graph grammars and registered within the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules whose level of abstraction is close to the application, in order to produce usable ontologies. We illustrate the approach with an example.
Keywords: ATOM3, MDA, Ontology, OWL, UML
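The class and association mapping that the approach exploits can be illustrated in a few lines. The sketch below assumes the rdflib library and invented example names; it emits owl:Class and owl:ObjectProperty axioms from a toy UML-like description and stands in for the paper's graph-grammar transformation rules rather than reproducing them.

```python
# Toy illustration of the UML-to-OWL mapping direction: UML classes become
# owl:Class, associations become owl:ObjectProperty with domain and range.
# Example names are invented; this is not the paper's rule set.
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/onto#")

uml_classes = ["Person", "Company"]
uml_assocs = [("worksFor", "Person", "Company")]  # (name, source, target)

g = Graph()
g.bind("owl", OWL)
for c in uml_classes:
    g.add((EX[c], RDF.type, OWL.Class))
for name, src, dst in uml_assocs:
    p = EX[name]
    g.add((p, RDF.type, OWL.ObjectProperty))
    g.add((p, RDFS.domain, EX[src]))
    g.add((p, RDFS.range, EX[dst]))

print(g.serialize(format="turtle"))
```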
1263 Clustering of Variables Based on a Probabilistic Approach Defined on the Hypersphere
Authors: Paulo Gomes, Adelaide Figueiredo
Abstract:
We consider n individuals described by p standardized variables, represented by points on the surface of the unit hypersphere S^(n-1). For a given choice of n individuals, we suppose that the set of observable variables comes from a mixture of bipolar Watson distributions defined on the hypersphere. The EM and Dynamic Clusters algorithms are used to identify such a mixture. We obtain parameter estimates for each Watson component and thereby a partition of the set of variables into homogeneous groups. Additionally, we present a factor analysis model in which the unobservable factors are the maximum likelihood estimators of the Watson directional parameters, namely the first principal component of the data matrix associated with each previously identified group. This alternative model yields directly interpretable solutions (simple structure), avoiding factor rotations.
Keywords: Dynamic Clusters algorithm, EM algorithm, Factor analysis model, Hierarchical Clustering, Watson distribution.
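A hedged numerical sketch of the clustering idea: standardized variables become unit vectors on the hypersphere, each cluster is summarized by a principal axis (the first principal component of its group), and assignment uses the absolute cosine to respect the bipolar, axis-like symmetry of the Watson model. The synthetic data and the plain alternating scheme below are illustrative; they are not the authors' EM estimation of Watson parameters.

```python
# Dynamic-clusters-style sketch on the hypersphere; synthetic data, and the
# axis update (dominant eigenvector = first PC) is an illustrative stand-in
# for the Watson maximum likelihood estimation.
import numpy as np

rng = np.random.default_rng(1)
n = 50
f = rng.normal(size=(n, 2))                      # two latent factors
X = np.concatenate([np.outer(f[:, 0], np.ones(4)),
                    np.outer(f[:, 1], np.ones(4))], axis=1)
X += 0.3 * rng.normal(size=(n, 8))               # eight observed variables

V = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize each variable
V = V / np.linalg.norm(V, axis=0)                # columns = points on S^(n-1)

def dynamic_clusters(V, k=2, iters=20):
    labels = rng.integers(0, k, size=V.shape[1])
    for _ in range(iters):
        axes = []
        for c in range(k):
            Vc = V[:, labels == c]
            # principal axis of the group = dominant eigenvector (first PC)
            axes.append(np.linalg.eigh(Vc @ Vc.T)[1][:, -1])
        A = np.stack(axes, axis=1)
        labels = np.argmax(np.abs(V.T @ A), axis=1)  # bipolar: use |cosine|
    return labels, A

labels, axes = dynamic_clusters(V)
print(labels)  # the two blocks of four variables should separate
```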
1262 Mathematical Modeling for Dengue Transmission with the Effect of Season
Authors: R. Kongnuy, P. Pongsumpun
Abstract:
Mathematical models can be used to describe the transmission of disease. Dengue is the most significant mosquito-borne viral disease of humans and is now a leading cause of childhood deaths and hospitalizations in many countries. Variations in environmental conditions, especially seasonal climatic parameters, affect the transmission of dengue viruses and their principal mosquito vector, Aedes aegypti. A transmission model for dengue disease is discussed in this paper. We assume that the human and vector populations are constant. We show that local stability is completely determined by the threshold parameter B0: if B0 is less than one, the disease-free equilibrium state is stable; if B0 is more than one, a unique endemic equilibrium state exists and is stable. Numerical results are shown for different values of the transmission probability from the vector to the human population.
Keywords: Dengue disease, mathematical model, season, threshold parameters.
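A generic seasonal host-vector model in the spirit of this abstract can be simulated in a few lines: populations are constant and expressed as proportions, and seasonality enters through a periodically varying biting rate. The equations and all parameter values below are illustrative placeholders, not the authors' exact system or their threshold parameter B0.

```python
# Generic seasonal host-vector model (proportions; constant populations).
# Equations and parameters are illustrative, not the paper's exact system.
import numpy as np
from scipy.integrate import odeint

def dengue(y, t, mu_h=1/25550, mu_v=1/14, r=1/7, bhv=0.4, bvh=0.4, eps=0.3):
    Sh, Ih, Sv, Iv = y
    b = 0.5 * (1.0 + eps * np.cos(2 * np.pi * t / 365.0))  # seasonal biting rate
    return [mu_h * (1 - Sh) - b * bhv * Iv * Sh,
            b * bhv * Iv * Sh - (mu_h + r) * Ih,
            mu_v * (1 - Sv) - b * bvh * Ih * Sv,
            b * bvh * Ih * Sv - mu_v * Iv]

t = np.linspace(0, 5 * 365, 5000)                   # five years, in days
sol = odeint(dengue, [0.99, 0.01, 1.0, 0.0], t)     # seed 1% infected hosts
print("peak infected host proportion:", sol[:, 1].max())
```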
1261 Partial 3D Reconstruction Using Evolutionary Algorithms
Authors: Mónica Pérez-Meza, Rodrigo Montúfar-Chaveznava
Abstract:
When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through stereo vision theory. Stereo vision provides the reconstruction of a scene using only a couple of images of the scene plus some computation: through several images of a scene, captured from different positions, it can give us an idea of the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, in analogy to the mammalian visual system. In this work we employ only one camera, which is translated along a path, capturing images at fixed intervals. As we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which uses "flies" to reconstruct the principal characteristics of the world following certain evolutionary rules.
Keywords: 3D Reconstruction, Computer Vision, Evolutionary Algorithms, Stereo Vision.
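A toy version of the fly algorithm idea, under strong simplifying assumptions: the stereo pair is rectified and synthetic, a point at depth z projects with disparity proportional to 1/z, and fitness is the photo-consistency of the two projections. Population size, mutation scale and the scene itself are invented for illustration.

```python
# Toy fly algorithm: "flies" are 3D points; fitness compares the pixels they
# project to in a synthetic rectified stereo pair. All constants are invented.
import numpy as np

rng = np.random.default_rng(2)
H, W, FB = 64, 64, 320.0                    # image size; focal length x baseline
left = rng.normal(size=(H, W))
true_disp = 8                               # scene: a fronto-parallel plane
right = np.roll(left, -true_disp, axis=1)   # right view = shifted left view

def fitness(fly):
    x, y, z = fly
    if z <= 0:
        return -np.inf
    d = FB / z                              # disparity from depth
    xl, xr, yi = int(x), int(round(x - d)), int(y)
    if not (0 <= yi < H and 0 <= xl < W and 0 <= xr < W):
        return -np.inf
    return -abs(left[yi, xl] - right[yi, xr])  # photo-consistency

flies = np.column_stack([rng.uniform(10, W - 10, 200),
                         rng.uniform(0, H - 1, 200),
                         rng.uniform(FB / 20, FB / 2, 200)])
for _ in range(30):                          # keep the best half, mutate to refill
    order = np.argsort([fitness(f) for f in flies])[::-1]
    best = flies[order[:100]]
    flies = np.vstack([best, best + rng.normal(0.0, [1.0, 1.0, 2.0], best.shape)])

print("median evolved depth:", np.median(flies[:, 2]), "true depth:", FB / true_disp)
```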
1260 A 1H NMR-Linked PCR Modelling Strategy for Tracking the Fatty Acid Sources of Aldehydic Lipid Oxidation Products in Culinary Oils Exposed to Simulated Shallow-Frying Episodes
Authors: Martin Grootveld, Benita Percival, Sarah Moumtaz, Kerry L. Grootveld
Abstract:
Objectives/Hypotheses: The adverse health effect potential of dietary lipid oxidation products (LOPs) has evoked much clinical interest. We therefore employed a 1H NMR-linked principal component regression (PCR) chemometrics modelling strategy to explore relationships between data matrices comprising (1) aldehydic LOP concentrations generated in culinary oils/fats exposed to laboratory-simulated shallow-frying practices, and (2) the prior saturated (SFA), monounsaturated (MUFA) and polyunsaturated fatty acid (PUFA) contents of such frying media (FM), together with their heating time-points at a standard frying temperature (180 °C). Methods: Corn, sunflower, extra virgin olive, rapeseed, linseed, canola, coconut and MUFA-rich algae frying oils, together with butter and lard, were heated in laboratory-simulated shallow-frying episodes at 180 °C, and FM samples were collected at time-points of 0, 5, 10, 20, 30, 60, and 90 min (n = 6 replicates per sample). Aldehydes were determined by 1H NMR analysis (Bruker AV 400 MHz spectrometer). The first (dependent output variable) PCR data matrix comprised aldehyde concentration scores vectors (PC1* and PC2*), whilst the second (predictor) matrix incorporated those from the fatty acid content/heating time variables (PC1-PC4) and their first-order interactions. Results: Structurally complex trans,trans- and cis,trans-alka-2,4-dienals, 4,5-epoxy-trans-2-alkenals and 4-hydroxy-/4-hydroperoxy-trans-2-alkenals (group I aldehydes, predominantly arising from PUFA peroxidation) loaded strongly and positively on PC1*, whereas n-alkanals and trans-2-alkenals (group II aldehydes, derived from both MUFA and PUFA hydroperoxides) loaded strongly and positively on PC2*. PCR analysis of these scores vectors (SVs) demonstrated that PCs 1 (positively-loaded linoleoylglycerols and [linoleoylglycerol]:[SFA] content ratio), 2 (positively-loaded oleoylglycerols and negatively-loaded SFAs), 3 (positively-loaded linolenoylglycerols and [PUFA]:[SFA] content ratios), and 4 (exclusively orthogonal sampling time-points) all contributed powerfully to the aldehydic PC1* SVs (p < 10^-3 to < 10^-9), as did all PC1-3 x PC4 interactions (p < 10^-5 to < 10^-9). PC2* was also markedly dependent on all the above PC SVs (PC2 > PC1 and PC3) and on the interactions of PC1 and PC2 with PC4 (p < 10^-9 in each case), but not on the PC3 x PC4 contribution. Conclusions: NMR-linked PCR analysis is a valuable strategy for (1) modelling the generation of aldehydic LOPs in heated cooking oils and other FM, and (2) tracking their unsaturated fatty acid (UFA) triacylglycerol sources therein.
Keywords: Frying oils, frying episodes, lipid oxidation products, cytotoxic/genotoxic aldehydes, chemometrics, principal component regression, NMR Analysis.
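The core of the PCR strategy, stripped of the NMR specifics, is PCA on the predictor matrix followed by least-squares regression on the component scores. The following sketch uses synthetic data in place of the fatty acid and aldehyde matrices, which are not reproduced here.

```python
# Principal component regression in miniature: PCA scores of the predictors
# feed a linear regression. Synthetic data stand in for the NMR matrices.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 6))        # e.g., SFA/MUFA/PUFA contents + heating time
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.1 * rng.normal(size=60)  # aldehyde score

pcr = make_pipeline(PCA(n_components=4), LinearRegression())
pcr.fit(X, y)
print("R^2 on training data:", round(pcr.score(X, y), 3))
```

Regressing on a reduced set of orthogonal scores rather than the raw predictors is what lets the method cope with strongly collinear compositional variables.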
1259 Optimization of CO2 Emissions and Cost for Composite Building Design with NSGA-II
Authors: Ji Hyeong Park, Ji Hye Jeon, Hyo Seon Park
Abstract:
Environmental pollution has become a major global concern in all fields, including the economy, society and culture, in the 21st century. Beginning with the Kyoto Protocol, reducing emissions of greenhouse gases such as CO2 and SOx has been a principal challenge of our day. Unlike durable goods in other industries, buildings have characteristically long life cycles, so they consume large quantities of energy and emit much CO2. For green building construction, more research is therefore needed on reducing CO2 emissions at each stage of the life cycle. However, recent studies have focused on the use and maintenance phase, and there is a lack of research on the initial design stage, especially structural design. In this study, we therefore propose an optimal design approach that considers CO2 emissions and cost in composite buildings simultaneously, applied to the structural design of an actual building.
Keywords: Multi-objective optimization, CO2 emissions, structural cost, encased composite structure.
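The building block of NSGA-II used here is non-dominated sorting over the two objectives (CO2 emissions, cost). The sketch below extracts a Pareto front from randomly scored candidate designs; the design encoding and both objective models are placeholders for the paper's structural calculations.

```python
# Non-dominated sorting in miniature; design encoding and objective models
# are invented placeholders, not the paper's structural model.
import numpy as np

def pareto_mask(F):
    """Mask of non-dominated rows of F; all objectives are minimized."""
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominators = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        keep[i] = not dominators.any()
    return keep

rng = np.random.default_rng(4)
designs = rng.uniform(size=(100, 3))               # e.g., member sizes, steel ratio
co2 = designs @ np.array([3.0, 1.0, 2.0])          # placeholder embodied-CO2 model
cost = designs @ np.array([1.0, 4.0, 1.5]) + 2.0   # placeholder cost model
F = np.column_stack([co2, cost])
print(pareto_mask(F).sum(), "non-dominated designs out of", len(F))
```

NSGA-II repeats this ranking generation after generation, breeding preferentially from the lower-ranked (less dominated) fronts, so the output is a trade-off curve rather than a single design.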
1258 Indicator of Small Calcification Detection in Ultrasonography Using Decorrelation of Forward Scattered Waves
Authors: Hirofumi Taki, Takuya Sakamoto, Makoto Yamakawa, Tsuyoshi Shiina, Toru Sato
Abstract:
To improve the detection of small calcifications in ultrasonography (US), we propose a novel indicator of calcifications in an ultrasound B-mode image that does not decrease the frame rate. Since the waveform of an ultrasound pulse changes at a calcification position, adjacent scan lines decorrelate behind a calcification; we therefore employ the decorrelation of adjacent scan lines as an indicator of calcification. The proposed indicator depicted wires 0.05 mm in diameter at 2 cm depth with a sensitivity of 86.7% and a specificity of 100%, although they were hardly detectable in ultrasound B-mode images. This study shows the potential of the proposed indicator to bring the detectable calcification size of a US device close to that of an X-ray imager, implying that US devices could become a convenient, safe and principal clinical tool for breast cancer screening.
Keywords: Ultrasonography, calcification, decorrelation, forward scattered wave.
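Numerically, the indicator can be sketched as one minus the windowed correlation between adjacent scan lines, so that a waveform change behind a calcification produces a high value. The window length, the synthetic RF data and the way the "calcification" is injected below are illustrative assumptions.

```python
# Decorrelation indicator sketch on synthetic RF data; window length and
# the injected waveform change are assumptions for illustration.
import numpy as np

def decorrelation_map(rf, win=16):
    """rf: (n_lines, n_samples) scan lines. Returns 1 - local correlation."""
    n_l, n_s = rf.shape
    out = np.zeros((n_l - 1, n_s - win))
    for i in range(n_l - 1):
        for j in range(n_s - win):
            a = rf[i, j:j + win] - rf[i, j:j + win].mean()
            b = rf[i + 1, j:j + win] - rf[i + 1, j:j + win].mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            out[i, j] = (1.0 - (a @ b) / denom) if denom else 1.0
    return out

rng = np.random.default_rng(5)
base = rng.normal(size=256)
rf = np.stack([base + 0.05 * rng.normal(size=256) for _ in range(8)])
rf[4:, 120:] = rng.normal(size=(4, 136))  # waveform change behind a "calcification"
ind = decorrelation_map(rf)
print("max indicator value:", round(ind.max(), 3))
```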
1257 Neuron Dynamics of Single-Compartment Traub Model for Hardware Implementations
Authors: J. C. Moctezuma, V. Breña-Medina, Jose Luis Nunez-Yanez, Joseph P. McGeehan
Abstract:
In this work we perform a bifurcation analysis of a single-compartment representation of the Traub model, one of the most important conductance-based models. The analysis focuses on two principal parameters: current and leakage conductance. Stable and unstable solutions are explored, and the Hopf bifurcation and its frequency interpretation when the current varies are examined. This study gives control over the neuron dynamics and the neuron response when these parameters change. Such analysis is particularly important for several applications, such as tuning parameters during learning, neuron excitability tests and measuring the bursting properties of the neuron. Finally, hardware implementation results were developed to corroborate the analysis.
Keywords: Traub model, Pinsky-Rinzel model, Hopf bifurcation, single-compartment models, bifurcation analysis, neuron modeling.
1256 Empirical Analysis of Private Listed Companies' Debt Financing and Business Performance in Jiangsu Province
Authors: Chengxuan Geng, Haitao E, Yijie Jiang
Abstract:
Drawing on capital structure theory, this paper uses principal component analysis and linear regression analysis to study the relationship between the debt characteristics of private listed companies in Jiangsu Province and their business performance. The results show that the average debt ratio of the 29 private listed companies in the sample is low. For companies whose debt ratio is below 80%, the debt ratio is negatively related to corporate performance, while for companies whose debt ratio is above 80%, the relationship between debt financing and enterprise performance shows a different trend. The conclusions suggest that the relatively low debt ratio may be a drawback, with the private listed companies in Jiangsu Province not taking full advantage of the governance effect of debt.
Keywords: private listed companies, debt financing, business performance
1255 Choosing the Right Mutation Rate to Better Evolve Combinational Logic Circuits
Authors: Emanuele Stomeo, Tatiana Kalganova, Cyrille Lambert
Abstract:
Evolvable hardware (EHW) is a developing field that applies evolutionary algorithms (EAs) to automatically design circuits, antennas, robot controllers and more. Much research has been done in this area, and several different EAs have been introduced to tackle problems such as scalability and evolvability. However, every time a specific EA is chosen for a particular task, all of its components, such as population size, initialization, selection mechanism, mutation rate and genetic operators, must be selected in order to achieve the best results. Over the last three decades, the selection of the right parameters for EA components on various test problems has been investigated. In this paper, the behaviour of the mutation rate in the design of logic circuits, which has not been studied before, is deeply analyzed. In an EHW system, mutation modifies the number of inputs of each logic gate, the functionality (for example, from AND to NOR) and the connectivity between logic gates. The behaviour of the mutation has been analyzed with respect to the number of generations, genotype redundancy and number of logic gates in the evolved circuits. The experimental results characterize the behaviour of the mutation rate during evolution for the design and optimization of simple logic circuits, and suggest the best mutation rate to use when designing combinational logic circuits. This research is particularly important for those who would like to implement a dynamic mutation rate inside an evolutionary algorithm for evolving digital circuits. Research on the mutation rate over the last 40 years is also summarized.
Keywords: Design of logic circuits, evolutionary computation, evolvable hardware, mutation rate.
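To make concrete what the mutation rate acts on in this setting, the sketch below encodes a feed-forward circuit as a list of gates, each with a function and input connections, and mutates functionality and connectivity independently with probability `rate`. The chromosome layout and gate set are illustrative, not the paper's exact encoding.

```python
# Illustrative gate-level genotype and mutation operator; the encoding is an
# assumption, not the paper's chromosome.
import random

GATE_TYPES = ["AND", "OR", "NAND", "NOR", "XOR", "NOT"]

def random_gate(idx, n_inputs):
    sources = list(range(n_inputs + idx))   # primary inputs + earlier gates
    return {"fn": random.choice(GATE_TYPES),
            "in": random.sample(sources, k=min(2, len(sources)))}

def mutate(circuit, n_inputs, rate=0.05):
    out = []
    for idx, gate in enumerate(circuit):
        gate = dict(gate)
        if random.random() < rate:          # mutate functionality
            gate["fn"] = random.choice(GATE_TYPES)
        if random.random() < rate:          # mutate connectivity
            sources = list(range(n_inputs + idx))
            gate["in"] = random.sample(sources, k=min(2, len(sources)))
        out.append(gate)
    return out

random.seed(0)
circuit = [random_gate(i, n_inputs=3) for i in range(8)]
print(mutate(circuit, n_inputs=3, rate=0.2))
```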
1254 Riding the Crest of the Wave: Inclusive Education in New Zealand
Authors: Barbara A. Perry
Abstract:
In 1996, the New Zealand government and the Ministry of Education announced that they were setting up a "world class system of inclusive education". As the parent of a son with high and complex needs, a teacher, a school principal and a disability studies lecturer, the author tracks the changes in the journey towards inclusive education over the last 20 years. Strategies for partnering with families to ensure educational success are presented, along with insights from one of those on the crest of the wave. Using a narrative methodology, the author illuminates how far New Zealand has come towards the promised world-class system of inclusion, and shares from personal experience some of the highlights and risks in the system. The author has challenged the old structures and been part of setting up new ones, particularly for providing parent voice and insight; this paper offers a unique view from an insider as well as a professional in the system.
Keywords: Disability studies, inclusive education, special education, working with families of children with disability.
1253 Urban Growth, Sewerage Network and Flooding Risk: Flooding of November 10, 2001 in Algiers
Authors: Boualem El Kechebour, Djilali Benouar
Abstract:
The objective of this work is to present an expert analysis of a flooding hazard and of how to reduce the risk. The analysis concerns the disaster induced by the flood of November 10-11, 2001 in the Bab El Oued district of the city of Algiers. The study begins with an assessment of the damage in relation to the urban environment and the history of the site's urban growth. The work then focuses on identifying the correlations between the development of the town and its vulnerability. The final step elaborates interpretations of the interactions between urban growth, the sewerage network and the vulnerability of the urban system. In conclusion, several recommendations are formulated for mitigating the risk in the future; the principal recommendations concern new urban operations and existing urbanized sites.
Keywords: Urban growth, sewerage network, vulnerability of town, flooding risk, mitigation.
1252 An Axiomatic Model for Development of the Allocated Architecture in Systems Engineering Process
Authors: A. Sharahi, R. Tehrani, A. Mollajan
Abstract:
The final step in completing the analytical systems engineering process is the "allocated architecture", in which all functional requirements (FRs) of an engineering system must be allocated to their corresponding physical components (PCs). At this step, any design of the allocated architecture in which no clear pattern assigns each PC exclusive responsibility for fulfilling its allocated FR(s) is considered a poor design, since it may make it difficult to determine which specific PC(s) have failed to satisfy a given FR. The present study utilizes the principles of the Axiomatic Design method to address this problem mathematically and establishes an axiomatic model as a means of reaching good alternatives for developing the allocated architecture. The study also proposes a loss function as a quantitative criterion for comparing, in monetary terms, non-ideal designs of the allocated architecture and choosing the one that imposes relatively lower cost on the system's stakeholders. As a case study, we use the existing design of the U.S. electricity marketing subsystem, based on data provided by the U.S. Energy Information Administration (EIA). The result for 2012 shows the symptoms of a poor design and of ineffectiveness due to coupling among the FRs of this subsystem.
Keywords: Allocated Architecture, Analytical Systems Engineering Process, Functional Requirements (FRs), Physical Components (PCs), Responsibility of a Physical Component, System’s Stakeholders.
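The allocation problem has a compact matrix reading that follows Axiomatic Design: with A[i][j] marking whether PC j contributes to FR i, a diagonal matrix is uncoupled (clear responsibility per PC), a triangular one is decoupled, and anything else is coupled, which is the symptom the abstract reports for the 2012 data. A toy checker, with invented matrices:

```python
# Axiomatic Design coupling check on a binary FR-by-PC design matrix.
# The example matrices are invented for illustration.
import numpy as np

def classify_design(A: np.ndarray) -> str:
    """A[i, j] = 1 if PC j contributes to FR i (square matrix assumed)."""
    off_diagonal = A - np.diag(np.diag(A))
    if not off_diagonal.any():
        return "uncoupled (ideal: one PC responsible per FR)"
    if not np.triu(A, 1).any() or not np.tril(A, -1).any():
        return "decoupled (solvable in sequence)"
    return "coupled (no clear responsibility per PC)"

print(classify_design(np.eye(3)))
print(classify_design(np.array([[1, 0, 0], [1, 1, 0], [0, 1, 1]])))
print(classify_design(np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]])))
```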
1251 Integrating Microcontroller-Based Projects in a Human-Computer Interaction Course
Authors: Miguel Angel Garcia-Ruiz, Pedro Cesar Santana-Mancilla, Laura Sanely Gaytan-Lugo
Abstract:
This paper describes the design and application of a short in-class project conducted in Algoma University's Human-Computer Interaction (HCI) course, taught in the Bachelor of Computer Science program. The project was based on the Maker Movement (people using and reusing electronic components and everyday materials to tinker with technology and make interactive applications): students applied low-cost and easy-to-use electronic components, the Arduino Uno microcontroller board, software tools, and everyday objects. Collaborating in small teams on hands-on activities, they built an interactive walking cane for blind people. At the end of the course, students filled out a Technology Acceptance Model version 2 (TAM2) questionnaire evaluating the application of microcontroller boards in HCI classes; we also asked them about applying the Maker Movement in HCI classes. The results showed overall positive student opinions about using microcontroller boards in HCI classes. We strongly suggest that every HCI course include practical activities related to tinkering with technology, such as applying microcontroller boards, in which students actively and constructively participate in teams to achieve the learning objectives.
Keywords: Maker movement, microcontrollers, learning, projects, course, technology acceptance.
1250 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components
Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura
Abstract:
This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict eight finger-movement directions using sparse logistic regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied at two levels: the single-participant level and the group level. At the single-participant level, the EEG datasets used in the two stages originated from the same participant. At the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination, without repetition, of the EEG datasets of five of the six participants, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant level, significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group level, also significantly higher than the chance level (12.49 ± 0.01%). The classification accuracy within [-45°, 45°] of the true direction was 70.03 ± 8.14% at the single-participant level and 62.63 ± 6.07% at the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
Keywords: Brain-computer interface, BCI, electroencephalography, EEG, finger motion decoding, independent component analysis, pseudo-real-time motion decoding.
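The two-stage procedure can be miniaturized as follows: learn an ICA unmixing on training trials, build features from the components, fit an L1-penalized ("sparse") logistic regression, then reuse the same unmixing on held-out trials. Synthetic signals stand in for EEG, and component selection here is automatic rather than by inspection of brain activity, so this is a structural sketch only.

```python
# Structural sketch of ICA + sparse logistic regression; synthetic "EEG".
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n_trials, n_ch, n_t = 160, 8, 50
y = rng.integers(0, 8, n_trials)                  # eight movement directions
X = rng.normal(size=(n_trials, n_ch, n_t))
X[:, 2, :] += y[:, None] * 0.3                    # one "source" carries the label

# Stage 1: learn the unmixing on concatenated training trials, then classify
ica = FastICA(n_components=6, random_state=0)
S_train = ica.fit_transform(X[:100].transpose(0, 2, 1).reshape(-1, n_ch))
feats_train = S_train.reshape(100, n_t, -1).mean(axis=1)
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0, max_iter=500)
clf.fit(feats_train, y[:100])

# Stage 2: apply the same unmixing matrix and model to held-out trials
S_test = ica.transform(X[100:].transpose(0, 2, 1).reshape(-1, n_ch))
feats_test = S_test.reshape(60, n_t, -1).mean(axis=1)
print("held-out accuracy:", clf.score(feats_test, y[100:]))
```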
1249 Influence of Kinematic, Physical and Mechanical Structure Parameters on Aeroelastic GTU Shaft Vibrations in Magnetic Bearings
Authors: Evgeniia V. Mekhonoshina, Vladimir Ya. Modorskii, Vasilii Yu. Petrov
Abstract:
At present, the vibrations of gas transmittal unit (GTU) rotors evade reliable forecasting. This paper describes elastic oscillation modes in resilient supports and rotor impellers, modeled in computational experiments that account for the interaction between the gas-dynamic flow and the compressor rotor. The aeroelastic approach was verified on the model problem of a supersonic jet in a shock tube interacting with a deformable plate. The ANSYS 15.0 engineering analysis system was used as the modeling tool for the numerical simulations, with the finite volume method for the gas dynamics and the finite element method for assessing the stress-strain state (SSS) components. Rotation speed and the material's elasticity modulus were varied during the calculations, and the SSS components and gas-dynamic parameters in the coupled system of gas-dynamic flow and compressor rotor were evaluated. Analysis of the time dependence demonstrated that the gas-dynamic parameters near the rotor blades oscillate at 200 Hz, while the SSS parameters at the upper blade edge oscillate at four times that frequency, i.e. at the blade frequency. It was found that aeroelasticity may account for up to 50% of the vibration amplitude at the test points at the magnetic bearings, with a phase displacement of about -π/4.
Keywords: Centrifugal compressor, aeroelasticity, interdisciplinary calculation, oscillation phase displacement, vibration, nonstationarity.
1248 Authenticity of Lipid and Soluble Sugar Profiles of Various Oat Cultivars (Avena sativa)
Authors: Marijana M. Ačanski, Kristian A. Pastor, Djura N. Vujić
Abstract:
The lipid and soluble sugar components in flour samples of different cultivars belonging to the common oat species (Avena sativa L.) were identified: spring oat, winter oat and hulless oat. Fatty acids were extracted from the flour samples with n-hexane and derivatized into volatile methyl esters using TMSH (trimethylsulfonium hydroxide in methanol). Soluble sugars were then extracted from the defatted and dried oat flour samples with 96% ethanol and derivatized into the corresponding TMS-oximes using hydroxylamine hydrochloride solution and BSTFA (N,O-bis-(trimethylsilyl)-trifluoroacetamide). The hexane and ethanol extracts of each oat cultivar were analyzed using a GC-MS system. The lipid and simple sugar compositions are very similar across all samples of the investigated cultivars. A chemometric tool was applied to the automatically integrated peak areas of the detected lipid and simple sugar components in their derivatized forms. Hierarchical cluster analysis shows a very high similarity between the investigated oat flour samples according to fatty acid content (0.9955), and a moderate similarity according to soluble sugar content (0.50). These preliminary results support the idea of establishing methods for oat flour authentication, and provide the means for distinguishing oat flour samples, regardless of variety, from flours made of other cereal species by lipid and simple sugar profile analysis alone.
Keywords: Authentication, chemometrics, GC-MS, lipid and soluble sugar composition, oat cultivars.
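The chemometric step can be sketched as correlation-based hierarchical clustering of integrated peak areas, in the spirit of the similarity values quoted above. The peak-area numbers below are invented; only the analysis pattern is meant to carry over.

```python
# Hierarchical clustering of synthetic GC-MS peak-area profiles; the values
# are invented, only the analysis pattern matches the abstract.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(7)
oat = rng.uniform(1, 10, size=8)  # fatty-acid peak areas of an "oat" profile
samples = np.vstack([oat * rng.uniform(0.95, 1.05, 8) for _ in range(3)]  # oat cultivars
                    + [rng.uniform(1, 10, size=(2, 8))])                  # other cereals

d = pdist(samples, metric="correlation")        # 1 - Pearson correlation
Z = linkage(d, method="average")
print(fcluster(Z, t=2, criterion="maxclust"))   # the three oat rows should group
```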
1247 Dead Bodies that Matter: A Consensual Qualitative Research on the Lived Experience of Embalmers
Authors: Mark N. Abello, Betina Velanie L. Cruz, Angelo Joachim D. C. De Castro, Arnel A. Diego, John Ezequel V. Murillo
Abstract:
Embalmers are widely recognized as those who mend cadavers, but behind that lies a great deal of work: these professionals are competent in physiology, chemicals and cosmetics, and they face cadavers day after day. Against this background, the researchers set out to discover the lived experience of embalmers. The purpose of the present study is to uncover the essence of these professionals' work, to determine the factors that influence it, and to explore how the occupation affects the physical, emotional-mental, spiritual, moral and social aspects of their lives. The researchers used Consensual Qualitative Research; eight embalmers, seven male and one female, from Manila and Bulacan were interviewed using open-ended questions, which were used to triangulate the results. A primary research team conducted the consensus of domains, and an external auditor reviewed the results. A personal data sheet was also used, which helped the researchers group the respondents according to demographic profile. The investigation revealed four core components of the lived experience of embalmers: motivation, struggles, acceptance and contentment. These components play an important role in their everyday lives as embalmers, their daily hardships, and the sources of their pleasures. The present study will be of use to future researchers, embalmers and society.
Keywords: Embalmers, consensual qualitative research, lived experience.
1246 Noise Removal from Surface Respiratory EMG Signal
Authors: Slim Yacoub, Kosai Raoof
Abstract:
The aim of this study was to remove the two principal noises that disturb the surface electromyography signal of the diaphragm: the electrocardiogram (ECG) artefact and the power line interference artefact. The proposed algorithm is built on the Least Mean Square (LMS) Widrow adaptive structure, which requires a reference signal correlated with the noise contaminating the signal. The noise references are extracted, first, for the power line interference, with a reference mathematically constructed from two cosine functions, at 50 Hz (the fundamental) and at 150 Hz (the third harmonic); and second, for the ECG artefact, with a matching pursuit technique combined with an LMS structure. Both removal procedures are achieved without the use of supplementary electrodes. The filtering techniques are validated on real recordings of the surface diaphragm electromyography signal, and the performance of the proposed methods is compared with previously published results.
Keywords: Surface EMG, adaptive filtering, matching pursuit, power line interference.
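A minimal LMS noise canceller of the Widrow type described above, for the power-line part only: references at 50 Hz and 150 Hz are adapted so that their weighted sum matches the interference, and the error signal is the cleaned EMG. Step size, amplitudes and the synthetic signals are illustrative.

```python
# Minimal LMS adaptive noise canceller; signals and step size are synthetic.
import numpy as np

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
emg = 0.5 * np.random.default_rng(8).normal(size=t.size)  # "true" diaphragm EMG
mains = 1.0 * np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 150 * t)
recorded = emg + mains

# Quadrature reference pairs at 50 Hz and its 150 Hz harmonic
refs = np.stack([np.cos(2 * np.pi * 50 * t), np.sin(2 * np.pi * 50 * t),
                 np.cos(2 * np.pi * 150 * t), np.sin(2 * np.pi * 150 * t)])

w = np.zeros(4)
mu = 0.01
cleaned = np.empty_like(recorded)
for n in range(t.size):
    x = refs[:, n]
    e = recorded[n] - w @ x   # error = signal estimate after noise removal
    w += 2 * mu * e * x       # LMS weight update
    cleaned[n] = e

print("residual error power:", np.mean((cleaned - emg) ** 2))
```

Quadrature (cosine and sine) pairs are used so the filter can match an arbitrary interference phase without needing phase-locked references.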
1245 A New Model for Question Answering Systems
Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour
Abstract:
Most question answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems; if it does not work properly, it creates problems for the other sections. Answer processing is likewise an emerging topic in question answering, where systems are often required to rank and validate candidate answers; these techniques, aimed at finding short and precise answers, are often based on semantic classification. This paper discusses a new model for question answering that improves the two main modules of question processing and answer processing. Two components form the basis of question processing: question classification, which specifies the types of the question and the answer, and reformulation, which converts the user's question into a question the QA system can understand within a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, plus a validation section for interacting with the user, making it better suited to finding the exact answer. The question and answer processing modules were modeled, implemented and evaluated, and the system was implemented in two versions. The results show that version 1 gave correct answers to 70% of questions (30 correct answers to 50 asked questions) and version 2 gave correct answers to 94% of questions (47 correct answers to 50 asked questions).
Keywords: Answer processing, classification, question answering, query reformulation.
1244 Quality Properties of Fermented Mugworts and Rapid Pattern Analysis of Their Volatile Flavor Components by Electric Nose Based on SAW (Surface Acoustic Wave) Sensor in GC System
Authors: Hyo-Nam Song
Abstract:
The changes in quality properties and nutritional components of two fermented mugworts (Artemisia capillaris Thunberg, Artemisia asiatica Nakai) were characterized, followed by rapid pattern analysis of their volatile flavor compounds by an electronic nose based on a SAW (Surface Acoustic Wave) sensor in a GC system. There was a remarkable decrease in pH and a small change in total soluble solids after fermentation. The L (lightness) and b (yellowness) values in Hunter's color system decreased, whilst the a (redness) value increased with fermentation. HPLC analysis demonstrated that total amino acids increased in quantity, and that essential amino acids were higher in A. asiatica Nakai than in A. capillaris Thunberg. While the total polyphenol contents were not affected by fermentation, the total sugar contents decreased dramatically. Scopoletin was highly abundant in A. capillaris Thunberg but was not detected in A. asiatica Nakai. Analysis of volatile flavor compounds by the electronic nose showed that the intensity of several peaks increased greatly and that seven additional flavor peaks were newly produced after fermentation. The flavor differences of the two mugworts were clearly distinguished in the VaporPrintTM image patterns, which indicate that fermentation gives the two mugworts subtle flavor differences.
Keywords: Mugwort, fermentation, electronic nose, SAW sensor, flavor.
1243 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter
Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal
Abstract:
Regulatory bodies have proposed limits on the concentration of particulate matter (PM) in air; however, they do not explicitly indicate how the toxic effects of PM constituents are incorporated when developing regulatory limits. This study aimed to provide a structured approach for incorporating the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework consists of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health); (2) exposure assessment (parameters: concentrations of PM and constituents, information on the size and shape of PM, and the fate and transport of PM and constituents in the respiratory system); (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents); and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at every step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information, with an example calculation for exposures to PM2.5 and its metal constituents in the Indian ambient environment. The identified data gaps were: (1) the concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship between the toxicity of PM and its components.
Keywords: Air, component-specific toxicity, human health risks, particulate matter.
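Step (4) of the framework reduces, for non-carcinogenic effects, to a hazard quotient per constituent: the ratio of the estimated exposure dose to a reference dose, with values above 1 flagging potential concern. All doses below are invented placeholders, not values from the study.

```python
# Hazard quotient sketch; all dose values are invented placeholders.
exposure_dose = {"Pb": 2.0e-4, "Ni": 5.0e-5, "Cd": 1.2e-5}   # mg/kg-day (assumed)
reference_dose = {"Pb": 3.5e-3, "Ni": 2.0e-2, "Cd": 1.0e-3}  # mg/kg-day (assumed)

for metal, dose in exposure_dose.items():
    hq = dose / reference_dose[metal]
    print(f"{metal}: HQ = {hq:.3f}{'  <-- exceeds 1' if hq > 1 else ''}")

# A simple aggregate across constituents (a hazard index) is the sum of HQs.
hi = sum(d / reference_dose[m] for m, d in exposure_dose.items())
print("Hazard index:", round(hi, 3))
```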
1242 A Methodology for Quality Problems Diagnosis in SMEs
Authors: Humberto N. Teixeira, Isabel S. Lopes, Sérgio D. Sousa
Abstract:
This article proposes a new methodology for SMEs (small and medium enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and sub-processes of quality management, defined on the basis of Juran's well-known trilogy for quality management (quality planning, quality control and quality improvement), and about predetermined results categories, defined on the basis of the quality concept. A set of tools for data collection and analysis is used, including interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA). The article also presents the conclusions obtained from applying the methodology in two case studies.
Keywords: Continuous improvement, Diagnosis, Quality Management, Self-assessment, SMEs
1241 A Framework for Data Mining Based Multi-Agent: An Application to Spatial Data
Authors: H. Baazaoui Zghal, S. Faiz, H. Ben Ghezala
Abstract:
Data mining is an extraordinarily demanding field concerned with extracting implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization...), and each method includes more than one algorithm. A data mining system also involves different user categories, which means that the user's behavior must be a component of the system. The problem, then, is knowing which algorithm of which method to employ for an exploratory goal, which for a decisional goal, and how they can collaborate and communicate. The agent paradigm offers a new way of conceiving and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper, the agent framework for data mining is introduced, and its overall architecture and functionality are presented. Validation is performed on spatial data, and the principal results are presented.
Keywords: Databases, data mining, multi-agent, spatial datamart.