Search results for: techno economic analysis.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9681

8001 Flexible Sensor Array with Programmable Measurement System

Authors: Jung-Chuan Chou, Wei-Chuan Chen, Chien-Cheng Chen

Abstract:

This study is concerned with pH solution detection using a 2 × 4 flexible sensor array based on a plastic polyethylene terephthalate (PET) substrate that is coated with a conductive layer and a ruthenium dioxide (RuO2) sensitive membrane using screen-printing and RF sputtering technologies. For data analysis, we also prepared a dynamic measurement system for acquiring the response voltage and analyzing the characteristics of the working electrodes (WEs), such as sensitivity and linearity. In addition, an array measurement system was designed to acquire the original signal from the sensor array; it is based on digital signal processing (DSP). The DSP converts the unstable acquisition data to a direct current (DC) output using a digital filter. Hence, this sensor array achieves a satisfactory yield of 62.5% through the measurement and analysis system designed in our laboratory.
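
A minimal sketch of the kind of filtering step described above, assuming a simple moving-average low-pass filter; the abstract does not state which digital filter the authors use, so the function, window size and example voltage are illustrative only.

```python
import numpy as np

def smooth_to_dc(samples, window=50):
    """Reduce an unstable acquisition signal to a near-DC level
    with a moving-average low-pass filter (illustrative only)."""
    samples = np.asarray(samples, dtype=float)
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")

# Example: a noisy response voltage around 0.35 V
rng = np.random.default_rng(0)
raw = 0.35 + 0.02 * rng.standard_normal(1000)
dc_level = smooth_to_dc(raw).mean()
print(f"Estimated DC response voltage: {dc_level:.3f} V")
```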

Keywords: Flexible sensor array, PET, RuO2, dynamic measurement, data analysis.

8000 Kurtosis, Renyi's Entropy and Independent Component Scalp Maps for the Automatic Artifact Rejection from EEG Data

Authors: Antonino Greco, Nadia Mammone, Francesco Carlo Morabito, Mario Versaci

Abstract:

The goal of this work is to improve the efficiency and the reliability of automatic artifact rejection, in particular from electroencephalographic (EEG) recordings. Artifact rejection is a key topic in signal processing. Artifacts are unwelcome signals that may occur during signal acquisition and that may alter the analysis of the signals themselves. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by introducing Renyi's entropy. The performance of our method was tested using the Independent Component scalp maps and compared to that of the method in the literature, which it was shown to outperform.
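
A minimal sketch of the two statistics named above, computed for a single component; the Renyi order and histogram binning are assumptions for illustration, not the authors' settings, and the random signal stands in for an ICA component.

```python
import numpy as np
from scipy.stats import kurtosis

def renyi_entropy(signal, alpha=2.0, bins=64):
    """Renyi entropy of order alpha estimated from a histogram (illustrative)."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(1)
component = rng.standard_normal(5000)     # stand-in for one independent component
k = kurtosis(component)                   # large |kurtosis| suggests an artifactual component
h = renyi_entropy(component)              # unusually low entropy also suggests an artifact
print(f"kurtosis = {k:.3f}, Renyi entropy = {h:.3f}")
```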

Keywords: Artifact, EEG, Renyi's entropy, independent component analysis, kurtosis.

7999 Human Capital Development for ASEAN Community

Authors: Chutikarn Sriwiboon

Abstract:

The main purpose of this research paper was to study the requirements for human capital development in order to be ready for the ASEAN Community. Thai education institutions are encountering a challenging course of change to become effective members of the ASEAN Economic Community (AEC) in 2015. It was vital that everyone and every organization participate in the process of becoming part of the ASEAN community, a pluralistic society. Thai universities will be required to partake in human capital development in a variety of fields. In order to assist the whole nation to enhance its development potential, there was a need to collaborate with other leading ASEAN universities on research to improve the qualifications and capabilities of university management, administrators, professors, and staff.

Keywords: ASEAN, Education, Human capital development.

7998 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters

Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev

Abstract:

Humanity faces different social disasters more and more often, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain early signals about events that are occurring or may occur and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the Internet are developed. Information in Romanian is of special interest to us. In order to obtain the mentioned tools, we follow several steps, divided into a preparatory stage and a processing stage. During the first stage, we manually collected over 724 news articles, constituting more than 150 thousand words, and classified them into 10 categories of social disasters. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets has been used. We deal with the problem of evacuating inhabitants in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique are used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, have been used. These modules helped us obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.
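
A minimal sketch of the controlled-vocabulary classification step described above; the category names and keywords below are hypothetical English stand-ins for the Romanian lexicon, used only to illustrate the matching idea.

```python
# Hypothetical controlled vocabulary: category -> keywords
VOCABULARY = {
    "fire": {"fire", "smoke", "burn"},
    "flood": {"flood", "inundation", "overflow"},
    "accident": {"accident", "collision", "crash"},
}

def classify(text):
    """Return the disaster categories whose keywords occur in the text."""
    words = set(text.lower().split())
    return [cat for cat, keys in VOCABULARY.items() if words & keys]

print(classify("a fire broke out after the collision on the highway"))
# ['fire', 'accident']
```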

Keywords: Lexicon of disasters, modelling, Petri nets, text annotation, social disasters.

7997 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Authors: Karima Siham Aoubid, Mohamed Boulemden

Abstract:

The development of signal compression algorithms is progressing rapidly. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary for the signal representation while minimizing the reconstruction error. The following article proposes the compression of an Arabic speech signal by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on the decomposition of the original signal by means of analysis filters, followed by the compression stage, and on the other hand, on the application of an order-5 linear predictor to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which will be coded and transmitted. The decoding operation is then used to reconstruct the original signal. Thus, an adequate choice of the filter bank used in the transform is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
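
A minimal sketch of the hybrid idea, assuming PyWavelets for the analysis filter bank and a least-squares order-5 linear predictor applied to the approximation coefficients; the wavelet family, signal and coding step are assumptions, since the abstract does not specify them.

```python
import numpy as np
import pywt

def lp_residual(coeffs, order=5):
    """Fit an order-`order` linear predictor to a coefficient sequence
    and return the prediction error that would be coded (illustrative)."""
    X = np.column_stack([coeffs[i:len(coeffs) - order + i] for i in range(order)])
    y = coeffs[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)   # predictor coefficients
    return y - X @ a

t = np.linspace(0, 1, 1024)
speech_like = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(1024)
approx, *details = pywt.wavedec(speech_like, "db4", level=3)   # analysis filter bank
error = lp_residual(approx)                                    # small residual -> fewer bits
print(len(approx), error.std())
```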

Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.

7996 Health Monitoring of Power Transformers by Dissolved Gas Analysis using Regression Method and Study the Effect of Filtration on Oil

Authors: Anjali Chatterjee, Nirmal Kumar Roy

Abstract:

Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started sometime in the 1960s. Failure can occur in a transformer for different reasons, and some failures can be limited or prevented by maintenance. Oil filtration is one of the methods to remove the dissolved gases and prevent deterioration of the oil. In this paper we analyze the DGA data by the regression method and predict the future gas concentration in the oil. We present a comparative study of different traditional regression methods and the errors generated by their predictions. With the help of these data we can deduce the health of the transformer by finding the type of fault, if one has occurred or will occur in the future. Additionally, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
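
A minimal sketch of regression-based trending of a single dissolved gas; the sample history and the choice of a simple linear model are assumptions for illustration, not the authors' dataset or their preferred regression variant.

```python
import numpy as np

# Hypothetical DGA history: sampling month vs. hydrogen concentration in ppm
months = np.array([0, 3, 6, 9, 12, 15, 18])
h2_ppm = np.array([12.0, 14.5, 16.1, 19.8, 22.4, 25.0, 28.3])

slope, intercept = np.polyfit(months, h2_ppm, deg=1)   # fit a linear trend
future_month = 24
prediction = slope * future_month + intercept
print(f"Predicted H2 at month {future_month}: {prediction:.1f} ppm")
```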

Keywords: Power transformers, dissolved gas analysis, regression method, filtration, oil.

7995 Improving Financial Education for Young Women: A Case Study of Australian School Students

Authors: Laura de Zwaan, Tracey West

Abstract:

There is a sustained, observable gender gap in financial literacy, with females consistently having lower levels than males. This research explores the knowledge and experiences of high school students in Australia aged 14 to 18 in order to understand how this gap can be narrowed. Using a predominantly qualitative approach, we find evidence that financial socialization and socio-economic environment affect financial literacy. We also find evidence that current teaching and assessment approaches to financial literacy may disadvantage female students. We conclude by offering recommendations to improve the way financial literacy education is delivered within the curriculum.

Keywords: Financial literacy, financial socialization, gender, maths.

7994 Numerical Simulation of Progressive Collapse for a Reinforced Concrete Building

Authors: Han-Soo Kim, Jae-Gyun Ahn, Hyo-Seung Ahn

Abstract:

Though nonlinear dynamic analysis using a specialized hydro-code such as AUTODYN is an accurate and useful tool for progressive collapse assessment of a multi-story building subjected to blast load, it takes too much time to be applied to a practical simulation of progressive collapse of a tall building. In this paper, blast analysis of an RC frame structure using a simplified model with the Reinforcement Contact technique provided in Ansys Workbench is introduced and its accuracy investigated. Even though the simplified model has a fraction of the elements of the detailed model, with this modeling technique it shows structural behavior under the blast load similar to that of the detailed model. The proposed modeling method can be effectively applied to blast-induced progressive collapse analysis of an RC frame structure.

Keywords: Autodyn, Blast Load, Progressive Collapse, Reinforcement Contact.

7993 The Success of E-Collaborative in E-Commerce: The Study of B2C Business in Thailand

Authors: Wanida Suwunniponth

Abstract:

The objectives of this research were to study the factors that contribute to the success of e-collaboration in the e-commerce of B2C (Business to Customer) businesses in Bangkok, Thailand. The influencing factors included organization, people, information technology and the e-collaborative process. A questionnaire was used to collect data from 200 small e-commerce businesses, and path analysis was utilized as the tool for data analysis. The path analysis revealed that the organization, people and information technology factors influenced the e-collaborative process and the success of e-collaboration, whereas the e-collaborative process factor directly affected its success. The findings suggest that B2C e-commerce businesses in Thailand should adopt an improvement approach in terms of managerial structure, leadership, staff skills and knowledge, and investment in information technology in order to achieve higher efficiency of the e-collaborative process, which would result in profit and competitive advantage.

Keywords: E-collaborative, E-commerce, B2C.

7992 On the Wave Propagation in Layered Plates of General Anisotropic Media

Authors: K. L. Verma

Abstract:

The propagation of elastic waves in arbitrary anisotropic plates is investigated. Commencing with a formal analysis of waves in a layered plate of an arbitrary anisotropic medium, the dispersion relations of elastic waves are obtained by invoking continuity at the interfaces and boundary conditions on the surfaces of the layered plate. The obtained solutions can be used for material systems of higher symmetry such as monoclinic, orthotropic, transversely isotropic, cubic, and isotropic, as these are contained implicitly in the analysis. The cases of a free layered plate and a layered half space are considered separately. Some special cases have also been deduced and discussed. Finally, a numerical solution of the frequency equations for an aluminum-epoxy plate is carried out, and the dispersion curves for the few lowest modes are presented. The results obtained theoretically have been verified numerically and illustrated graphically.

Keywords: Anisotropic, layered, dispersion, elastic waves, frequency equations.

7991 Model-Based Software Regression Test Suite Reduction

Authors: Shiwei Deng, Yang Bao

Abstract:

In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and then the probability-driven greedy algorithm is adopted to select the minimum set of test cases from the reduced regression test suite that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
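
A minimal sketch of the probability-driven greedy selection step, assuming each test case covers a known set of interaction patterns and carries a selection probability; the test names, pattern sets, probabilities and scoring rule are hypothetical illustrations, not the paper's algorithm details.

```python
def greedy_select(tests, patterns):
    """Pick test cases until all interaction patterns are covered,
    preferring high-probability tests that cover many uncovered patterns."""
    uncovered, selected = set(patterns), []
    while uncovered:
        best = max(tests, key=lambda t: len(uncovered & t["covers"]) * t["prob"])
        if not uncovered & best["covers"]:
            break                      # remaining patterns are not coverable
        selected.append(best["name"])
        uncovered -= best["covers"]
    return selected

tests = [
    {"name": "t1", "covers": {"p1", "p2"}, "prob": 0.9},
    {"name": "t2", "covers": {"p2", "p3"}, "prob": 0.6},
    {"name": "t3", "covers": {"p3", "p4"}, "prob": 0.8},
]
print(greedy_select(tests, {"p1", "p2", "p3", "p4"}))   # ['t1', 't3']
```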

Keywords: Dependence analysis, EFSM model, greedy algorithm, regression test.

7990 3D Locomotion and Fractal Analysis of Goldfish for Acute Toxicity Bioassay

Authors: Kittiwann Nimkerdphol, Masahiro Nakagawa

Abstract:

Biological reactions of individual test animals to a toxic substance are unique and can be used as an indication of the presence of the toxic substance. However, distinguishing such phenomena requires a very complicated system, and analyzing the data in three dimensions is even more complicated. In this paper, a system to evaluate in vitro biological activities in response to acute toxicity, based on the stochastic, self-affine, non-stationary signal of 3D goldfish swimming and fractal analysis, is introduced. Regular digital camcorders are utilized by the proposed algorithm 3DCCPC to effectively capture and reconstruct the 3D movements of the fish. A Critical Exponent Method (CEM) has been adopted as the fractal estimator. The hypothesis was that the swimming of goldfish exposed to acute toxicity would show a fractal property related to the toxic concentration. The experimental results supported the hypothesis by showing that the swimming of goldfish under different toxic concentrations has fractal properties. They also show that the fractal dimension of the swimming is related to the pH value as FD ≈ 0.26 pH + 0.05. With the proposed system, the fish is allowed to swim freely in all directions to react to the toxicant. In addition, the trajectories are precisely evaluated by fractal analysis with the critical exponent method, and hence the results are obtained with a much higher degree of confidence.
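
The paper estimates the fractal dimension with the Critical Exponent Method; as a hedged stand-in, the sketch below uses Higuchi's method, a commonly used alternative fractal-dimension estimator for 1-D trajectory signals, only to illustrate the kind of quantity being related to pH. The random-walk trajectory is synthetic.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi fractal dimension of a 1-D signal (a common estimator,
    used here instead of the paper's Critical Exponent Method)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lengths = []
    for k in range(1, kmax + 1):
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lengths), 1)
    return slope

trajectory = np.cumsum(np.random.randn(2000))   # stand-in for one axis of fish motion
print(f"FD ~ {higuchi_fd(trajectory):.2f}")     # ~1.5 for a Brownian-like path
```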

Keywords: 3D locomotion, bioassay, critical exponent method, CEM, fractal analysis, goldfish.

7989 Property Aggregation and Uncertainty with Links to the Management and Determination of Critical Design Features

Authors: Steven Whittle, Ingrida Valiusaityte

Abstract:

Within the domain of Systems Engineering, the need to perform property aggregation to understand, analyze and manage complex systems is unequivocal. This can be seen in numerous areas such as capability analysis, Mission Essential Competencies (MEC) and Critical Design Features (CDF). Furthermore, the need to consider uncertainty propagation, as well as the sensitivity of related properties within such analysis, is equally important when determining a set of critical properties within such a system. This paper describes this property breakdown in a number of domains within Systems Engineering and, within the area of CDFs, emphasizes the importance of uncertainty analysis. As part of this, a section of the paper describes possible techniques which may be used for uncertainty propagation, and in conclusion an example is described utilizing one of the techniques for property and uncertainty aggregation within an aircraft system to aid the determination of Critical Design Features.
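
A minimal Monte Carlo sketch of propagating uncertainty through a property aggregation; Monte Carlo is only one of the possible techniques the paper refers to, and the lower-level properties, distributions and aggregation formula here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical lower-level properties with uncertainty (mean, std)
mass = rng.normal(5200.0, 150.0, n)    # kg
drag = rng.normal(0.032, 0.002, n)     # dimensionless

# Hypothetical aggregated property, e.g. a range-like figure of merit
aggregated = mass ** 0.5 / drag

# The spread of `aggregated` shows how input uncertainty propagates upward.
print(f"mean = {aggregated.mean():.1f}, std = {aggregated.std():.1f}")
```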

Keywords: Complex Systems, Critical Design Features, Property Aggregation, Uncertainty.

7988 Optimizing Organizational Performance: The Critical Role of Headcount Budgeting in Strategic Alignment and Financial Stability

Authors: Shobhit Mittal

Abstract:

Headcount budgeting stands as a pivotal element in organizational financial management, extending beyond traditional budgeting to encompass strategic resource allocation for workforce-related expenses. This process is integral to maintaining financial stability and fostering a productive workforce, requiring a comprehensive analysis of factors such as market trends, business growth projections, and evolving workforce skill requirements. It demands a collaborative approach, primarily involving Human Resources (HR) and finance departments, to align workforce planning with an organization's financial capabilities and strategic objectives. The dynamic nature of headcount budgeting necessitates continuous monitoring and adjustment in response to economic fluctuations, business strategy shifts, technological advancements, and market dynamics. Its significance in talent management is also highlighted, aligning financial planning with talent acquisition and retention strategies to ensure a competitive edge in the market. The consequences of incorrect headcount budgeting are explored, showing how it can lead to financial strain, operational inefficiencies, and hindered strategic objectives. Examining case studies like IBM's strategic workforce rebalancing and Microsoft's shift for long-term success, the importance of aligning headcount budgeting with organizational goals is underscored. These examples illustrate that effective headcount budgeting transcends its role as a financial tool, emerging as a strategic element crucial for an organization's success. This necessitates continuous refinement and adaptation to align with evolving business goals and market conditions, highlighting its role as a key driver in organizational success and sustainability.

Keywords: Strategic planning, fiscal budget, headcount planning, resource allocation, financial management, decision-making, operational efficiency, risk management, headcount budget.

7987 A Methodology for Investigating Public Opinion Using Multilevel Text Analysis

Authors: William Xiu Shun Wong, Myungsu Lim, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, many users have begun to frequently share their opinions on diverse issues using various social media. Therefore, numerous governments have attempted to establish or improve national policies according to the public opinions captured from various social media. In this paper, we indicate several limitations of the traditional approaches to analyzing public opinion on science and technology and provide an alternative methodology to overcome these limitations. First, we distinguish between the science and technology analysis phase and the social issue analysis phase to reflect the fact that public opinion can be formed only when a certain science and technology is applied to a specific social issue. Next, we successively apply a start list and a stop list to acquire clarified and interesting results. Finally, to identify the documents that best fit a given subject, we develop a new logical filter concept that consists of not only mere keywords but also the logical relationships among the keywords. This study then analyzes the possibilities for practical use of the proposed methodology through its application to discovering core issues and public opinions from 1,700,886 documents comprising SNS posts, blogs, news, and discussions.
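
A minimal sketch of the logical-filter idea, in which a document must satisfy a boolean combination of keywords rather than merely contain any of them; the keywords, document and AND/OR/NOT groupings are hypothetical, not the paper's actual filter definitions.

```python
def matches(doc, all_of=(), any_of=(), none_of=()):
    """Logical keyword filter: every `all_of` term, at least one `any_of` term,
    and no `none_of` term must appear in the document."""
    text = doc.lower()
    return (all(k in text for k in all_of)
            and (not any_of or any(k in text for k in any_of))
            and not any(k in text for k in none_of))

doc = "Public debate on nuclear energy policy and safety"
print(matches(doc, all_of=("nuclear",), any_of=("policy", "regulation"), none_of=("sports",)))
# True
```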

Keywords: Big data, social network analysis, text mining, topic modeling.

7986 A Review: Comparative Analysis of Different Categorical Data Clustering Ensemble Methods

Authors: S. Sarumathi, N. Shanthi, M. Sharmila

Abstract:

In recent years, a substantial amount of work has been done in data clustering research under the unsupervised learning branch of data mining. Several algorithms and methods have been proposed, focusing on clustering different data types, the representation of cluster models, and the accuracy of the resulting clusters. However, no single clustering algorithm proves to be the most efficient in providing the best results. Accordingly, in order to address this issue, a new technique, called the cluster ensemble method, has emerged. The cluster ensemble is a good alternative approach to the cluster analysis problem. Its main aim is to merge different clustering solutions in such a way as to achieve accuracy and to improve the quality of individual data clusterings. The substantial and continuing development of new methods in data mining, together with the constant interest in inventing new algorithms, makes a critical analysis of the existing techniques and of future novelties necessary. This paper presents a comparative study of different cluster ensemble methods, along with their features, systematic working processes, and the average accuracy and error rates of each ensemble method. Consequently, this comprehensive analysis will be very useful for the community of clustering practitioners and will also help in deciding the most suitable method to rectify the problem at hand.
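
A minimal sketch of the co-association matrix that underlies several of the surveyed ensemble methods: the fraction of base clusterings in which each pair of objects falls in the same cluster. The three base partitions are hypothetical.

```python
import numpy as np

def co_association(partitions):
    """Fraction of base clusterings in which each pair of objects shares a cluster."""
    n = len(partitions[0])
    m = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        m += (labels[:, None] == labels[None, :]).astype(float)
    return m / len(partitions)

# Three base clusterings of five objects
partitions = [[0, 0, 1, 1, 1],
              [0, 0, 0, 1, 1],
              [1, 1, 0, 0, 0]]
print(co_association(partitions))
# A consensus partition can then be derived by clustering this matrix.
```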

Keywords: Clustering, Cluster Ensemble methods, Co-association matrix, Consensus function, Median partition.

7985 Processing and Economic Analysis of Rain Tree (Samanea saman) Pods for Village Level Hydrous Bioethanol Production

Authors: Dharell B. Siano, Wendy C. Mateo, Victorino T. Taylan, Francisco D. Cuaresma

Abstract:

Biofuel is one of the renewable energy sources adopted by the Philippine government in order to lessen the dependency on foreign fuel and to reduce carbon dioxide emissions. Rain tree pods were seen to be a promising source of bioethanol since they contain a significant amount of fermentable sugars. The study was conducted to establish the complete procedure for processing rain tree pods for village level hydrous bioethanol production. The production processes covered collection, drying, storage, shredding, dilution, extraction, fermentation, and distillation. The feedstock was sun-dried, and the moisture content was determined to be in the range of 20% to 26% prior to storage. The dilution ratio was 1:1.25 (1 kg of pods to 1.25 L of water), and the extraction process yielded a sugar concentration of 22 °Bx to 24 °Bx. The dilution period was three hours. After three hours of diluting the samples, the juice was extracted using an extractor with a capacity of 64.10 L/hour. 150 L of rain tree pod juice was extracted and subjected to fermentation using a village level anaerobic bioreactor. Fermentation with yeast (Saccharomyces cerevisiae) can speed up the process, producing more ethanol in a shorter period of time; without yeast, fermentation also produces ethanol, but at a lower volume and a slower rate. Distillation of 150 L of fermented broth was done for six hours at 85 °C to 95 °C (feedstock) and 74 °C to 95 °C at the column head (vapor state of ethanol). The highest volume of ethanol recovered, 14.89 L, was obtained with yeast fermentation at a five-day duration, and the lowest, 11.63 L, was obtained without yeast at a three-day duration. In general, the results suggest that rain tree pods have very good potential as a feedstock for bioethanol production. Fermentation of rain tree pod juice can be done with or without yeast.

Keywords: Fermentation, hydrous bioethanol, rain tree pods, village level.

7984 Craniometric Analysis of Foramen Magnum for Estimation of Sex

Authors: Tanuj Kanchan, Anadi Gupta, Kewal Krishan

Abstract:

The human skull is known to exhibit numerous sexually dimorphic traits. Estimation of sex is a challenging task, especially when only a part of the skull is brought for medicolegal investigation. The present research was planned to evaluate the sexing potential of the dimensions of the foramen magnum in forensic identification by craniometric analysis. The length and breadth of the foramen magnum were measured using Vernier calipers, and the area of the foramen magnum was calculated. The length, breadth, and area of the foramen magnum were found to be larger in males than in females. A sexual dimorphism index was calculated to estimate the sexing potential of each variable. The observations of the study suggest the limited utility of craniometric analysis of the foramen magnum during the examination of the skull and its parts for estimation of sex.
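
A small illustrative calculation only: the abstract does not give the formulas used, so the sketch below assumes the foramen magnum area is approximated by the ellipse formula (pi/4 × length × breadth) and uses one common form of the sexual dimorphism index (male mean relative to female mean); the measurements are hypothetical.

```python
import math

def fm_area(length_mm, breadth_mm):
    """Approximate foramen magnum area as an ellipse: pi/4 * length * breadth (assumption)."""
    return math.pi / 4.0 * length_mm * breadth_mm

def dimorphism_index(male_mean, female_mean):
    """One common sexual dimorphism index: male mean expressed as a percentage of female mean."""
    return male_mean / female_mean * 100.0

# Hypothetical sample means (mm)
male_area = fm_area(36.2, 30.5)
female_area = fm_area(34.1, 28.9)
print(f"male area = {male_area:.1f} mm^2, SDI = {dimorphism_index(male_area, female_area):.1f}")
```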

Keywords: Forensic Anthropology, Skeletal remains, Identification, Sex estimation, Foramen magnum.

7983 'Performance-Based' Seismic Methodology and Its Application in Seismic Design of Reinforced Concrete Structures

Authors: Jelena R. Pejović, Nina N. Serdar

Abstract:

This paper presents an analysis of the "Performance-Based" seismic design method, intended to overcome the perceived disadvantages and limitations of the existing force-based seismic design approach in engineering practice. Bearing in mind the specific nature of the earthquake as a load, and the fact that the seismic resistance of a structure depends solely on its behaviour in the nonlinear range, the traditional seismic design approach based on force and linear analysis is not adequate. The "Performance-Based" seismic design method is based on nonlinear analysis and can be used in everyday engineering practice. This paper presents the application of this method to an eight-story reinforced concrete building with a combined structural system (a reinforced concrete frame system in one direction and a reinforced concrete ductile wall system in the other). Nonlinear time-history analysis is performed on a spatial model of the structure using the program Perform 3D, in which the structure is subjected to forty real earthquake records. For the considered building, a large number of results were obtained. It was concluded that, using this method, structural behavior under earthquakes can be evaluated with a high degree of reliability. Significant differences were obtained in the response of the structure to the various earthquake records. The analysis also showed that the frame structural system did not perform well under earthquake records on soils such as sand and gravel, while the ductile wall system behaved satisfactorily on different soil types.

Keywords: Ductile wall, frame system, nonlinear time-history analysis, performance-based methodology, RC building.

7982 Application of Staining Intensity Correlation Analysis to Visualize Protein Colocalization at a Cellular Level

Authors: Permphan Dharmasaroja

Abstract:

Mutations of the telomeric copy of the survival motor neuron 1 (SMN1) gene cause spinal muscular atrophy. A deletion of the Eef1a2 gene leads to lower motor neuron degeneration in wasted mice. Indirect evidence has suggested that the eEF1A protein family may interact with SMN, and our previous study showed that abnormalities of neuromuscular junctions in wasted mice were similar to those of Smn mutant mice. To determine potential colocalization between SMN and the tissue-specific translation elongation factor 1A2 (eEF1A2), an immunochemical analysis of HeLa cells transfected with the plasmid pcDNA3.1(+)C-hEEF1A2-myc and a new quantitative test of colocalization by intensity correlation analysis (ICA) were used to explore the association of SMN and eEF1A2. The results showed that eEF1A2 redistributed from the cytoplasm to the nucleus in response to serum and epidermal growth factor. In the cytoplasm, compelling evidence showed that staining for myc-tagged eEF1A2 varied in synchrony with that for SMN, consistent with the formation of an SMN-eEF1A2 complex in the cytoplasm of HeLa cells. These findings suggest that eEF1A2 may colocalize with SMN in the cytoplasm and may be a component of the SMN complex. However, a limitation of the ICA method is its inability to resolve colocalization within small organelles such as the nucleus.
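
A minimal sketch of the intensity correlation quotient computation that intensity correlation analysis is commonly based on (Li et al.'s formulation); the pixel intensities here are random stand-ins, and this is an illustration of the statistic, not the authors' code or data.

```python
import numpy as np

def icq(channel_a, channel_b):
    """Intensity correlation quotient: fraction of pixels whose intensities
    vary in synchrony around their means, minus 0.5 (range -0.5 to +0.5)."""
    a = channel_a.ravel() - channel_a.mean()
    b = channel_b.ravel() - channel_b.mean()
    pdm = a * b                       # product of the differences from the mean
    return np.count_nonzero(pdm > 0) / pdm.size - 0.5

rng = np.random.default_rng(7)
smn = rng.random((64, 64))
eef1a2 = 0.7 * smn + 0.3 * rng.random((64, 64))   # partially colocalized stand-in
print(f"ICQ = {icq(smn, eef1a2):.2f}")            # > 0 suggests dependent staining
```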

Keywords: Intensity correlation analysis, intensity correlation quotient.

7981 Real Time Multi-Sensory Force Sensing Mat for Sports Biomechanics and Human Gait Analysis

Authors: D. Gouwanda, S. M. N. A. Senanayake

Abstract:

This paper presents a real time force sensing instrument designed for human gait analysis purposes. It is capable of recording and monitoring the ground reaction forces exerted by the human foot during various activities such as walking, running and jumping in real time. Overall, the instrument mainly consists of three elements: the force sensing mat, a signal conditioning circuit and a data acquisition device. The force sensing mat contains an array of force sensing elements. To control and process the incoming signal from the force sensing mat, Force-Logger and Force-Reloader are developed using National Instruments LabVIEW. This paper describes the architecture of the force sensing mat, the signal conditioning circuit and the real time streaming of the incoming data from the force sensing mat. Additionally, a preliminary experimental dataset is presented in this paper.

Keywords: Force platform, force sensing resistor, human gait analysis.

7980 A Study on a Generic Development Process for the BPM+SOA Design and Implementation

Authors: Toshimi Munehira

Abstract:

In order to optimize annual IT spending and to reduce the complexity of an entire system architecture, SOA trials have been started. It is common knowledge that to design an SOA system we have to adopt a top-down approach, but in reality silo systems are being built, so these companies cannot reuse newly designed services and cannot enjoy SOA's economic benefits. To prevent this situation, we designed a generic SOA development process referred to as the architecture of "mass customization". To define the generic detailed development processes, we carried out a case study on an imaginary company. Through the case study, we could define practical development processes and found that this could vastly reduce update and development costs.

Keywords: SOA, BPM, Generic Model, Mass Customization.

7979 Tidal Data Analysis using ANN

Authors: Ritu Vijay, Rekha Govil

Abstract:

The design of a complete expansion that allows for compact representation of certain relevant classes of signals is a central problem in signal processing applications. Achieving such a representation means knowing the signal features for the purposes of denoising, classification, interpolation and forecasting. Multilayer neural networks are a relatively new class of techniques that are mathematically proven to approximate any continuous function arbitrarily well. Radial Basis Function networks, which make use of a Gaussian activation function, are also shown to be universal approximators. In this age of ever-increasing digitization in the storage, processing, analysis and communication of information, there are numerous examples of applications where one needs to construct a continuously defined function or numerical algorithm to approximate, represent and reconstruct the given discrete data of a signal. Many times one wishes to manipulate the data in a way that requires information not included explicitly in the data, which is done through interpolation and/or extrapolation. Tidal data are a perfect example of a time series, and many statistical techniques have been applied to tidal data analysis and representation; ANNs are a recent addition to such techniques. In the present paper we describe the time series representation capabilities of a special type of ANN, the Radial Basis Function network, and present the results of tidal data representation using RBFs. Tidal data analysis and representation is one of the important requirements in marine science for forecasting.
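
A minimal sketch of representing a tide-like time series with Gaussian radial basis functions fitted by least squares; the synthetic tide, the number of centres and the basis width are assumptions, and the paper's trained RBF network and real tidal records are not reproduced here.

```python
import numpy as np

def fit_rbf(t, y, centers, width):
    """Least-squares fit of Gaussian RBF weights to a time series."""
    phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w, phi

t = np.linspace(0, 48, 200)                          # 48 hours, hypothetical tide gauge
tide = 1.2 * np.sin(2 * np.pi * t / 12.42) + 0.3 * np.sin(2 * np.pi * t / 24.0)
centers = np.linspace(0, 48, 25)
w, phi = fit_rbf(t, tide, centers, width=2.0)
reconstruction = phi @ w
print(f"max reconstruction error = {np.max(np.abs(reconstruction - tide)):.4f} m")
```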

Keywords: ANN, RBF, Tidal Data.

7978 Analysis of Delay and Throughput in MANET for DSR Protocol

Authors: Kumar Manoj, Ramesh Kumar, Kumari Arti

Abstract:

A wireless ad hoc network consists of wireless nodes communicating without the need for centralized administration, in which all nodes potentially contribute to the routing process. In this paper, we report the simulation results of four different scenarios for wireless ad hoc networks having thirty nodes. The performance of the proposed networks is evaluated in terms of number of hops per route, delay and throughput with the help of the OPNET simulator. A channel speed of 1 Mbps and a simulation time of 600 simulated seconds were used for all scenarios, and the DSR routing protocol was employed for the analysis. The throughput obtained from the four scenarios is compared in Figure 3. The average media access delay at node_20 for two routes, and at node_20 for the four different scenarios, is compared in Figures 4 and 5. It is observed that the throughput degrades when different hop counts are followed for the same source-destination pair (i.e., it dropped from 1.55 Mbps to 1.43 Mbps, around 9.7%, and then to 0.48 Mbps, around 35%).

Keywords: Throughput, Delay, DSR, OPNET, MANET, DSSS

7977 On the Analysis and a Few Optimization Issues of a New iCIM 3000 System at an Academic-Research Oriented Institution

Authors: D. R. Delgado Sobrino, R. Holubek, R. Ružarovský

Abstract:

In the past years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies and management and control systems, among many others, which have all evolved the field. Closely following all this, due to the scope of new projects and the need to turn existing flexible ideas into more autonomous and intelligent ones, i.e., moving toward more intelligent manufacturing, the present paper emerges with the main aim of contributing to the analysis and a few customization issues of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. Besides offering a description and analysis of the system and its main parts, some tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered as well. All this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a few figures and expressions which help in obtaining the necessary data. Such data, among others, will be used in the future when simulating the scenarios in the search for the best material flow configurations.

Keywords: Flexible/Intelligent assembly/disassembly cell (F/IA/DC), Flexible/Intelligent Manufacturing Systems/Cell (F/IMS/C), Material Flow Optimization/Combinations/Design (MFO/C/D).

7976 Determination of Surface Roughness by Ball Burnishing Process Using Factorial Techniques

Authors: P. S. Dabeer, G. K. Purohit

Abstract:

Burnishing is a method of finishing and hardening machined parts by plastic deformation of the surface. Experimental work based on a central composite second-order rotatable design has been carried out on a lathe machine to establish the effects of ball burnishing parameters on the surface roughness of a brass material. Analysis of the results by the analysis of variance (ANOVA) technique and the F-test shows that the parameters considered have significant effects on the surface roughness.

Keywords: Ball burnishing, Response surface Methodology.

7975 Comparative Study of the Effects of Process Parameters on the Yield of Oil from Melon Seed (Cococynthis citrullus) and Coconut Fruit (Cocos nucifera)

Authors: Ndidi F. Amulu, Patrick E. Amulu, Gordian O. Mbah, Callistus N. Ude

Abstract:

A comparative analysis of the properties of melon seed and coconut fruit and their oil yields was carried out in this work using standard AOAC analytical techniques. The results of the analysis revealed that the moisture contents of the samples studied are 11.15% (melon) and 7.59% (coconut). The crude lipid contents are 46.10% (melon) and 55.15% (coconut). The treatment combinations used (leaching time, leaching temperature and solute:solvent ratio) showed a significant difference (p < 0.05) in yield between the samples, with melon seed flour having the higher range of oil yield (41.30 – 52.90%) compared to coconut (36.25 – 49.83%). Physical characterization of the extracted oils was also carried out. The values obtained for the refractive index are 1.487 (melon seed oil) and 1.361 (coconut oil), and the viscosities are 0.008 (melon seed oil) and 0.002 (coconut oil). The chemical analysis of the extracted oils shows acid values of 1.00 mg NaOH/g oil (melon oil) and 10.050 mg NaOH/g oil (coconut oil), and saponification values of 187.00 mg KOH/g (melon oil) and 183.26 mg KOH/g (coconut oil). The iodine value of the melon oil is 75.00 mg I2/g and that of the coconut oil is 81.00 mg I2/g. The standard statistical package Minitab version 16.0 was used for the regression analysis and the analysis of variance (ANOVA), and the same software was used to optimize the leaching process. Both samples gave their highest oil yields at the same optimal conditions. The optimal conditions to obtain the highest oil yields, ≥ 52% (melon seed) and ≥ 48% (coconut), are a solute-solvent ratio of 40 g/ml, a leaching time of 2 hours and a leaching temperature of 50 °C. Both samples studied have the potential to yield oil, with melon seed giving the higher yield.

Keywords: Coconut, melon, optimization, processing.

7974 A Methodology for Quality Problems Diagnosis in SMEs

Authors: Humberto N. Teixeira, Isabel S. Lopes, Sérgio D. Sousa

Abstract:

This article proposes a new methodology to be used by SMEs (small and medium enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and help to prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined based on the well-known Juran trilogy for quality management (quality planning, quality control and quality improvement), and predetermined result categories, defined based on the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from the application of the methodology in two case studies.
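
A small illustration of the FMEA step mentioned above, computing risk priority numbers (severity × occurrence × detection) to rank quality problems; the failure modes and scores are hypothetical, not taken from the case studies.

```python
# Hypothetical FMEA worksheet: (failure mode, severity, occurrence, detection)
fmea = [
    ("wrong raw material lot used", 7, 3, 4),
    ("missing inspection record", 4, 6, 2),
    ("machine set-up error", 8, 2, 5),
]

# Risk priority number = severity * occurrence * detection
ranked = sorted(fmea, key=lambda r: r[1] * r[2] * r[3], reverse=True)
for mode, s, o, d in ranked:
    print(f"RPN = {s * o * d:3d}  {mode}")
```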

Keywords: Continuous improvement, Diagnosis, Quality Management, Self-assessment, SMEs

7973 The Application of Regulatory Impact Assessment (RIA) on the Czech Financial Market

Authors: Jana Chvalkovska, Petr Jansky, Petr Teply

Abstract:

Impact assessment in its various forms has recently become a very important part of policy-making and legislation in many different countries. Regulatory impact assessment (RIA) is yet another set of analytical methods deployed in the legislation of the European Union and of many developed countries, as well as in many developing ones such as Mexico, Malaysia and the Philippines. The aim of this paper is to provide a theoretical background for economic models in regulatory impact assessment and an overview of their application, especially on the financial market in the Czech Republic. We found an inadequate application of these models, which leaves room for further research in this field.

Keywords: regulatory impact assessment, RIA, impact evaluation, building societies, Czech Republic

7972 Climatic Change, Drought and Dust Crisis in Iran

Authors: Akram Keramat, Behrouz Marivani, Mehdi Samsami

Abstract:

Drought is a phenomenon caused by environmental and climatic changes and is affected by rainfall shortage and temperature. Dust is one of the important environmental problems caused by climate change and drought. With the recent multi-year drought, many environmental crises caused by dust have arisen in Iran and the Middle East. Dust events occur with high frequency over vast areas of the provinces, creating many problems in terms of health as well as social and economic impacts. In this study, we tried to identify the most important factors causing dust, using satellite images and meteorological data. Finally, strategies to deal with the dust are mentioned.

Keywords: Drought, Environment, Dust.
