Search results for: processing demands.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1859

1379 Elicitation of Requirements for a Knowledge Management Concept in Decentralized Production Planning

Authors: S. Minhas, C. Juzek, U. Berger

Abstract:

Planning in manufacturing systems is becoming more complicated day by day due to expanding networks and a shortage of skilled people to manage change. Consequently, demands for faster lead times and for eco-efficient evaluation of manufacturing products and processes require the exploitation of new and intelligent knowledge management concepts for manufacturing planning. This paper highlights the motivation for incorporating new features into the manufacturing planning system. Furthermore, it elaborates the requirements for the development of an intelligent knowledge management concept to support planning-related decisions. The derived concept is then presented on the basis of two case studies. The first case study is concerned with automotive ramp-up planning. The second case study specifies requirements for a knowledge management system to support decisions in the eco-efficient evaluation of manufacturing products and processes.

Keywords: Ramp-up, Environmental impact, Knowledge management.

Downloads: 1828
1378 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT), and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled with a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein is capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI before they were employed in the multi-station model. In the proposed feature selection method, some uninformative sub-series were omitted to achieve better performance. The results confirm the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
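The abstract names the EEMD, DWT, MI and LSSVM building blocks without showing how they chain together, so the following minimal sketch illustrates one plausible way to wire such a pipeline in Python. It assumes the third-party PyEMD and PyWavelets packages, uses scikit-learn's mutual_info_regression for the MI-based feature selection, and substitutes an epsilon-SVR for the LSSVM (scikit-learn has no native LSSVM); the gauge data are synthetic placeholders, not the Souris River records.

```python
# Hypothetical EEMD + DWT + MI feature selection feeding an SVR regressor
# (a stand-in for the LSSVM used in the paper).
import numpy as np
import pywt
from PyEMD import EEMD
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(0)
discharge = np.sin(np.linspace(0, 20, 512)) + 0.3 * rng.standard_normal(512)
target = np.roll(discharge, -1)          # predict next-step discharge

# 1) EEMD decomposition into intrinsic mode functions (IMFs).
imfs = EEMD(trials=50).eemd(discharge)   # shape: (n_imfs, n_samples)

# 2) DWT of each IMF; keep approximation coefficients resampled to the signal length.
features = []
for imf in imfs:
    approx, _detail = pywt.dwt(imf, "db4")
    features.append(np.interp(np.arange(512), np.linspace(0, 511, len(approx)), approx))
X = np.column_stack(features)

# 3) MI-based feature selection: drop sub-series carrying little information about the target.
mi = mutual_info_regression(X[:-1], target[:-1])
X_sel = X[:, mi > np.median(mi)]

# 4) Fit the regressor (SVR here; the paper uses LSSVM).
model = SVR(C=10.0, gamma="scale").fit(X_sel[:-1], target[:-1])
print("last prediction:", model.predict(X_sel[-1:]).item())
```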

Keywords: River stage-discharge process, LSSVM, discrete wavelet transform (DWT), ensemble empirical decomposition mode (EEMD), multi-station modeling.

Downloads: 622
1377 Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project

Authors: V. Balanica, W. I. D. Rae, M. Caramihai, S. Acho, C. P. Herbst

Abstract:

Mammographic image and data analysis to facilitate modelling or computer aided diagnosis (CAD) software development is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction for research could be performed from a single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the future application of the data in CAD processing. Envisaged future developments include an advanced search function that selects image files based on descriptor combinations, whose results can be used for specific CAD processing and other research, as well as a user-friendly configuration utility for importing the required fields from the DICOM files.
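As a rough illustration of the DICOM-header-to-database integration the abstract describes, the sketch below reads a few header fields with the pydicom package and stores them in a SQLite table; the field choices, table name, and folder path are illustrative assumptions, not the authors' schema.

```python
# Minimal sketch: pull selected DICOM header fields into a relational table.
import sqlite3
from pathlib import Path
import pydicom

conn = sqlite3.connect("mammo_research.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS images (
           path TEXT PRIMARY KEY, patient_id TEXT, study_date TEXT,
           modality TEXT, laterality TEXT, view_position TEXT)"""
)

for dcm_path in Path("mammograms").glob("*.dcm"):   # hypothetical folder of DICOM files
    ds = pydicom.dcmread(dcm_path, stop_before_pixels=True)
    row = (
        str(dcm_path),
        str(ds.get("PatientID", "")),
        str(ds.get("StudyDate", "")),
        str(ds.get("Modality", "")),
        str(ds.get("ImageLaterality", "")),
        str(ds.get("ViewPosition", "")),
    )
    conn.execute("INSERT OR REPLACE INTO images VALUES (?, ?, ?, ?, ?, ?)", row)
conn.commit()

# Example descriptor-combination query resembling the envisaged search function:
print(conn.execute(
    "SELECT path FROM images WHERE laterality = 'L' AND view_position = 'MLO'").fetchall())
```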

Keywords: Database Integration, Mammogram Classification, Tumour Classification, Computer Aided Diagnosis.

Downloads: 1921
1376 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalography (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under the cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.

Keywords: Brain-machine interface, EEGLAB, Emotiv EEG Neuroheadset, OpenViBE, Simulink.

Downloads: 2775
1375 Critical Issues Affecting the Engagement by Staff in Professional Development for E-Learning: Findings from a Research Project within the Context of a National Tertiary Education Sector

Authors: J. Mansvelt, G. Suddaby, D. O'Hara

Abstract:

This paper focuses on issues of engagement by staff in professional development related to the delivery of e-learning. The paper reports on findings drawn from a New Zealand research project which is producing a sector-wide framework for professional development in tertiary e-learning. The research findings indicate that staff engaged in e-learning in tertiary institutions are not making the most effective use of the professional development opportunities available to them; rather, they seem to gain their knowledge and support through a variety of informal means. This is despite an emphasis on the provision of professional development opportunities by both government policies and the institutions themselves. The conclusion drawn from the findings is that institutional approaches to professional development for e-learning do not yet fully reflect the demands and constraints that working in a digital context imposes.

Keywords: Academic development, e-learning, engagement, professional development, tertiary education.

Downloads: 1421
1374 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite for lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans containing nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined using the patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are derived from this new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is accurate and efficient and, compared to the standard method, can directly provide explicit lung regions without any post-processing operations.
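To make the boundary-penalty modification concrete, here is a minimal numpy sketch of how a patch-based dissimilarity between neighboring pixels can replace a plain intensity difference when building edge weights; the patch size, the Gaussian weighting function, and the 4-neighborhood are illustrative assumptions rather than the authors' exact formulation.

```python
# Sketch: patch-based boundary weights for a graph-cut energy (assumed formulation).
import numpy as np

def patch(img, y, x, r):
    """Return the (2r+1)x(2r+1) patch centred on (y, x) from a padded image."""
    return img[y:y + 2 * r + 1, x:x + 2 * r + 1]

def boundary_weights(img, r=2, sigma=20.0):
    """Weights for 4-connected neighbors: high when patches are similar,
    low across likely boundaries (exp of negative mean squared patch difference)."""
    padded = np.pad(img.astype(float), r, mode="reflect")
    h, w = img.shape
    right = np.zeros((h, w))   # weight to the pixel on the right
    down = np.zeros((h, w))    # weight to the pixel below
    for y in range(h):
        for x in range(w):
            p = patch(padded, y, x, r)
            if x + 1 < w:
                d = np.mean((p - patch(padded, y, x + 1, r)) ** 2)
                right[y, x] = np.exp(-d / (2 * sigma ** 2))
            if y + 1 < h:
                d = np.mean((p - patch(padded, y + 1, x, r)) ** 2)
                down[y, x] = np.exp(-d / (2 * sigma ** 2))
    return right, down

# Tiny synthetic CT-like slice: a dark "parenchyma" region on a bright background.
img = np.full((32, 32), 200.0)
img[8:24, 8:24] = 40.0
right_w, down_w = boundary_weights(img)
print("weight inside region:", right_w[16, 12], "weight across boundary:", right_w[16, 23])
```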

Keywords: Graph cuts, lung CT scan, lung parenchyma segmentation, patch based similarity metric.

Downloads: 712
1373 Colour Stability of Wild Cactus Pear Juice

Authors: Kgatla T. E., Howard S. S., Hiss D. C.

Abstract:

Prickly pear (Opuntia spp.) fruit has received renewed interest for juice production since it contains a betalain pigment with an attractive purple colour. Prickly pear juice was prepared by homogenizing the fruit and treating the pulp with 48 g of pectinase from Aspergillus niger. Titratable acidity was determined by diluting 10 ml of prickly pear juice with 90 ml of deionized water and titrating to pH 8.2 with 0.1 N NaOH. Brix was measured using a refractometer, and ascorbic acid content was assayed spectrophotometrically. Colour variation was determined colorimetrically (Hunter L.a.b.). Hunter L.a.b. analysis showed that the red-purple colour of prickly pear juice was affected by the juice treatments, as indicated by low colour difference meter lightness (CDML*), hue, CDMa* and CDMb* values. Non-treated prickly pear juice had a higher CDML* of 3.9 compared to the treated juices (range 3.29 to 2.14), and CDML* decreased significantly (p<0.05) as the juice was preserved. Spectrophotometric colour analysis showed that browning was low in all treated prickly pear juice samples, as indicated by high values at 540 nm and low values at 476 nm (browning index). The brightness of the prickly pear juice was affected by acidification compared to the other juice treatments. This study presents evidence that processing has a positive effect on the colour quality attribute, offering a clear advantage for the production of red-purple prickly pear juice.

Keywords: Colour, Hunter L.a.b, Prickly pear juice, processing, physicochemical.

Downloads: 2796
1372 Groin Configurations: An Approach towards Stable Lowland Rivers with Improved Environmental Functions

Authors: M. Alauddin, T. Tsujimoto

Abstract:

The dynamics of stream channels, along with environmental concerns, are the key issues to address in lowland rivers such as the Jamuna in Bangladesh. Groins are important structures for attaining an improved river environment, but their effective functioning is not yet evident with the present design. Considering the present demands, an approach based on modifying groin configurations is proposed so that the structures function in a more natural way in dynamic lowland rivers. Four different configurations, including the conventional one, are considered in the study, and the changes in hydro- and morpho-dynamics induced by the various structures are investigated in the laboratory. Results show that the modified combined groin favors gradual deceleration of flow towards the channel side and noticeably minimizes local scour. This favors a stable, regular channel and improves environmental functions.

Keywords: Lowland river, dynamicity, river environment, groin configuration, local scour.

Downloads: 2219
1371 Investigation of Wintering and Breeding Habitat Selection by Asiatic Houbara Bustard (Chlamydotis macqueenii) in the Central Steppe of Iran

Authors: S. Aghainajafi Zadeh, M. R. Hemami, F. Heydari

Abstract:

The Asiatic Houbara (Chlamydotis macqueenii) is a flagship and vulnerable species. In-situ conservation of this threatened species demands knowledge of its habitat selection. The aim of this study was to determine the habitat variables influencing the birds' wintering and breeding site selection in the semi-arid central steppe of Iran. Habitat features of the detected nest and pellet sites were compared with paired and random plots by quantifying a number of habitat variables. For wintering habitat use at the micro scale, houbara selected sites where vegetation cover was significantly lower compared to control sites (p<0.001). Areas with a low number of larger plant species (p=0.03) that were not too close to a vegetation patch (p<0.001) were selected as breeding habitat.

Keywords: Asiatic houbara bustard, Habitat selection, Nest, pellet.

Downloads: 1503
1370 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the operational information system; however, at the OLAP processing level these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and they facilitate corporate decision making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and Big Data, in this way seeking to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for pattern generation and for models derived from the structured fact objects.

Keywords: Data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse.

Downloads: 1456
1369 Public Participation in Sustainable Urban Planning

Authors: M. P. Amado, C. V. Santos, E. B. Moura, V.G. Silva

Abstract:

Urban planning, particularly in protected landscape areas, demands an increasing role for public participation within an efficient, sustainable planning process. The development of urban planning actions in protected landscape areas such as the Sintra-Cascais Natural Park should follow a methodological process structured over distinct sequential stages, providing continuous, interactive, integrated and participative planning. From the start of the Malveira da Serra and Janes Plan process, several public participation actions were promoted in order to involve local agents, stakeholders and the population in the decisions on specific local key issues and to define the appropriate priorities within the goals and strategies previously settled. As a result, public participation encouraged an innovative process that guarantees the efficiency of sustainable urban planning and promotes a sustainable new way of living in community.

Keywords: Protected landscape areas, Public participation, Sustainable development, Sustainable planning, Urban planning.

Downloads: 2851
1368 Potential of Salvia sclarea L. for Phytoremediation of Soils Contaminated with Heavy Metals

Authors: Violina R. Angelova, Radka V. Ivanova, Givko M. Todorov, Krasimir I. Ivanov

Abstract:

A field study was conducted to evaluate the efficacy of Salvia sclarea L. for the phytoremediation of contaminated soils. The experiment was performed on agricultural fields contaminated by the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. The content of heavy metals in different parts of Salvia sclarea L. (roots, stems, leaves and inflorescences) was determined by ICP. The essential oil of Salvia sclarea L. was obtained by steam distillation under laboratory conditions, analyzed for heavy metals, and its chemical composition was determined. Salvia sclarea L. is tolerant to heavy metals and can be grown on contaminated soils. Based on the obtained results and the most common criteria, Salvia sclarea L. can be classified as a Pb hyperaccumulator and a Cd and Zn accumulator; therefore, this plant has suitable potential for the phytoremediation of heavy metal contaminated soils. It is also favorable that heavy metals influence neither the development of Salvia sclarea L. nor the quality and quantity of the essential oil. For clary sage oil obtained from processing clary sage grown on highly contaminated soils, the key odour-determining ingredients meet the quality requirements of the European Pharmacopoeia and BS ISO 7609 for Bulgarian clary sage oil, or have values close to the limits of these standards. The possibility of further industrial processing will make Salvia sclarea L. an economically interesting crop for farmers applying phytoextraction technology.

Keywords: Clary sage, heavy metals, phytoremediation, polluted soils.

Downloads: 1793
1367 Diagnosing Dangerous Arrhythmia of Patients by Automatic Detecting of QRS Complexes in ECG

Authors: Jia-Rong Yeh, Ai-Hsien Li, Jiann-Shing Shieh, Yen-An Su, Chi-Yu Yang

Abstract:

In this paper, an automatic QRS complex detection algorithm was applied to analyze ECG recordings, and five criteria for diagnosing dangerous arrhythmia were applied in a prototype automatic arrhythmia diagnosing system. The detection algorithm identified the distribution of QRS complexes in the ECG recordings and related information, such as heart rate and RR interval. In this investigation, twenty sampled ECG recordings of patients with different pathologic conditions were collected for off-line analysis. A combined application of four digital filters for improving ECG signal quality and raising the QRS detection rate was proposed as pre-processing. Both hardware filters and digital filters were applied to eliminate different types of noise mixed with the ECG recordings. The automatic detection algorithm was then applied to verify the distribution of QRS complexes. Finally, quantitative clinical criteria for diagnosing arrhythmia were programmed into a practical application for automatic arrhythmia diagnosis as a post-processor. The results of the automatic dangerous arrhythmia diagnoses were compared with off-line diagnoses by experienced clinical physicians, and the comparison showed a matching rate of 95% against an experienced physician's diagnoses.
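The paper does not reproduce its detection algorithm here, so the following is a generic Pan-Tompkins-style sketch (band-pass filter, differentiate, square, integrate, peak-pick) showing how QRS complexes, RR intervals and heart rate can be extracted from a single-lead ECG; the sampling rate, filter band and thresholds are assumptions, not the paper's settings.

```python
# Generic QRS detection sketch (Pan-Tompkins-like), not the paper's exact algorithm.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs=360.0):
    # 1) Band-pass filter around the QRS energy band (assumed 5-15 Hz).
    b, a = butter(2, [5.0, 15.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, ecg)
    # 2) Differentiate, square, and integrate over a ~150 ms moving window.
    squared = np.diff(filtered, prepend=filtered[0]) ** 2
    window = int(0.15 * fs)
    integrated = np.convolve(squared, np.ones(window) / window, mode="same")
    # 3) Peak picking with a 200 ms refractory period and a simple amplitude threshold.
    peaks, _ = find_peaks(integrated,
                          height=0.3 * integrated.max(),
                          distance=int(0.2 * fs))
    return peaks

# Synthetic ECG-like signal: impulses at 1 Hz plus noise (stand-in for a recording).
fs = 360.0
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(t)
ecg[((np.arange(10) + 0.5) * fs).astype(int)] = 1.0
ecg += 0.02 * np.random.default_rng(1).standard_normal(t.size)

r_peaks = detect_qrs(ecg, fs)
rr = np.diff(r_peaks) / fs                 # RR intervals in seconds
print("beats:", len(r_peaks), "mean HR (bpm):", 60.0 / rr.mean())
```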

Keywords: Signal processing, electrocardiography (ECG), QRS complex, arrhythmia.

Downloads: 1494
1366 Well-Being in Adolescence: Fitting Measurement Model

Authors: Azlina Abu Bakar, Abdul Fatah Wan Sidek

Abstract:

Well-being has been given special emphasis in quality of life. It involves living a meaningful life, life satisfaction, stability and happiness. Well-being also concerns the satisfaction of the physical, psychological and social needs and demands of an individual. The purpose of this study was to validate a three-factor measurement model of well-being using structural equation modeling (SEM). The conception of well-being was measured along the dimensions of physical, psychological and social well-being. The study was based on a total sample of 650 adolescents from the east coast of peninsular Malaysia. The Well-Being Scales adapted from [1] were used in this study. The items were hypothesized a priori to have nonzero loadings on all dimensions in the model. The SEM findings demonstrated a good fitting model in which the proposed model fits the driving theory (χ²/df = 1.268; GFI = .994; CFI = .998; TLI = .996; p = .255; RMSEA = .021). Composite reliability (CR) was .93 and average variance extracted (AVE) was 58%. The model fits the sample data, and well-being is important for bringing sustainable development into the mainstream.
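For readers unfamiliar with the fit statistics quoted above, the short snippet below compares the reported values against commonly cited cutoff conventions (e.g., GFI/CFI/TLI above .95, RMSEA below .06, χ²/df below 3); these cutoffs are general rules of thumb, not thresholds taken from this paper.

```python
# Check the reported SEM fit indices against common rule-of-thumb cutoffs.
reported = {"chi2_df": 1.268, "GFI": 0.994, "CFI": 0.998,
            "TLI": 0.996, "RMSEA": 0.021}

# (index, cutoff, True if "larger is better")
cutoffs = [("chi2_df", 3.0, False), ("GFI", 0.95, True), ("CFI", 0.95, True),
           ("TLI", 0.95, True), ("RMSEA", 0.06, False)]

for name, cut, larger_is_better in cutoffs:
    value = reported[name]
    ok = value >= cut if larger_is_better else value <= cut
    print(f"{name:7s} = {value:5.3f}  cutoff {'>=' if larger_is_better else '<='} {cut}: "
          f"{'pass' if ok else 'fail'}")
```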

Keywords: Adolescence, Structural Equation Modeling, Sustainable Development, Well-Being.

Downloads: 3044
1365 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images

Authors: Jameela Ali Alkrimi, Loay E. George, Azizah Suliman, Abdul Rahim Ahmad, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a deficiency of RBCs, is characterized by a hemoglobin level below the normal range. In this study, an image processing-based methodology was developed to localize and extract RBCs from microscopic images, and a machine learning approach was adopted to classify the localized anemic RBC images. Several textural and geometrical features were calculated for each extracted RBC, and the training set of features was analyzed using principal component analysis (PCA). With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. PCA was chosen for its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. Our classifiers yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, support vector machine (SVM), and RBF neural network (RBFNN), respectively. Classification was also evaluated in terms of sensitivity, specificity, and the kappa statistic. In conclusion, the classification results were obtained within a short time period, and the results improved when PCA was used.
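The classification stage described above maps naturally onto standard scikit-learn components; the sketch below is a hedged reconstruction of such a pipeline on synthetic feature vectors, using PCA followed by K-NN and SVM, with an RBF-kernel SVM as a rough stand-in for the paper's RBFNN (scikit-learn has no dedicated RBF network). Feature counts and parameters are illustrative only.

```python
# Sketch: PCA-reduced textural/geometrical features fed to K-NN and SVM classifiers.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for extracted RBC feature vectors (e.g., texture + geometry).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=2, random_state=0)

classifiers = {
    "K-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM (linear)": SVC(kernel="linear"),
    "SVM (RBF, stand-in for RBFNN)": SVC(kernel="rbf", gamma="scale"),
}

for name, clf in classifiers.items():
    # PCA keeps enough components to explain 95% of the variance.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=0.95), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name:30s} mean CV accuracy = {scores.mean():.3f}")
```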

Keywords: Red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC.

Downloads: 3167
1364 Design of Compliant Mechanism Based Microgripper with Three Finger Using Topology Optimization

Authors: R. Bharanidaran, B. T. Ramesh

Abstract:

High precision in motion is required to manipulate micro objects in precision industries for micro assembly, cell manipulation, etc. Precision manipulation is achieved through appropriate mechanism design of micro devices such as microgrippers, and a compliant mechanism is the better option for achieving highly precise and controlled motion. This article highlights a method for designing a compliant three-fingered microgripper suitable for holding asymmetric objects. A topology optimization technique, a systematic method, is implemented in this work to arrive at a topologically optimized design of the mechanism needed to perform the required micro motion of the gripper. Topology optimization has the drawback of generating senseless regions, such as node-to-node connectivity and staircase effects at the boundaries, so post-processing of the design is required to make it manufacturable. To reduce the effort of the post-processing stage and to preserve the edges of the image, a cubic spline interpolation technique is introduced in the MATLAB program. The structural performance of the topologically optimized mechanism design is tested using finite element method (FEM) software, and the microgripper structure is further examined for its fatigue life and vibration characteristics.
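The cubic-spline smoothing step is only named in the abstract, so the snippet below gives a minimal illustration of the idea: a jagged, staircase-like closed boundary, such as one extracted from a voxelized topology-optimization result, is re-parameterized and smoothed with a periodic cubic spline. It uses SciPy rather than the authors' MATLAB code, and the boundary points are invented for the example.

```python
# Sketch: smoothing a staircase-like closed boundary with a periodic cubic spline.
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical jagged boundary points (as might come from a voxelized topology result).
boundary = np.array([[0, 0], [2, 0], [2, 1], [4, 1], [4, 3],
                     [3, 3], [3, 4], [1, 4], [1, 2], [0, 2]], dtype=float)
closed = np.vstack([boundary, boundary[:1]])      # close the loop

# Parameterize by cumulative chord length and fit a periodic spline through (x(t), y(t)).
seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
t = np.concatenate([[0.0], np.cumsum(seg)])
spline = CubicSpline(t, closed, bc_type="periodic")

t_fine = np.linspace(0, t[-1], 200)
smooth = spline(t_fine)                           # smoothed boundary, shape (200, 2)
print("original vertices:", len(boundary), "-> smoothed samples:", smooth.shape[0])
```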

Keywords: Compliant mechanism, Cubic spline interpolation, FEM, Topology optimization.

Downloads: 3551
1363 AI-Driven Cloud Security: Proactive Defense Against Evolving Cyber Threats

Authors: Ashly Joseph

Abstract:

Cloud computing has become an essential component of enterprises and organizations globally in the current era of digital technology. The cloud has a multitude of advantages, including scalability, flexibility, and cost-effectiveness, rendering it an appealing choice for data storage and processing. The increasing storage of sensitive information in cloud environments has raised significant concerns over the security of such systems, and the frequency of cyber threats and attacks specifically aimed at cloud infrastructure has been increasing, presenting substantial dangers to the data, reputation, and financial stability of enterprises. Conventional security methods can become inadequate when confronted with increasingly intricate and dynamic threats. Artificial Intelligence (AI) technologies possess the capacity to significantly transform cloud security through their ability to promptly identify and thwart attacks, adjust to emerging risks, and offer intelligent perspectives for proactive security actions. The objective of this research study is to investigate the utilization of AI technologies in augmenting the security measures within cloud computing systems. This paper aims to offer significant insights and recommendations for businesses seeking to protect their cloud-based assets by analyzing the present state of cloud security, the capabilities of AI, and the possible advantages and obstacles associated with integrating AI into cloud security policies.

Keywords: Machine Learning, Natural Language Processing, Denial-of-Service attacks, Sentiment Analysis, Cloud computing.

Downloads: 62
1362 Current Starved Ring Oscillator Image Sensor

Authors: Devin Atkin, Orly Yadid-Pecht

Abstract:

The continual demands for increasing resolution and dynamic range in complementary metal-oxide-semiconductor (CMOS) image sensors have resulted in exponential increases in the amount of data that needs to be read out of an image sensor, and existing readouts cannot keep up with this demand. Interesting approaches such as sparse and burst readouts have been proposed and show promise, but at considerable trade-offs in other specifications. To this end, we have begun designing and evaluating various readout topologies centered around an attempt to parallelize the sensor readout. In this paper, we have designed, simulated, and started testing a light-controlled oscillator topology with dual column and row readouts. We expect the parallel readout structure to offer greater speed and alleviate the trade-off typical of this topology, where slow pixels present a major framerate bottleneck.

Keywords: CMOS image sensors, high-speed capture, wide dynamic range, light controlled oscillator.

Downloads: 118
1361 Short-Term Electric Load Forecasting Using Multiple Gaussian Process Models

Authors: Tomohiro Hachino, Hitoshi Takata, Seiji Fukushima, Yasutaka Igarashi

Abstract:

This paper presents Gaussian process model-based short-term electric load forecasting. The Gaussian process model is a nonparametric model whose output has a Gaussian distribution with mean and variance. Multiple Gaussian process models, each serving as a predictor for one hour-ahead horizon, are used to forecast future electric load demand up to 24 hours ahead in accordance with the direct forecasting approach. A separable least-squares approach that combines the linear least-squares method and a genetic algorithm is applied to train these Gaussian process models. Simulation results demonstrate the effectiveness of the proposed electric load forecasting.
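The direct multi-model strategy (one Gaussian process per forecast horizon) can be sketched with scikit-learn as below; this is a hedged illustration that trains the GPs by maximizing the marginal likelihood rather than by the paper's separable least-squares/genetic-algorithm scheme, and the load series is synthetic.

```python
# Sketch: direct-approach load forecasting with one GP model per hour-ahead horizon.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
hours = np.arange(24 * 14)                                   # two weeks of hourly load
load = 100 + 20 * np.sin(2 * np.pi * hours / 24) + 3 * rng.standard_normal(hours.size)

lags = 24                                                    # past 24 hours as inputs
X = np.stack([load[i:i + lags] for i in range(load.size - lags - 24)])

models = {}
for h in range(1, 25):                                       # horizons 1..24 hours ahead
    y = load[lags + h - 1: lags + h - 1 + X.shape[0]]
    kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0)
    models[h] = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Forecast selected horizons from the latest 24 observations, with uncertainty.
x_last = load[-lags:].reshape(1, -1)
for h in (1, 12, 24):
    mean, std = models[h].predict(x_last, return_std=True)
    print(f"h+{h:2d}: {mean[0]:6.1f} +/- {std[0]:.1f}")
```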

Keywords: Direct method, electric load forecasting, Gaussian process model, genetic algorithm, separable least-squares method.

Downloads: 1953
1360 Identifying and Prioritizing Factors Affecting Consumer Behavior Based on Product Value

Authors: Houshang Taghizadeh, Gholamreza Soltani Fesghandis

Abstract:

Nowadays, without awareness and a correct understanding of consumer behavior, it is not possible for organizations to take appropriate measures to meet consumer needs and demands. The aim of this paper is the identification and prioritization of the factors affecting consumer behavior based on product value. The population of the study includes all consumers of furniture producing firms in East Azarbaijan province, Iran. The research sample includes 93 people selected using the sampling formula for an unlimited population. The data collection instrument was a questionnaire, whose face validity was confirmed and whose reliability was determined using Cronbach's alpha coefficient. The Kolmogorov-Smirnov test was used to test data normality, the t-test to identify the factors affecting product value, and the Friedman test to prioritize the factors. The results show that quality, satisfaction, styling, price, finishing operation, performance, safety, worth, shape, use, and excellence are ranked from 1 to 11, respectively.
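As an illustration of the statistical workflow named above (one-sample t-tests to screen factors, then a Friedman test to rank them), the snippet below runs both on invented questionnaire ratings with SciPy; the data, scale midpoint, and factor names are purely illustrative and a subset of the paper's eleven factors.

```python
# Sketch: screening factors with a one-sample t-test, then ranking them with Friedman's test.
import numpy as np
from scipy.stats import friedmanchisquare, rankdata, ttest_1samp

rng = np.random.default_rng(0)
factors = ["quality", "satisfaction", "styling", "price", "performance"]
# Invented 5-point Likert ratings: 93 respondents x 5 factors.
ratings = np.clip(rng.normal(loc=[4.2, 4.0, 3.8, 3.6, 3.4], scale=0.8,
                             size=(93, len(factors))), 1, 5)

# 1) One-sample t-test against the scale midpoint (3) to keep influential factors.
for name, col in zip(factors, ratings.T):
    t, p = ttest_1samp(col, popmean=3.0)
    print(f"{name:12s} mean={col.mean():.2f}  t={t:6.2f}  p={p:.4f}")

# 2) Friedman test across factors, then prioritize by mean rank of the ratings.
stat, p = friedmanchisquare(*[ratings[:, i] for i in range(len(factors))])
mean_rank = rankdata(-ratings, axis=1).mean(axis=0)   # rank 1 = most valued factor
order = np.argsort(mean_rank)
print(f"Friedman chi2={stat:.2f}, p={p:.4f}")
print("priority order:", [factors[i] for i in order])
```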

Keywords: Consumer Behavior, Consumer Satisfaction, Product, Value

Downloads: 2869
1359 Mucosal-Submucosal Changes in Rabbit Duodenum during Development

Authors: Elnasharty M. A., Abou-Ghanema I. I., Sayed-Ahmed A., A. Abo Elnour

Abstract:

The sequential morphologic changes of the rabbit duodenal mucosa-submucosa were studied from the primordial stage to birth in 15 fetuses, and during the early days of life through to maturity in 21 rabbit newborns, using light, scanning and transmission electron microscopy. The fetal rabbit duodenum develops from a simple tube of stratified epithelium into a tube containing villus and intervillus regions of simple columnar epithelium. By day 21 of gestation the first rudimentary villi appeared, and by day 24 the first true villi appeared. The crypts of Lieberkuhn did not appear until birth, and the duodenal glands appeared by the first day of postnatal life. Histological maturity of the rabbit small intestine occurred one month after birth. In conclusion, at all stages the morphologic changes of the rabbit small intestine develop to meet structural and physiological demands during the fetal period and to prepare the intestine for extrauterine life.

Keywords: Duodenum, mucosa, submucosa, morphogenesis, rabbit.

Downloads: 2383
1358 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences

Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng

Abstract:

Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during interaction, the Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events, respectively. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
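The trajectory-analysis step can be illustrated with a few lines of numpy: given two tracked trajectories, compute speeds from frame-to-frame derivatives and the inter-person distance, then flag an interaction when both people slow down while close together. The thresholds and the rule itself are invented for illustration, not the paper's classifier.

```python
# Sketch: speed/distance-based interaction flagging for two tracked trajectories.
import numpy as np

def classify_interaction(traj_a, traj_b, fps=30.0, near=60.0, slow=40.0):
    """traj_a, traj_b: (n_frames, 2) pixel positions of two tracked people.
    Flags an interaction when the pair is close and both move slowly."""
    speed_a = np.linalg.norm(np.diff(traj_a, axis=0), axis=1) * fps   # px/s
    speed_b = np.linalg.norm(np.diff(traj_b, axis=0), axis=1) * fps
    dist = np.linalg.norm(traj_a[1:] - traj_b[1:], axis=1)            # px
    interacting = (dist < near) & (speed_a < slow) & (speed_b < slow)
    return "interaction" if interacting.mean() > 0.3 else "non-interaction"

# Toy trajectories: two people approach each other, stop close by, then stay put.
t = np.linspace(0, 1, 90)[:, None]
approach = np.minimum(t / 0.6, 1.0)            # move for 60% of the clip, then stop
person_a = np.hstack([100 + 180 * approach, np.full_like(t, 240)])
person_b = np.hstack([500 - 180 * approach, np.full_like(t, 240)])
print(classify_interaction(person_a, person_b))   # expected: "interaction"
```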

Keywords: Motion detection, motion tracking, trajectory analysis, video surveillance.

Downloads: 1696
1357 Processing the Medical Sensors Signals Using Fuzzy Inference System

Authors: S. Bouharati, I. Bouharati, C. Benzidane, F. Alleg, M. Belmahdi

Abstract:

Sensors measure several kinds of physical quantities. Whether they are devices that convert a sensed signal into an electrical signal, chemical sensors or biosensors, all of them can be considered an interface between the physical quantity and the electrical equipment. The problem is the analysis of the multitude of recorded settings as input variables, since they do not all have the same level of influence on the outputs. It is therefore necessary to identify the most sensitive parameters, those that can guide users in gathering information in the field and in the processes of model calibration and sensitivity analysis for the effect of each change made. The mathematical models used for such processing become very complex. In this paper, a fuzzy rule-based system is proposed as a solution to this problem. The system collects the available signal information from the sensors and allows the study of the influence of the various factors that take part in the decision system. Since its inception, fuzzy set theory has been regarded as a formalism suitable for dealing with the imprecision intrinsic to many problems, and fuzzy sets also allow the use of symbolic models. In this study, an example is applied to a variety of physiological parameters that define the human health state, as an aid to medical diagnosis. The inputs are signals expressing cardiovascular and respiratory system parameters such as blood pressure; once the system is built, it is able to predict the state of the patient for any input values.
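To make the fuzzy rule-based idea concrete, here is a self-contained Mamdani-style sketch in plain numpy (triangular membership functions, min for rule firing, max aggregation, centroid defuzzification) that maps an invented blood-pressure reading and heart rate to a risk score; the variables, membership ranges and rules are illustrative assumptions, not the paper's rule base.

```python
# Minimal Mamdani-style fuzzy inference sketch (illustrative rules, not the paper's).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def diagnose(systolic_bp, heart_rate):
    # Fuzzify the two inputs (membership degrees in [0, 1]).
    bp_normal = tri(systolic_bp, 90, 115, 140)
    bp_high   = tri(systolic_bp, 130, 170, 210)
    hr_normal = tri(heart_rate, 50, 70, 95)
    hr_high   = tri(heart_rate, 85, 120, 160)

    # Rule firing strengths (AND = min, OR = max).
    low_risk  = min(bp_normal, hr_normal)
    high_risk = max(min(bp_high, hr_high), bp_high)   # high BP alone is already risky

    # Aggregate clipped output sets over a 0-100 risk universe and defuzzify (centroid).
    risk = np.linspace(0, 100, 501)
    aggregated = np.maximum(np.minimum(tri(risk, 0, 20, 50), low_risk),
                            np.minimum(tri(risk, 50, 80, 100), high_risk))
    return np.sum(risk * aggregated) / (np.sum(aggregated) + 1e-9)

print("risk score:", round(diagnose(systolic_bp=160, heart_rate=110), 1))
```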

Keywords: Sensors, sensitivity, fuzzy logic, analysis, physiological parameters, medical diagnosis.

Downloads: 1943
1356 Power Distance and Knowledge Management from a Post-Taylorist Perspective

Authors: John Walton, Vishal Parikh

Abstract:

Contact centres have been exemplars of scientific management in the discipline of operations management for more than a decade. With the movement of industries from a resource-based economy to a knowledge-based economy, businesses have started to realize that customer centricity is the key to sustainability amidst the high velocity of the market. However, as technologies have converged and advanced, so have the contact centres. Contact centres have redirected supply chains, and the concept of retailing has been greatly diminished due to the over-emphasis on cost reduction strategies. In conditions of high environmental velocity, together with services featuring considerable information intensity, contact centres will require up-to-date and enlightened agents to satisfy the demands placed upon them by those requesting their services. In this paper we examine salient factors such as power distance, knowledge structures and the dynamics of job specialisation and enlargement to suggest critical success factors in the domain of contact centres.

Keywords: Post Taylorism, Knowledge Management, Power Distance, Organisational Learning

Downloads: 1833
1355 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data

Authors: P. Kaladevi, N. Giridharan

Abstract:

The system for analyzing and eliciting public grievances serves its main purpose of receiving and processing all sorts of complaints from the public and responding to users. The growing number of complaints produces big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of a cache was applied in the system to provide immediate response and timely action using big data analytics; cache-enabled big data improves the response time of the system. The unstructured data provided by the users is efficiently handled through the MapReduce algorithm, and the processing of complaints takes place in the order of the hierarchy of authority. The drawbacks of the traditional database system used in the existing system are overcome in our system by using a cache-enabled Hadoop Distributed File System. MapReduce framework code has the potential to leak sensitive data through the computation process, so we propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If complaints are not processed in ample time, they are automatically forwarded to the higher authority, which ensures that processing is assured. A copy of the filed complaint is sent as a digitally signed PDF document to the user's e-mail address, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
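The abstract combines three mechanisms — a map/reduce aggregation of complaints, a cache for repeated queries, and noise added to the reduce-phase output — and the toy sketch below strings them together in plain Python with functools.lru_cache and Laplace noise; it is a conceptual stand-in, not the Hadoop/HDFS implementation the paper describes.

```python
# Toy map/reduce over complaints with a cached query layer and a noisy reduce output.
import numpy as np
from collections import defaultdict
from functools import lru_cache

complaints = (  # hypothetical (department, text) records standing in for HDFS data
    ("water", "no supply since Monday"), ("roads", "pothole on main street"),
    ("water", "pipe burst"), ("power", "outage in ward 7"), ("water", "low pressure"),
)

def map_phase(records):
    """Map: emit a (department, 1) pair for every complaint."""
    return [(dept, 1) for dept, _text in records]

def reduce_phase(pairs, epsilon=1.0):
    """Reduce: count complaints per department, then add Laplace noise to the counts
    so the exact presence of sensitive records is not signaled."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    rng = np.random.default_rng(0)
    return {k: v + rng.laplace(0.0, 1.0 / epsilon) for k, v in counts.items()}

@lru_cache(maxsize=128)
def department_report(dept):
    """Cached query layer: repeated requests for the same report skip recomputation."""
    noisy_counts = reduce_phase(map_phase(complaints))
    return round(noisy_counts.get(dept, 0.0), 2)

print(department_report("water"))   # computed once ...
print(department_report("water"))   # ... served from the cache on the second call
```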

Keywords: Big Data, Hadoop, HDFS, Caching, MapReduce, web personalization, e-governance.

Downloads: 1565
1354 U.S. Nuclear Regulatory Commission Training for Research and Training Reactor Inspectors

Authors: Gary Marlin Sandquist

Abstract:

Currently, a large number of licensing activities (Early Site Permits, Combined Operating Licenses, reactor certifications, etc.) are pending review before the United States Nuclear Regulatory Commission (US NRC). Much of the senior staff at the NRC is now committed to these review and licensing actions. To address this additional workload, the NRC has recruited a large number of new regulatory staff for dealing with these and other regulatory actions, such as the US fleet of Research and Test Reactors (RTRs). These reactors pose unusual demands on regulatory staff since the US fleet of RTRs, although small (32 licensed RTRs as of 2010), represents a broad range of reactor types, operations, and research and training aspects that nuclear power plants (such as the 104 LWRs) do not pose. The NRC must inspect and regulate all these facilities. This paper addresses selected training topics and regulatory activities provided to NRC inspectors for RTRs.

Keywords: Regulations, Research and Test Reactors, Training, US NRC

Downloads: 1612
1353 Comparison between Higher-Order SVD and Third-order Orthogonal Tensor Product Expansion

Authors: Chiharu Okuma, Jun Murakami, Naoki Yamamoto

Abstract:

In digital signal processing it is important to approximate multi-dimensional data by the method called rank reduction, in which the rank of the multi-dimensional data is reduced from higher to lower. For 2-dimensional data, singular value decomposition (SVD) is one of the best-known rank reduction techniques. Additionally, an outer product expansion extended from SVD was proposed and implemented for multi-dimensional data, and it has been widely applied to image processing and pattern recognition. However, the multi-dimensional outer product expansion has high computational complexity and lacks orthogonality between the expansion terms. We therefore proposed an alternative method, the Third-order Orthogonal Tensor Product Expansion (3-OTPE for short), which uses the power method instead of a nonlinear optimization method to decrease computing time. At the same time, the group of De Lathauwer proposed Higher-Order SVD (HOSVD), which is also developed as an SVD extension for multi-dimensional data. 3-OTPE and HOSVD are similar with respect to the rank reduction of multi-dimensional data; using these two methods we can obtain computation results that are partly the same and partly slightly different. In this paper, we compare 3-OTPE to HOSVD in terms of calculation accuracy and computing time, and clarify the difference between these two methods.
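For readers unfamiliar with HOSVD, the numpy sketch below builds a truncated HOSVD of a third-order tensor from the SVDs of its mode unfoldings and reports the reconstruction error; it is a textbook-style illustration, not the comparison code used in the paper.

```python
# Sketch: truncated higher-order SVD (HOSVD) of a 3rd-order tensor via mode unfoldings.
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_dot(tensor, matrix, mode):
    """Multiply a tensor by a matrix along the given mode."""
    moved = np.moveaxis(tensor, mode, 0)
    out = np.tensordot(matrix, moved, axes=([1], [0]))
    return np.moveaxis(out, 0, mode)

def truncated_hosvd(tensor, ranks):
    """Return the core and factor matrices of a rank-(r1, r2, r3) HOSVD approximation."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _s, _vt = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):
        core = mode_dot(core, u.T, mode)
    return core, factors

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 9, 10))
core, factors = truncated_hosvd(X, ranks=(4, 4, 4))

# Reconstruct from the truncated core and factors and measure the approximation error.
approx = core
for mode, u in enumerate(factors):
    approx = mode_dot(approx, u, mode)
print("relative error:", np.linalg.norm(X - approx) / np.linalg.norm(X))
```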

Keywords: Singular value decomposition (SVD), higher-order SVD (HOSVD), higher-order tensor, outer product expansion, power method.

Downloads: 1532
1352 Objects Extraction by Cooperating Optical Flow, Edge Detection and Region Growing Procedures

Authors: C. Lodato, S. Lopes

Abstract:

The image segmentation method described in this paper has been developed as a pre-processing stage for methodologies and tools for content-based video/image indexing and retrieval. The method solves the problem of extracting whole objects from the background, producing images of single complete objects from videos or photos. The extracted images are used to calculate the object visual features necessary for both the indexing and the retrieval processes. The segmentation algorithm is based on the cooperation among an optical flow evaluation method, edge detection and region growing procedures. The optical flow estimator belongs to the class of differential methods. It can detect motions ranging from a fraction of a pixel to a few pixels per frame, achieving good results in the presence of noise without the need for a filtering pre-processing stage, and it includes a specialised model for moving object detection. The first task of the presented method exploits cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and seeded region growing procedures, respectively. All the tasks are performed iteratively until objects and background are completely resolved. The method has been applied to a variety of indoor and outdoor scenes where objects of different types and shapes are represented on variously textured backgrounds.

Keywords: Image Segmentation, Motion Detection, Object Extraction, Optical Flow

Downloads: 1737
1351 An Experimentally Validated Thermo- Mechanical Finite Element Model for Friction Stir Welding in Carbon Steels

Authors: A. H. Kheireddine, A. A. Khalil, A. H. Ammouri, G. T. Kridli, R. F. Hamade

Abstract:

Solidification cracking and hydrogen cracking are among the defects generated in the fusion welding of ultrahigh carbon steels. However, friction stir welding (FSW) of such steels, being a solid-state technique, has been demonstrated to alleviate the problems encountered in traditional welding. FSW involves different process parameters that must be carefully defined prior to processing, including but not restricted to tool feed, tool RPM, tool geometry and tool tilt angle. These parameters are a key factor in avoiding wormholes and voids behind the tool and in achieving a defect-free weld. More importantly, they directly affect the microstructure of the weld and hence its final mechanical properties. For that purpose, a 3D thermo-mechanical finite element (FE) model was developed using DEFORM 3D to simulate FSW of carbon steel. At points of interest in the joint, the history of critical state variables such as temperature, stresses, and strain rates is tracked. Typical results include the ability to simulate the different weld zones. The simulation predictions were successfully compared to experimental FSW tests. It is believed that such a numerical model can be used to optimize FSW processing parameters in favor of a desirable defect-free weld with better mechanical properties.

Keywords: Carbon Steels, DEFORM 3D, FEM, Friction stir welding.

Downloads: 2546
1350 Numerical Modeling of Direct Shear Tests on Sandy Clay

Authors: R. Ziaie Moayed, S. Tamassoki, E. Izadi

Abstract:

Investigation of sandy clay behavior is important since urban development means that sandy clay areas are increasingly encountered, especially for transportation infrastructure. This paper presents the results of a finite element analysis of the direct shear test (under three vertical loads of 44, 96 and 192 kPa) and discusses the effects of parameters such as cohesion, friction angle and Young's modulus on the shear strength of sandy clay. The numerical model was calibrated against the experimental results of large-scale direct shear tests. The results show that the shear strength increased with increasing friction angle and cohesion; however, the shear strength was not influenced by raising the friction angle at a normal stress of 44 kPa. The effect of different Young's modulus values on the stress-strain curve was also investigated.
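Since the abstract discusses how cohesion and friction angle drive shear strength, a short worked example of the Mohr-Coulomb relation τ = c + σn·tan(φ) at the three test normal stresses may help; the cohesion and friction angle values used below are assumed for illustration, not the paper's calibrated parameters.

```python
# Mohr-Coulomb shear strength at the three normal stresses used in the tests.
import math

c_kpa = 15.0          # assumed cohesion (kPa), illustrative only
phi_deg = 28.0        # assumed friction angle (degrees), illustrative only

for sigma_n in (44.0, 96.0, 192.0):
    tau = c_kpa + sigma_n * math.tan(math.radians(phi_deg))
    print(f"sigma_n = {sigma_n:5.0f} kPa  ->  tau = {tau:6.1f} kPa")
```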

Keywords: Shear strength, Finite element analysis, Large direct shear test, Sandy clay.

Downloads: 5451