Search results for: ordinal response models

760 Towards a Deeper Understanding of 21st Century Global Terrorism

Authors: Francis Jegede

Abstract:

This paper examines essential issues relating to the rise and nature of violent extremism involving non-state actors and groups in the early 21st century. The global trends in terrorism and violent extremism are examined in relation to Western governments’ counter-terror operations. The paper analyses the existing legal framework for fighting violent extremism and terrorism and highlights the inherent limitations of the current International Law of War in dealing with the growing challenges posed by terrorist and violent extremist groups. The paper discusses how terrorist groups use civilians, women and children as tools and weapons of war to fuel their campaign of terror, and suggests ways in which the international community could deal with the challenge of fighting terrorist groups without putting civilians, women and children in harm’s way. The paper emphasises the need to uphold human rights values and respect for the law of war in our response to global terrorism. The paper poses the question of whether the current legal framework for dealing with terrorist groups is sufficient without contravening the essential provisions and ethos of the International Law of War and Human Rights. While the paper explains how terrorist groups flagrantly disregard the rule of law and disrespect human rights in their campaign of terror, it also notes instances in which the current Western strategy in fighting terrorism may be viewed or considered as conflicting with human rights and international law.

Keywords: Terrorism, law of war, international law, violent extremism.

759 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables. Past occurrences are exploited to predict and to derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when applied to large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, big data cannot be analyzed efficiently with classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
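
As a rough illustration of the parallelization idea behind such an extension (not the authors' algorithm), the sketch below distributes the evaluation of CART split candidates for one explanatory variable across worker processes; the data and thresholds are synthetic.

```python
# Minimal sketch: parallel evaluation of CART split candidates with
# multiprocessing. Illustrative only -- not the authors' extended algorithm.
import numpy as np
from multiprocessing import Pool

def gini(labels):
    """Gini impurity of a label array."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def eval_chunk(args):
    """Best (threshold, weighted Gini) within one chunk of candidate thresholds."""
    x, y, thresholds = args
    n = len(y)
    best = (None, np.inf)
    for t in thresholds:
        left, right = y[x <= t], y[x > t]
        imp = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
        if imp < best[1]:
            best = (t, imp)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=100_000)                               # one explanatory variable
    y = (x + rng.normal(scale=0.5, size=x.size) > 0).astype(int)   # target labels
    thresholds = np.quantile(x, np.linspace(0.01, 0.99, 200))
    chunks = np.array_split(thresholds, 8)                     # one work unit per process
    with Pool() as pool:
        results = pool.map(eval_chunk, [(x, y, c) for c in chunks])
    best_t, best_imp = min(results, key=lambda r: r[1])
    print(f"best split at x <= {best_t:.3f} (weighted Gini {best_imp:.3f})")
```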

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

758 Top Management Support as an Enabling Factor for Academic Innovation through Knowledge Sharing

Authors: Sawsan J. Al-husseini, Talib A. Dosa

Abstract:

Educational institutions are today facing increasing pressures due to economic, political and social upheaval. This is only exacerbated by the nature of education as an intangible good which relies upon the intellectual assets of the organisation: its staff. Top management support has been acknowledged as having a positive general influence on knowledge management and creativity. However, there is a lack of models linking top management support, knowledge sharing, and innovation within higher education institutions (HEIs) in developing countries in general, and in Iraq in particular. This research sought to investigate the impact of top management support on innovation through the mediating role of knowledge sharing in Iraqi private HEIs. A quantitative approach was taken and 262 valid responses were collected to test the causal relationships between top management support, knowledge sharing, and innovation. Employing structural equation modelling with AMOS v.25, the research demonstrated that knowledge sharing plays a pivotal role in the relationship between top management support and innovation. The research has produced some guidelines for researchers as well as leaders, and provided evidence to support the use of knowledge sharing to increase innovation within the higher education environment in developing countries, particularly Iraq.
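
The study itself fits a structural equation model in AMOS; purely as an illustration of the mediation logic (indirect effect = a × b), the following sketch runs a regression-based mediation analysis on synthetic scores with hypothetical variable names.

```python
# Minimal regression-based mediation sketch (indirect effect = a * b) on
# synthetic data; the study itself used structural equation modelling in AMOS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 262                                   # sample size reported in the study
tms = rng.normal(size=n)                  # top management support (hypothetical scores)
ks = 0.6 * tms + rng.normal(scale=0.8, size=n)                # knowledge sharing (mediator)
innov = 0.5 * ks + 0.1 * tms + rng.normal(scale=0.8, size=n)  # innovation

a = sm.OLS(ks, sm.add_constant(tms)).fit()                            # path a: TMS -> KS
b = sm.OLS(innov, sm.add_constant(np.column_stack([ks, tms]))).fit()  # paths b and c'
indirect = a.params[1] * b.params[1]      # mediated (indirect) effect of TMS on innovation
print(f"a = {a.params[1]:.2f}, b = {b.params[1]:.2f}, indirect effect = {indirect:.2f}")
```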

Keywords: Top management support, knowledge sharing, innovation, structural equation modelling.

757 Market Segmentation and Conjoint Analysis for Apple Family Design

Authors: Abbas Al-Refaie, Nour Bata

Abstract:

A distributor of Apple products experiences numerous difficulties in developing marketing strategies for new and existing mobile product entries that maximize customer satisfaction and the firm's profitability. This research, therefore, integrates market segmentation in platform-based product family design and conjoint analysis to identify iSystem combinations that increase customer satisfaction and business profits. First, the enhanced market segmentation grid is created. Then, the estimated demand model is formulated. Finally, the profit models are constructed and then used to determine the ideal product family design that maximizes profit. Conjoint analysis is used to explore customer preferences together with their satisfaction levels. A total of 200 surveys on customer preferences are collected. Then, simulation is used to determine the importance values for each attribute. Finally, sensitivity analysis is conducted to determine the product family design that maximizes both objectives. In conclusion, the results of this research should provide strong support to Apple distributors in determining the best marketing strategies to enhance their market share.
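
As a hedged illustration of the conjoint step, the sketch below estimates part-worth utilities by dummy-coded regression on synthetic ratings; the attribute names and levels are hypothetical, not those of the actual survey.

```python
# Illustrative conjoint-analysis sketch: part-worth utilities estimated by
# dummy-coded regression on synthetic ratings (attribute names hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
profiles = pd.DataFrame({
    "storage": rng.choice(["64GB", "128GB", "256GB"], size=200),
    "screen": rng.choice(["small", "large"], size=200),
    "price": rng.choice(["low", "mid", "high"], size=200),
})
# synthetic preference ratings (1-9 scale) built from assumed attribute utilities
utility = (profiles["storage"].map({"64GB": 0, "128GB": 1, "256GB": 2})
           + profiles["screen"].map({"small": 0, "large": 1})
           - profiles["price"].map({"low": 0, "mid": 1, "high": 2}))
profiles["rating"] = 5 + utility + rng.normal(scale=0.5, size=200)

model = smf.ols("rating ~ C(storage) + C(screen) + C(price)", data=profiles).fit()
print(model.params)   # part-worth utilities relative to the baseline level of each attribute
# relative attribute importance can then be taken as the range of part-worths per attribute
```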

Keywords: Market segmentation, conjoint analysis, market strategies, optimization.

756 3DARModeler: a 3D Modeling System in Augmented Reality Environment

Authors: Trien V. Do, Jong-Weon Lee

Abstract:

This paper describes a 3D modeling system in an Augmented Reality environment, named 3DARModeler. It can be considered a simple version of 3D Studio Max with the necessary functions for a modeling system, such as creating objects, applying textures, adding animation, estimating real light sources and casting shadows. The 3DARModeler introduces convenient and effective human-computer interaction for building 3D models by combining both the traditional input method (mouse/keyboard) and the tangible input method (markers). It has the ability to align a new virtual object with the existing parts of a model. The 3DARModeler targets non-technical users. As such, they do not need much knowledge of computer graphics and modeling techniques. All they have to do is select basic objects, customize their attributes, and put them together to build a 3D model in a simple and intuitive way, as if they were doing so in the real world. Using the hierarchical modeling technique, users are able to group several basic objects and manage them as a unified, complex object. The system can also connect with other 3D systems by importing and exporting VRML/3ds Max files. A speech recognition module is included in the system to provide flexible user interfaces.
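
A minimal sketch of the hierarchical (scene-graph) modeling idea, in which grouped basic objects are transformed as one complex object, is shown below; it is illustrative only and not the 3DARModeler code.

```python
# Minimal scene-graph sketch of the hierarchical modeling idea: grouping basic
# objects under a parent node so they can be transformed as one complex object.
# Illustrative only; not the 3DARModeler implementation.
import numpy as np

class Node:
    def __init__(self, name, local_transform=None):
        self.name = name
        self.local = np.eye(4) if local_transform is None else local_transform
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def world_transforms(self, parent_world=np.eye(4)):
        """Yield (name, world transform) for this node and all descendants."""
        world = parent_world @ self.local
        yield self.name, world
        for child in self.children:
            yield from child.world_transforms(world)

def translate(tx, ty, tz):
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

house = Node("house")                               # unified complex object
house.add(Node("wall", translate(0, 0, 0)))
house.add(Node("roof", translate(0, 2, 0)))
house.local = translate(5, 0, 0)                    # moving the group moves all parts
for name, world in house.world_transforms():
    print(name, world[:3, 3])
```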

Keywords: 3D Modeling, Augmented Reality, Geometric Modeling, Virtual Reality.

755 AI-Based Approaches for Task Offloading, Resource Allocation and Service Placement of IoT Applications: State of the Art

Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib

Abstract:

In order to support the continued growth and critical latency of IoT applications and to overcome various obstacles of traditional data centers, Mobile Edge Computing (MEC) has emerged as a promising solution that extends cloud data processing and decision making to edge devices. By adopting a MEC structure, IoT applications can be executed locally, on an edge server, on different fog nodes, or in distant cloud data centers. However, we are often faced with conflicting optimization criteria, such as minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage and bandwidth) of mobile edge devices while maintaining high performance (reducing response time, increasing throughput and service availability). Achieving one goal may affect the other, making Task Offloading (TO), Resource Allocation (RA) and Service Placement (SP) complex processes. Studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. The paper provides a survey of recent Multi-Objective Optimization (MOO) approaches to TO, SP and RA used in edge computing environments, particularly Artificial Intelligence (AI) based ones, to satisfy the various objectives, constraints and dynamic conditions of IoT applications.
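
As a minimal illustration of the underlying trade-off (not any of the surveyed AI approaches), the sketch below scores hypothetical placement options with a normalised weighted sum of latency and device energy.

```python
# Minimal weighted-sum sketch of the energy/latency trade-off behind task
# offloading decisions; placement options and numbers are hypothetical, and the
# surveyed AI-based approaches are considerably more sophisticated.
candidates = {
    #                latency (ms)   energy at device (J)
    "local device":  {"latency": 120.0, "energy": 2.0},
    "edge server":   {"latency": 40.0,  "energy": 0.6},
    "cloud":         {"latency": 90.0,  "energy": 0.4},
}

def score(option, w_latency=0.5, w_energy=0.5):
    """Lower is better: normalised weighted sum of the two conflicting criteria."""
    max_lat = max(c["latency"] for c in candidates.values())
    max_en = max(c["energy"] for c in candidates.values())
    return (w_latency * option["latency"] / max_lat
            + w_energy * option["energy"] / max_en)

best = min(candidates, key=lambda k: score(candidates[k]))
print("chosen placement:", best)
```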

Keywords: Mobile Edge Computing, Multi-Objective Optimization, Artificial Intelligence Approaches, Task Offloading, Resource Allocation, Service Placement.

754 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter

Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal

Abstract:

Regulatory bodies have proposed limits on Particulate Matter (PM) concentration in air; however, these limits do not explicitly incorporate the effects of the toxicities of the constituents of PM. This study aimed to provide a structured approach for incorporating the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework consists of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents, information on the size and shape of PM, and the fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt has been made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment to determine limits on PM. The identified data gaps were: (1) concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of the toxicity of PM with its components.
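
A minimal sketch of the risk-estimation step for a single hypothetical constituent, computing the hazard quotient as average daily dose divided by reference dose, is shown below; all numbers are placeholders rather than values from the study.

```python
# Minimal sketch of the risk-estimation step for one hypothetical PM2.5 metal
# constituent: hazard quotient = average daily dose / reference dose.
# All numbers below are placeholders, not values from the study.
concentration_ug_m3 = 0.05         # constituent concentration in air
inhalation_rate_m3_day = 20.0      # adult inhalation rate
body_weight_kg = 70.0
reference_dose_mg_kg_day = 3.0e-4  # toxicity reference value (hypothetical)

average_daily_dose = (concentration_ug_m3 * 1e-3 * inhalation_rate_m3_day) / body_weight_kg
hazard_quotient = average_daily_dose / reference_dose_mg_kg_day
print(f"ADD = {average_daily_dose:.2e} mg/kg-day, HQ = {hazard_quotient:.2f}")
# HQ > 1 would flag a potential non-carcinogenic health concern.
```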

Keywords: Air, component-specific toxicity, human health risks, particulate matter.

753 A Study of Panel Logit Model and Adaptive Neuro-Fuzzy Inference System in the Prediction of Financial Distress Periods

Authors: Ε. Giovanis

Abstract:

The purpose of this paper is to present two different approaches to financial distress pre-warning models appropriate for risk supervisors, investors and policy makers. We examine a sample of the financial institutions and electronic companies of the Taiwan Stock Exchange (TSE) market from 2002 through 2008. We present a binary logistic regression with panel data analysis. With the pooled binary logistic regression we build a model including more variables in the regression than with random effects, while the in-sample and out-of-sample forecasting performance is higher with random effects estimation than with pooled regression. On the other hand, we estimate an Adaptive Neuro-Fuzzy Inference System (ANFIS) with Gaussian and Generalized Bell (Gbell) membership functions and find that ANFIS significantly outperforms the logit regressions in both in-sample and out-of-sample periods, indicating that ANFIS is a more appropriate tool for financial risk managers and for economic policy makers in central banks and national statistical services.
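
For illustration of the pre-warning model idea only, the sketch below fits a pooled binary logit on synthetic firm-year data with hypothetical ratios; the paper additionally estimates random-effects logit and ANFIS models on real TSE data.

```python
# Minimal pooled binary logit sketch on synthetic firm-year data, illustrating
# the pre-warning model idea; ratios and names are hypothetical, and the paper
# additionally estimates random-effects logit and ANFIS models.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
debt_ratio = rng.uniform(0.1, 0.9, n)            # hypothetical financial ratios
roa = rng.normal(0.05, 0.08, n)
linpred = -2.0 + 4.0 * debt_ratio - 10.0 * roa
distress = (rng.uniform(size=n) < 1 / (1 + np.exp(-linpred))).astype(int)

X = sm.add_constant(np.column_stack([debt_ratio, roa]))
model = sm.Logit(distress, X).fit(disp=False)
print(model.params)                              # pooled-logit coefficients
in_sample_hit = ((model.predict(X) > 0.5) == distress).mean()
print(f"in-sample classification rate: {in_sample_hit:.2%}")
```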

Keywords: ANFIS, Binary logistic regression, Financial distress, Panel data.

752 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers

Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice

Abstract:

The telecommunications sector in the global market has been undergoing constant change and development in recent years. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of utmost significance in this sector since it causes considerable business loss, and many companies carry out various studies in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed with the aim of obtaining the feature reducts used to predict customer churn. The framework is a cost-based, optional pre-processing stage that removes redundant features for churn management. This cost-based feature selection algorithm is applied to a telecommunication company in Turkey, and the results obtained with the algorithm are presented.
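
The paper's reducts come from a decision-theoretic rough set model; as a much simpler illustration of cost-sensitive selection, the sketch below greedily adds the feature with the best accuracy-gain-per-cost ratio on synthetic churn data.

```python
# Minimal greedy sketch of cost-sensitive feature selection: add the feature
# with the best (accuracy gain / acquisition cost) ratio at each step.
# Illustrative only; the paper uses a decision-theoretic rough set reduct.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n, p = 1000, 6
X = rng.normal(size=(n, p))
churn = (X[:, 0] - 0.8 * X[:, 2] + rng.normal(scale=1.0, size=n) > 0).astype(int)
costs = np.array([1.0, 5.0, 2.0, 4.0, 1.0, 3.0])     # hypothetical per-feature costs

def acc(feats):
    """Cross-validated accuracy using only the given feature indices."""
    return cross_val_score(LogisticRegression(), X[:, feats], churn, cv=3).mean()

selected, remaining, best_acc = [], list(range(p)), 0.5
while remaining:
    gains = {j: (acc(selected + [j]) - best_acc) / costs[j] for j in remaining}
    j_best = max(gains, key=gains.get)
    if gains[j_best] <= 0:
        break
    selected.append(j_best)
    remaining.remove(j_best)
    best_acc = acc(selected)
print("selected features:", selected, "accuracy:", round(best_acc, 3))
```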

Keywords: Churn prediction, data mining, decision-theoretic rough set, feature selection.

751 Objective Assessment of Psoriasis Lesion Thickness for PASI Scoring using 3D Digital Imaging

Authors: M.H. Ahmad Fadzil, Hurriyatul Fitriyah, Esa Prakasa, Hermawan Nugroho, S.H. Hussein, Azura Mohd. Affandi

Abstract:

Psoriasis is a chronic inflammatory skin condition which affects 2-3% of the population around the world. The Psoriasis Area and Severity Index (PASI) is the gold standard for assessing psoriasis severity as well as treatment efficacy. Although a gold standard, PASI is rarely used because it is tedious and complex. In practice, the PASI score is determined subjectively by dermatologists; therefore, inter- and intra-observer variations in assessment can occur even among expert dermatologists. This research develops an algorithm to assess psoriasis lesions for PASI scoring objectively. The focus of this research is thickness assessment, one of the four PASI parameters besides area, erythema and scaliness. Psoriasis lesion thickness is measured by averaging the total elevation from the lesion base to the lesion surface. Thickness values from 122 3D images taken from 39 patients are grouped into 4 PASI thickness scores using K-means clustering. Validation of the lesion base construction is performed using twelve body curvature models and shows good results, with a coefficient of determination (R2) equal to 1.
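
A minimal sketch of the scoring step, grouping (synthetic) lesion thickness values into four PASI thickness scores with K-means, is given below.

```python
# Minimal sketch of grouping lesion elevation values into four PASI thickness
# scores with K-means, as in the paper; the thickness values here are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# synthetic lesion thickness measurements (mm) for 122 hypothetical 3D images
thickness_mm = np.concatenate([
    rng.normal(0.15, 0.05, 40), rng.normal(0.45, 0.08, 40),
    rng.normal(0.90, 0.10, 30), rng.normal(1.60, 0.15, 12),
]).reshape(-1, 1)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(thickness_mm)
# order clusters by their centre so cluster index maps to PASI thickness score 1..4
order = np.argsort(kmeans.cluster_centers_.ravel())
score_of_cluster = {c: s + 1 for s, c in enumerate(order)}
pasi_scores = np.array([score_of_cluster[c] for c in kmeans.labels_])
print("cluster centres (mm):", np.sort(kmeans.cluster_centers_.ravel()).round(2))
print("images per thickness score:", np.bincount(pasi_scores)[1:])
```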

Keywords: 3D digital imaging, base construction, PASI, psoriasis lesion thickness.

750 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features

Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan

Abstract:

Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients with heart diseases, there is a need to develop computer-aided detection/diagnosis (CAD) systems to assist cardiologists in interpreting heart sounds and provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification using unsegmented phonocardiogram (PCG) signals. A support vector machine (SVM), an artificial neural network (ANN) and a Cartesian genetic programming evolved artificial neural network (CGPANN) are explored in this study without the application of any segmentation algorithm. The signals are first pre-processed to remove any unwanted frequencies. Both time- and frequency-domain features are then extracted for training the different models. The different algorithms are tested in multiple scenarios and their strengths and weaknesses are discussed. Results indicate that the SVM outperforms the rest with an accuracy of 73.64%.
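
As a simplified illustration (the study's actual pre-processing, feature set and PCG recordings differ), the sketch below extracts a few time- and frequency-domain features from synthetic signals and cross-validates an SVM.

```python
# Minimal sketch: simple time- and frequency-domain features from unsegmented
# signals feeding an SVM classifier. The study's actual feature set, filtering
# and PCG recordings differ; the signals below are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def features(signal, fs=2000):
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    return [signal.std(), np.mean(np.abs(signal)),        # time-domain features
            centroid, spectrum.max() / spectrum.sum()]    # frequency-domain features

rng = np.random.default_rng(6)
X, y = [], []
for label in (0, 1):                        # 0 = normal, 1 = abnormal (synthetic classes)
    for _ in range(100):
        t = np.arange(0, 2.0, 1 / 2000)
        f0 = 40 if label == 0 else 90       # crude spectral difference between classes
        sig = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.normal(size=t.size)
        X.append(features(sig))
        y.append(label)

acc = cross_val_score(SVC(kernel="rbf"), np.array(X), np.array(y), cv=5).mean()
print(f"cross-validated accuracy: {acc:.2%}")
```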

Keywords: Pattern recognition, machine learning, computer aided diagnosis, heart sound classification, and feature extraction.

749 Global Kinetics of Direct Dimethyl Ether Synthesis Process from Syngas in Slurry Reactor over a Novel Cu-Zn-Al-Zr Slurry Catalyst

Authors: Zhen Chen, Haitao Zhang, Weiyong Ying, Dingye Fang

Abstract:

The direct synthesis of dimethyl ether (DME) from syngas in slurry reactors is considered promising because of its advantages in heat transfer. In this paper, the influences of operating conditions (temperature, pressure and weight hourly space velocity) on the conversion of CO and the selectivity of DME and methanol were studied in a stirred autoclave over a Cu-Zn-Al-Zr slurry catalyst, which is far more suitable for the liquid-phase dimethyl ether synthesis process than commercial bifunctional catalysts. A Langmuir-Hinshelwood-type global kinetics model for liquid-phase direct DME synthesis, based on methanol synthesis models and a methanol dehydration model, has been investigated by fitting our experimental data. The model parameters were estimated with a MATLAB program based on a genetic algorithm and the Levenberg-Marquardt method; the model fits the experimental data well, and its reliability was verified by statistical tests and residual error analysis.
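
As a much-simplified illustration of the parameter-estimation step (the paper's Langmuir-Hinshelwood expressions and GA + LM scheme are more involved), the sketch below fits a first-order rate constant to synthetic conversion data with Levenberg-Marquardt least squares.

```python
# Minimal sketch of the parameter-estimation step: fitting a simple first-order
# rate constant to synthetic conversion-vs-time data with Levenberg-Marquardt
# least squares (scipy). The paper's Langmuir-Hinshelwood global kinetics model
# and the GA + LM estimation of its parameters are considerably more involved.
import numpy as np
from scipy.optimize import curve_fit

def conversion(t, k):
    """First-order CO conversion X(t) = 1 - exp(-k t); illustrative form only."""
    return 1.0 - np.exp(-k * t)

t_data = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 8.0])          # residence time, h (hypothetical)
x_data = conversion(t_data, 0.4) + 0.01 * np.random.default_rng(7).normal(size=t_data.size)

popt, pcov = curve_fit(conversion, t_data, x_data, p0=[0.1])  # LM for an unconstrained fit
print(f"estimated k = {popt[0]:.3f} 1/h  (std. error {np.sqrt(pcov[0, 0]):.3f})")
```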

Keywords: Alcohol/ether fuel, Cu-Zn-Al-Zr slurry catalyst, global kinetics, slurry reactor.

748 Effect of Environmental Factors on Photoreactivation of Microorganisms under Indoor Conditions

Authors: Shirin Shafaei, James R. Bolton, Mohamed Gamal El Din

Abstract:

Ultraviolet (UV) disinfection causes damage to the DNA or RNA of microorganisms, but many microorganisms can repair this damage after exposure to near-UV or visible wavelengths (310–480 nm) by a mechanism called photoreactivation. Photoreactivation is gaining more attention because it can reduce the efficiency of UV disinfection of wastewater several hours after treatment. The focus of much photoreactivation research on single species has left a considerable gap in knowledge about complex natural communities of microorganisms and their response to UV treatment. In this research, photoreactivation experiments were carried out on the influent of the UV disinfection unit at a municipal wastewater treatment plant (WWTP) in Edmonton, Alberta, after exposure to a medium-pressure (MP) UV lamp system, to evaluate the effect of environmental factors on the photoreactivation of microorganisms in actual municipal wastewater. The effects of reactivation fluence, temperature, and river water on the photoreactivation of total coliforms were examined under indoor conditions. The results showed that higher effective reactivation fluence values (up to 20 J/cm2) and higher temperatures (up to 25 °C) increased the photoreactivation of total coliforms. However, increasing the percentage of river water in the mixtures of effluent and river water decreased the photoreactivation of the mixtures. The results of this research can help the municipal wastewater treatment industry to examine the environmental effects of discharging their effluents into receiving waters.

Keywords: Photoreactivation, reactivation fluence, river water, temperature, ultraviolet disinfection, wastewater effluent.

747 Wear Measuring and Wear Modelling Based On Archard, ASTM, and Neural Network Models

Authors: A. Shebani, C. Pislaru

Abstract:

Wear measurement and wear modelling are fundamental issues in the industrial field, mainly in relation to economy and safety; therefore, there is a need to study wear measurement and wear estimation. The pin-on-disc test is the most common test used to study wear behaviour. In this paper, a pin-on-disc rig (AEROTECH UNIDEX 11) is used to investigate the effects of normal load and material hardness on wear under dry sliding conditions. In the pin-on-disc rig, two specimens were used: a steel pin with a tip, positioned perpendicular to the disc, and an aluminium disc. The pin wear and disc wear were measured using the following instruments: a Talysurf profilometer, a digital microscope, and an Alicona instrument. The Talysurf profilometer was used to measure the pin/disc wear scar depth, the digital microscope was used to measure the diameter and width of the wear scar, and the Alicona was used to measure the pin wear and disc wear. The Archard model, the American Society for Testing and Materials (ASTM) model, and a neural network model were then used for pin/disc wear modelling, with the simulations implemented in MATLAB. This paper focuses on how the Alicona can be used for wear measurements and how a neural network can be used for wear estimation.
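
The Archard relation used for such wear estimates is V = K·F·s/H; a small sketch with hypothetical values follows.

```python
# Small sketch of the Archard wear equation used for pin/disc wear estimates:
# V = K * F * s / H. The numbers are hypothetical, not the measured test values.
def archard_wear_volume(K, load_N, sliding_distance_m, hardness_Pa):
    """Wear volume in m^3: dimensionless wear coefficient K, normal load F,
    sliding distance s and hardness H of the softer surface."""
    return K * load_N * sliding_distance_m / hardness_Pa

K = 1.0e-4                    # hypothetical dimensionless wear coefficient
load = 20.0                   # N
distance = 500.0              # m of sliding
hardness = 1.0e9              # Pa (softer aluminium disc)

V = archard_wear_volume(K, load, distance, hardness)
print(f"predicted wear volume: {V * 1e9:.2f} mm^3")
```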

Keywords: Wear measuring, Wear modelling, Neural Network, Alicona.

746 Advance in Monitoring and Process Control of Surface Roughness

Authors: Somkiat Tangjitsitcharoen, Siripong Damrongthaveesak

Abstract:

This paper presents an advance in the monitoring and process control of surface roughness in CNC machines for the turning and milling processes. An integration of in-process monitoring and process control of surface roughness during machining is proposed and developed using the cutting force ratio. The author's previously developed surface roughness models for turning and milling are adopted to predict the in-process surface roughness; they consist of the cutting speed, the feed rate, the tool nose radius, the depth of cut, the rake angle, and the cutting force ratio. The cutting force ratios obtained from turning and milling are utilized to estimate the in-process surface roughness. Dynamometers are installed on the tool turret of a CNC turning machine and on the table of a 5-axis machining center to monitor the cutting forces. In-process control of the surface roughness has been developed and proposed to control the predicted surface roughness. Cutting tests have proved that the proposed integration of in-process monitoring and process control can be used to check the surface roughness during cutting by utilizing the cutting force ratio.
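
The authors' previously developed roughness models are not reproduced here; purely as an illustration of how a model with the cutting force ratio as a predictor could be fitted, the sketch below runs a log-linear regression on synthetic data.

```python
# Generic illustration only -- not the authors' previously developed roughness
# models. A power-law-style regression of surface roughness on cutting speed,
# feed rate and cutting force ratio, fitted in log space on synthetic data.
import numpy as np

rng = np.random.default_rng(8)
n = 60
speed = rng.uniform(100, 300, n)          # m/min
feed = rng.uniform(0.05, 0.3, n)          # mm/rev
force_ratio = rng.uniform(0.4, 1.2, n)    # in-process cutting force ratio
Ra = 2.0 * feed**1.1 * force_ratio**0.5 / speed**0.2 * np.exp(rng.normal(0, 0.05, n))

# log-linear least squares: ln Ra = ln C + a ln v + b ln f + c ln r
A = np.column_stack([np.ones(n), np.log(speed), np.log(feed), np.log(force_ratio)])
coef, *_ = np.linalg.lstsq(A, np.log(Ra), rcond=None)
print("ln C and exponents of speed, feed, force ratio:", coef.round(3))
```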

Keywords: Turning, milling, monitoring, surface roughness, cutting force ratio.

745 Formulation and in vitro Evaluation of Sustained Release Matrix Tablets of Levetiracetam for Better Epileptic Treatment

Authors: Nagasamy Venkatesh Dhandapani

Abstract:

The objective of the present study was to develop sustained release oral matrix tablets of the antiepileptic drug levetiracetam. The sustained release matrix tablets of levetiracetam were prepared by the wet granulation method using the hydrophilic matrix former hydroxypropyl methylcellulose (HPMC) as a release-retarding polymer. Prior to compression, FTIR studies were performed to understand the compatibility between the drug and the excipients. The study revealed that there was no chemical interaction between the drug and the excipients used. The tablets were characterized by physical and chemical parameters, and the results were found to be within acceptable limits. The in vitro release study was carried out in 0.1 N HCl for 2 hours and in phosphate buffer pH 7.4 for the remaining time, up to 12 hours. The effect of polymer concentration was studied. Different dissolution models were applied to the drug release data in order to evaluate the release mechanisms and kinetics. The drug release data fit well to zero order kinetics. The drug release mechanism was found to be a complex mixture of diffusion, swelling and erosion.
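
A minimal sketch of fitting the zero-order model Q(t) = k0·t to cumulative-release data is shown below; the release values are synthetic, not the study's measurements.

```python
# Minimal sketch of fitting the zero-order release model Q(t) = k0 * t to
# cumulative-release data; the values below are synthetic, not the study's.
import numpy as np

t = np.array([1, 2, 4, 6, 8, 10, 12], dtype=float)           # hours
release_pct = np.array([8, 17, 33, 49, 66, 82, 97], dtype=float)

k0, *_ = np.linalg.lstsq(t.reshape(-1, 1), release_pct, rcond=None)
pred = k0[0] * t
ss_res = np.sum((release_pct - pred) ** 2)
ss_tot = np.sum((release_pct - release_pct.mean()) ** 2)
print(f"zero-order rate constant k0 = {k0[0]:.2f} %/h, R^2 = {1 - ss_res / ss_tot:.4f}")
```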

Keywords: Levetiracetam, sustained-release, hydrophilic matrix tablet, HPMC grade K 100 MCR, wet granulation, zero order release kinetics.

744 On Developing an Automatic Speech Recognition System for Standard Arabic Language

Authors: R. Walha, F. Drira, H. El-Abed, A. M. Alimi

Abstract:

Automatic Speech Recognition (ASR) applied to the Arabic language is a challenging task. This is mainly related to the language specificities, which confront researchers with multiple difficulties, such as insufficient linguistic resources and the very limited number of available transcribed Arabic speech corpora. In this paper, we are interested in the development of an HMM-based ASR system for the Standard Arabic (SA) language. Our fundamental research goal is to select the most appropriate acoustic parameters describing each audio frame, acoustic models and speech recognition unit. To achieve this purpose, we analyze the effect of varying the frame windowing (size and period), the number of acoustic parameters resulting from feature extraction methods traditionally used in ASR, the speech recognition unit, the number of Gaussians per HMM state and the number of embedded re-estimations of the Baum-Welch algorithm. To evaluate the proposed ASR system, a multi-speaker SA connected-digits corpus is collected, transcribed and used throughout all experiments. A further evaluation is conducted on a speaker-independent continuous SA speech corpus. The phoneme recognition rate is 94.02%, which is relatively high when compared with another ASR system evaluated on the same corpus.
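
A minimal sketch of the frame-windowing step whose size and period the study varies is given below; the 25 ms / 10 ms values are common ASR defaults, not necessarily the settings retained by the authors.

```python
# Minimal sketch of the frame-windowing step whose size and period the study
# varies before feature extraction; the 25 ms / 10 ms values are common ASR
# defaults, not necessarily those retained by the authors.
import numpy as np

def frame_signal(signal, fs, frame_ms=25.0, shift_ms=10.0):
    """Split a 1-D speech signal into overlapping Hamming-windowed frames."""
    frame_len = int(fs * frame_ms / 1000)
    shift = int(fs * shift_ms / 1000)
    n_frames = 1 + max(0, (len(signal) - frame_len) // shift)
    window = np.hamming(frame_len)
    return np.stack([signal[i * shift: i * shift + frame_len] * window
                     for i in range(n_frames)])

fs = 16000
speech = np.random.default_rng(9).normal(size=fs)   # 1 s of stand-in audio
frames = frame_signal(speech, fs)
print(frames.shape)        # (number of frames, samples per frame)
```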

Keywords: ASR, HMM, acoustical analysis, acoustic modeling, Standard Arabic language

743 CFD Investigation of Turbulent Mixed Convection Heat Transfer in a Closed Lid-Driven Cavity

Authors: A. Khaleel, S. Gao

Abstract:

Both steady and unsteady turbulent mixed convection heat transfer in a 3D lid-driven enclosure, which has a constant heat flux on the middle of the bottom wall and isothermal moving sidewalls, is reported in this paper for a working fluid with Prandtl number Pr = 0.71. The other walls are adiabatic and stationary. The dimensionless parameters used in this research are the Reynolds number, Re = 5000, 10000 and 15000, and the Richardson number, Ri = 1 and 10. The simulations have been performed using different turbulence modelling approaches such as RANS, URANS, and LES. The effects of using different k-ε models, namely the standard, RNG and Realizable k-ε models, are investigated. Interesting behaviours of the thermal and flow fields as the Re or Ri number changes are observed. Isotherm and turbulent kinetic energy distributions and the variation of the local Nusselt number at the hot bottom wall are studied as well. The local Nusselt number is found to increase with increasing Re or Ri number. In addition, the turbulent kinetic energy is discernibly affected by increasing the Re number. Moreover, the LES results show the good ability of this method to predict more detailed flow structures in the cavity.
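
The mixed-convection cases are characterised by the Richardson number Ri = Gr/Re²; the small sketch below simply enumerates the Grashof numbers implied by the chosen Re-Ri combinations.

```python
# Small sketch of how the mixed-convection regime is characterised through the
# Richardson number Ri = Gr / Re^2; the combinations below are those stated in
# the abstract, and the implied Grashof numbers follow directly from them.
def richardson(grashof, reynolds):
    return grashof / reynolds**2

for Re in (5000, 10000, 15000):
    for Ri in (1, 10):
        Gr = Ri * Re**2               # Grashof number implied by the chosen Ri
        print(f"Re = {Re:>5}, Ri = {Ri:>2}  ->  Gr = {Gr:.1e}")
```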

Keywords: Mixed convection, Lid-driven cavity, Turbulent flow, RANS model, URANS model, Large eddy simulation.

742 Effect of Sensory Manipulations on Human Joint Stiffness Strategy and Its Adaptation for Human Dynamic Stability

Authors: Aizreena Azaman, Mai Ishibashi, Masanori Ishizawa, Shin-Ichiroh Yamamoto

Abstract:

Sensory input plays an important role in the human posture control system, initiating strategies to counteract any unbalanced condition and thus prevent falls. In a previous study, joint stiffness was observed to describe certain issues regarding movement performance, but the correlation between balance ability and joint stiffness still remains unknown. In this study, the joint stiffening strategy at the ankle and hip was observed under different sensory manipulations, and its correlation with a conventional clinical test of balance ability (the Functional Reach Test) was investigated. In order to create an unstable condition, two different surface perturbations (tilt up-tilt down (TT) and forward-backward (FB)) at four different frequencies (0.2, 0.4, 0.6 and 0.8 Hz) were introduced. Furthermore, four different sensory manipulation conditions (involving the visual and vestibular systems) were applied, and the subjects were asked to maintain their position as well as possible. The results suggest that joint stiffness was high during difficult balance situations. Subjects with poorer balance generated higher average joint stiffness than those with good balance. In addition, adaptation of the posture control system under repetitive external perturbation was reduced under sensory-limited conditions. Overall, analysis of the joint stiffening response may make it possible to predict unbalanced situations faced by humans.
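
Joint stiffness is commonly estimated as the slope of the joint torque-angle relationship; the sketch below illustrates such an estimate on synthetic sway data and is not necessarily the exact procedure used in the study.

```python
# Minimal sketch of a common joint-stiffness estimate: the slope of the joint
# torque-angle relationship obtained by least squares. Illustrative only and
# not necessarily the exact estimation procedure used in the study.
import numpy as np

rng = np.random.default_rng(10)
angle_rad = np.deg2rad(rng.uniform(-3, 3, 400))                   # ankle sway angle
torque_Nm = 350.0 * angle_rad + rng.normal(scale=2.0, size=400)   # synthetic joint torque

slope, intercept = np.polyfit(angle_rad, torque_Nm, 1)
print(f"estimated joint stiffness: {slope:.1f} N*m/rad")
```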

Keywords: Balance ability, joint stiffness, sensory, adaptation, dynamic.

741 Effect of Modified Atmosphere Packaging and Storage Temperatures on Quality of Shelled Raw Walnuts

Authors: M. Javanmard

Abstract:

This study was aimed at analyzing the effects of modified atmosphere packaging (MAP) and storage conditions on the quality of packaged fresh walnut kernels. A central composite design was used to evaluate the effect of oxygen (0-10%), carbon dioxide (0-10%), and temperature (4-26 °C) on the qualitative characteristics of walnut kernels. Response surface methodology was used to find the optimal conditions for the interactive effects of the factors, as well as to estimate the best process conditions using the least amount of testing. The measured quality parameters were: peroxide index, color, weight loss, mould and yeast counts, and sensory evaluation. The results showed that the defined models for peroxide index, color, weight loss, and sensory evaluation are significant (p < 0.001): an increase in temperature causes the peroxide value, color variation, and weight loss to increase and reduces the overall acceptability of the walnut kernels. An increase in oxygen percentage caused the color variation and peroxide value to increase and resulted in lower overall acceptability of the walnuts. An increase in CO2 percentage caused the peroxide value to decrease, but did not significantly affect the other indices (p ≥ 0.05). Mould and yeast were not found in any samples. The optimal packaging conditions to achieve maximum quality of walnuts were: 1.46% oxygen, 10% carbon dioxide, and a temperature of 4 °C.
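
The study's fitted response models are not reproduced here; as an illustration of the response-surface step used with central composite designs, the sketch below fits a second-order model for one response on synthetic data.

```python
# Minimal sketch of fitting a second-order response-surface model for one
# response (e.g., peroxide value) against O2, CO2 and temperature, as is done
# with central composite designs. The data and coefficients here are synthetic.
import numpy as np

rng = np.random.default_rng(11)
o2 = rng.uniform(0, 10, 40)          # %
co2 = rng.uniform(0, 10, 40)         # %
temp = rng.uniform(4, 26, 40)        # degrees C
peroxide = (1 + 0.3 * o2 - 0.2 * co2 + 0.15 * temp + 0.01 * temp**2
            + rng.normal(scale=0.2, size=40))

# design matrix: intercept, linear, squared and two-factor interaction terms
X = np.column_stack([np.ones(40), o2, co2, temp, o2**2, co2**2, temp**2,
                     o2 * co2, o2 * temp, co2 * temp])
beta, *_ = np.linalg.lstsq(X, peroxide, rcond=None)
print("fitted response-surface coefficients:", beta.round(3))
```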

Keywords: Shelled walnut, MAP, quality, storage temperature.

740 Analysis of GI/M(n)/1/N Queue with Single Working Vacation and Vacation Interruption

Authors: P. Vijaya Laxmi, V. Goswami, V. Suchitra

Abstract:

This paper presents a finite-buffer renewal-input single working vacation and vacation interruption queue with state-dependent services and state-dependent vacations, which has a wide range of applications in several areas, including manufacturing and wireless communication systems. Service times during the busy period and the vacation period, as well as the vacation times, are exponentially distributed and state dependent. As a result of the finite waiting space, state-dependent services and state-dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method, using the supplementary variable technique, to compute the stationary queue length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm for the model is presented, which is fast, accurate and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.
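
As a far simpler illustration of recursive computation of queue-length probabilities in a finite buffer (the paper's GI/M(n)/1/N working-vacation model requires the supplementary variable technique), the sketch below solves a plain M/M/1/N queue.

```python
# Far simpler illustration than the GI/M(n)/1/N working-vacation model of the
# paper: recursive computation of the stationary queue-length distribution and
# blocking probability for a plain M/M/1/N birth-death queue.
def mm1n_stationary(lam, mu, N):
    """Return [p0, ..., pN] for an M/M/1/N queue via the detailed-balance recursion."""
    unnorm = [1.0]
    for n in range(1, N + 1):
        unnorm.append(unnorm[-1] * lam / mu)      # p_n = (lam/mu) * p_{n-1}
    total = sum(unnorm)
    return [p / total for p in unnorm]

p = mm1n_stationary(lam=0.8, mu=1.0, N=10)
blocking_probability = p[-1]                       # arriving customer finds the buffer full
mean_queue_length = sum(n * pn for n, pn in enumerate(p))
print(f"P(block) = {blocking_probability:.4f}, E[L] = {mean_queue_length:.3f}")
```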

Keywords: State Dependent Service, Vacation Interruption, Supplementary Variable, Single Working Vacation, Blocking Probability.

739 Novel NMR-Technology to Assess Food Quality and Safety

Authors: Markus Link, Manfred Spraul, Hartmut Schaefer, Fang Fang, Birk Schuetz

Abstract:

High Resolution NMR Spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis.

The objective is to demonstrate that, due to its extreme reproducibility, NMR can detect the smallest changes in the concentrations of many components in a mixture; this is best monitored by statistical evaluation, while the method also delivers reliable quantification results.

The methodology typically uses a 400 MHz high resolution instrument under full automation after minimized sample preparation.

For example, one fruit juice analysis in push-button operation takes at most 15 minutes and delivers a multitude of results, which are automatically summarized in a PDF report.

The method has been proven on fruit juices, where previously unknown frauds could be detected. In addition, conventional targeted parameters are obtained in the same analysis. This technology has the advantage that NMR is completely quantitative and concentration calibration only has to be done once for all compounds. Since NMR is so reproducible, it is also transferable between different instruments (with the same field strength) and laboratories. Based on strict SOPs, statistical models developed once can be used on multiple instruments, and strategies for compound identification and quantification are applicable across laboratories as well.

Keywords: Automated solution, NMR, non-targeted screening, targeted screening.

738 Finite Element Analysis of Different Architectures for Bone Scaffold

Authors: Nimisha R. Shirbhate, Sanjay Bokade

Abstract:

Bone scaffolds are fundamental architectures or support structures that allow the regeneration of lost or damaged tissues, and they have been developed as a crucial tool in biomedical engineering. The structure of bone scaffolds plays an important role in treating bone defects. The shape of the bone scaffold performs a vital role, specifically the pore size and shape, which help in understanding the behavior and strength of the scaffold. In this article, first, the fundamental aspects of bone scaffold design are established. Second, the behavior of each bone scaffold architecture with biomaterials is discussed. Finally, for each structure, a stress analysis was carried out. This study aimed to design a porous and mechanically strong bone regeneration scaffold that can be successfully manufactured. Four porous architectures of the bone scaffold were designed using Rhinoceros solid modelling software. Each structural model consisted of repeatable unit cells arranged in layers to fill the chosen scaffold volume. The mechanical behavior of the biocompatible material used was studied with the help of ANSYS 19.2 software, which also plays a significant role in predicting the strength of the defined structures or three-dimensional models.

Keywords: Bone scaffold, stress analysis, porous structure, static loading.

737 The Ability of Forecasting the Term Structure of Interest Rates Based On Nelson-Siegel and Svensson Model

Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović

Abstract:

Due to the importance of the yield curve and its estimation, it is essential to have valid methods for yield curve forecasting in cases where there are scarce issues of securities and/or weak trading on a secondary market. Therefore, in this paper, after the estimation of weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, the yield curves are forecasted using a vector autoregressive model and neural networks. In general, it can be concluded that both forecasting methods have good prediction abilities, where forecasting of yield curves based on the Nelson-Siegel estimation model gives better results, in the sense of lower mean squared error, than forecasting based on the Svensson model. Also, in this case neural networks provide slightly better results. Finally, it can be concluded that the most appropriate way of yield curve prediction is neural networks using the Nelson-Siegel estimation of yield curves.
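
A small sketch of the Nelson-Siegel yield-curve form used for the weekly estimation is given below; the parameter values are hypothetical, not estimates for the Croatian market.

```python
# Small sketch of the Nelson-Siegel yield-curve form used for the weekly curve
# estimation; the parameter values below are hypothetical, not estimates for
# the Croatian market.
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Yield for maturity tau (years) under the Nelson-Siegel parameterisation."""
    x = tau / lam
    slope = (1 - np.exp(-x)) / x
    curvature = slope - np.exp(-x)
    return beta0 + beta1 * slope + beta2 * curvature

maturities = np.array([0.25, 0.5, 1, 2, 5, 10], dtype=float)
print(nelson_siegel(maturities, beta0=0.06, beta1=-0.02, beta2=0.01, lam=1.5).round(4))
```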

Keywords: Nelson-Siegel model, Neural networks, Svensson model, Vector autoregressive model, Yield curve.

736 Effects of Inlet Distorted Flows on the Performance of an Axial Compressor

Authors: Asad Islam, Khalid Parvez

Abstract:

Compressor fans in modern aircraft engines are of considerable importance, as they provide the majority of the thrust required by the aircraft. Their challenging environment is frequently subjected to non-uniform inflow conditions. These conditions could be due to flight operating requirements such as take-off and landing, wake interference from the aircraft fuselage, or cross-flow wind conditions, so the highly maneuverable flight regimes of fighter aircraft affect the overall performance of the engine. Since the flow in the compressor of an aircraft engine is highly sensitive to the adverse pressure gradient arising from the different flow orientations of the aircraft, it is prone to unstable operation. This paper presents a study that focuses on the response of an axial compressor to inlet flow orientations in the range of 0 to 15 degrees. For this purpose, NASA Rotor-37 was taken and a CFD mesh was developed. The compressor characteristic map was generated for the design conditions of a pressure ratio of 2.106 with the rotor operating at a rotational velocity of 17188.7 rpm, using the CFD simulation environment ANSYS-CFX®. A grid study was carried out to see the effects of the mesh on the computational solution. The mesh giving the best results, when validated against the available NASA experimental results, was then used for further distortion analysis. The flow in the inlet nozzle was given angle orientations ranging from 0 to 15 degrees. The CFD results are analyzed and discussed with respect to stall margin and flow separations due to the induced distortions.

Keywords: Angle, ANSYS-CFX®, axial compressor, Bladegen®, CFD, distortions.

735 Dynamic Model Conception of Improving Services Quality in Railway Transport

Authors: Eva Nedeliakova, Jaroslav Masek, Juraj Camaj

Abstract:

This article describes the results of research focused on the quality of railway freight transport services. Improvement of these services is of crucial importance for customers considering future use of railway transport. Processes fulfilling customer demands and output quality assessment were defined as part of the research. This contribution introduces the map of quality planning and the algorithm of the applied methodology. It characterizes a model which takes into account the character of transportation by linking the perception of service quality in ordinary and extraordinary operation. Despite the fact that rail freight transport has a solid position in the transport market, many carriers worldwide have been experiencing stagnation for several years. Therefore, the specific results of the research have significant importance and belong to the numerous initiatives aimed at developing and supporting railway transport, not only by creating a single railway area or reducing noise but also by promoting railway services. This contribution also focuses on the application of dynamic quality models, which represent an innovative method of evaluating service quality. Through this conception, the time factor and the expected and perceived quality at each moment of the transportation process can be taken into account.

Keywords: Quality, railway, transport, service.

734 Fung’s Model Constants for Intracranial Blood Vessel of Human Using Biaxial Tensile Test Results

Authors: Mohammad Shafigh, Nasser Fatouraee, Amirsaied Seddighi

Abstract:

Mechanical properties of cerebral arteries are, due to their relationship with cerebrovascular diseases, of clinical worth. To acquire these properties, eight samples were obtained from the middle cerebral arteries of human cadavers whose deaths were not due to injuries or diseases of the cerebral vessels, and tested within twelve hours after resection by a precise biaxial tensile test device specially developed for the present study, considering the dimensions, sensitivity and anisotropic nature of the samples. The resulting stress-stretch curves were plotted and subsequently fitted to a hyperelastic three-parameter Fung model. It was found that the arteries were noticeably stiffer in the circumferential than in the axial direction. It was also demonstrated that the use of multi-parameter hyperelastic constitutive models is useful for the mathematical description of the behavior of cerebral vessel tissue. The reported material properties are a proper reference for numerical modeling of cerebral arteries and computational analysis of healthy or diseased intracranial arteries.
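
A sketch of a Fung-type exponential strain-energy function under biaxial loading and the resulting second Piola-Kirchhoff stresses follows; the constants are hypothetical, and the paper's exact three-parameter form and fitted values are not reproduced.

```python
# Sketch of a Fung-type exponential strain-energy function for biaxial loading
# of arterial tissue and the resulting second Piola-Kirchhoff stresses. The
# constants are hypothetical; the paper's exact three-parameter form and fitted
# values are not reproduced here.
import numpy as np

def fung_stresses(stretch_theta, stretch_z, c, a1, a2, a3):
    """S_thth, S_zz from W = (c/2)(exp(Q)-1), Q = a1*Et^2 + a2*Ez^2 + 2*a3*Et*Ez."""
    Et = 0.5 * (stretch_theta**2 - 1.0)        # Green strains from the applied stretches
    Ez = 0.5 * (stretch_z**2 - 1.0)
    Q = a1 * Et**2 + a2 * Ez**2 + 2.0 * a3 * Et * Ez
    common = c * np.exp(Q)
    return common * (a1 * Et + a3 * Ez), common * (a2 * Ez + a3 * Et)

# a1 > a2 reflects the reported circumferential-stiffer-than-axial behaviour
S_thth, S_zz = fung_stresses(1.3, 1.2, c=20.0, a1=1.5, a2=0.8, a3=0.3)
print(f"S_thetatheta = {S_thth:.2f}, S_zz = {S_zz:.2f} (same units as c)")
```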

Keywords: Anisotropic Tissue, Cerebral Blood Vessels, Fung Model, Nonlinear Material, Plane Stress.

733 A Comprehensive Review on Different Mixed Data Clustering Ensemble Methods

Authors: S. Sarumathi, N. Shanthi, S. Vidhya, M. Sharmila

Abstract:

An extensive amount of work has been done in data clustering research under the unsupervised learning technique in data mining during the past two decades. Moreover, several approaches and methods have emerged focusing on clustering diverse data types, features of cluster models and similarity rates of clusters. However, no single clustering algorithm exhibits the best performance in extracting efficient clusters. Consequently, in order to rectify this issue, a new challenging technique called the cluster ensemble method emerged. This new approach tends to be the alternative method for the cluster analysis problem. The main objective of the cluster ensemble is to aggregate the diverse clustering solutions in such a way as to attain accuracy and also to improve on the quality of the individual clustering algorithms. Due to the massive and rapid development of new methods in the field of data mining, it is highly necessary to carry out a careful analysis of existing techniques and future novelties. This paper shows a comparative analysis of different cluster ensemble methods along with their methodologies and salient features. Henceforth, this unambiguous analysis will be very useful for the community of clustering experts and also helps in deciding the most appropriate method to resolve the problem at hand.
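
A minimal sketch of the co-association (evidence accumulation) idea mentioned in the keywords, building the matrix from a few synthetic base clusterings and extracting a consensus partition, is shown below.

```python
# Minimal sketch of the co-association (evidence accumulation) idea: count how
# often each pair of points is grouped together across base clusterings, then
# extract a consensus partition from the resulting matrix. Labelings synthetic.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

# three hypothetical base clusterings of six points
base_labelings = np.array([
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
])
n = base_labelings.shape[1]
coassoc = np.zeros((n, n))
for labels in base_labelings:
    coassoc += (labels[:, None] == labels[None, :]).astype(float)
coassoc /= len(base_labelings)                     # fraction of co-memberships

distance = 1.0 - coassoc                           # input to the consensus function
np.fill_diagonal(distance, 0.0)
Z = linkage(squareform(distance), method="average")
consensus = fcluster(Z, t=2, criterion="maxclust") # final ensemble partition
print("co-association matrix:\n", coassoc.round(2))
print("consensus labels:", consensus)
```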

Keywords: Clustering, Cluster Ensemble Methods, Coassociation matrix, Consensus Function, Median Partition.

732 Statistical Analysis and Predictive Learning of Mechanical Parameters for TiO2 Filled GFRP Composite

Authors: S. Srinivasa Moorthy, K. Manonmani

Abstract:

New polymer composites consisting of E-glass fiber reinforcement with a titanium oxide filler in an unsaturated (double-bond-containing) polyester resin matrix were made. The glass fiber and titanium oxide reinforced composites were made with three different fiber lengths (3 cm, 5 cm, and 7 cm), filler contents (2 wt%, 4 wt%, and 6 wt%) and fiber contents (20 wt%, 40 wt%, and 60 wt%). 27 different compositions were fabricated, and a sequence of experiments was carried out to determine the tensile strength and impact strength. The vital influencing factors, fiber length, fiber content and filler content, were chosen as 3 factors at 3 levels of Taguchi's L9 orthogonal array. The influences of the parameters on tensile strength and impact strength were determined by analysis of variance (ANOVA) and the S/N ratio. Using an artificial neural network (ANN), an expert system was devised to predict the properties of the hybrid reinforced GFRP composites. The prediction models were experimentally validated and showed close agreement.
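
A small sketch of the larger-the-better Taguchi S/N ratio, S/N = −10·log10(mean(1/y²)), used to rank factor levels for the strength responses follows; the trial values are hypothetical.

```python
# Small sketch of the larger-the-better Taguchi S/N ratio used to rank factor
# levels for strength responses: S/N = -10 * log10( mean(1 / y_i^2) ).
# The trial results below are hypothetical, not the measured values.
import numpy as np

def sn_larger_the_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

tensile_trials_MPa = {
    "L9 run 1": [62.0, 64.5, 63.1],
    "L9 run 2": [71.2, 70.4, 72.0],
    "L9 run 3": [55.8, 57.0, 56.2],
}
for run, values in tensile_trials_MPa.items():
    print(f"{run}: S/N = {sn_larger_the_better(values):.2f} dB")
```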

Keywords: Analysis of variance (ANOVA), Artificial neural network (ANN), Polymer composites, Taguchi’s orthogonal array.

731 Kinematic Hardening Parameters Identification with Respect to Objective Function

Authors: Marina Franulovic, Robert Basan, Bozidar Krizan

Abstract:

Constitutive modeling of material behavior is becoming increasingly important in the prediction of possible failures in highly loaded engineering components and, consequently, in the optimization of their design. In order to account for the large number of phenomena that occur in a material during operation, such as the kinematic hardening effect in the low cycle fatigue behavior of steels, complex nonlinear material models are used ever more frequently, despite the complexity of determining their parameters. As a method for determining these parameters, the genetic algorithm is a good choice because of its capability to provide a very good approximation of the solution in systems with a large number of unknown variables. For the application of the genetic algorithm to parameter identification, the inverse analysis must first be defined. It is used as a tool to fine-tune the calculated stress-strain values against the experimental ones. In order to choose a proper objective function for the inverse analysis from among already existing and newly developed functions, research is performed to investigate their influence on material behavior modeling.
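
One simple candidate objective function (a normalised least-squares error between calculated and experimental stress values) that a genetic algorithm could minimise is sketched below; the study compares several existing and newly developed functions, and this is only an illustrative example.

```python
# One simple candidate objective function for the inverse analysis: normalised
# least-squares error between calculated and experimental stress-strain points,
# which a genetic algorithm would minimise. The study compares several existing
# and newly developed objective functions; this is only an illustrative one.
import numpy as np

def objective(stress_calc, stress_exp):
    """Normalised sum of squared deviations over the compared stress points."""
    stress_calc = np.asarray(stress_calc, dtype=float)
    stress_exp = np.asarray(stress_exp, dtype=float)
    return np.sum(((stress_calc - stress_exp) / np.max(np.abs(stress_exp))) ** 2)

# hypothetical stresses (MPa) at common strain points of one stabilised cycle
experimental = [320.0, 410.0, 455.0, 470.0]
calculated = [310.0, 420.0, 450.0, 480.0]
print(f"objective value: {objective(calculated, experimental):.5f}")
```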

Keywords: Genetic algorithm, kinematic hardening, material model, objective function
