Search results for: 1.5 bits/stage
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1103


803 Achieving Success in NPD Projects

Authors: Ankush Agrawal, Nadia Bhuiyan

Abstract:

The new product development (NPD) literature emphasizes the importance of introducing new products to the market for continuing business success. New products are responsible for employment, economic growth, technological progress, and high standards of living. Therefore, the study of NPD and the processes through which new products emerge is important. While many studies have been conducted on critical success factors for NPD, these studies tend to be fragmented and to focus on one or a few phases of the NPD process. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing metrics for each stage of the NPD process. An extensive literature review was undertaken to investigate decades of studies on NPD success and how it can be achieved. These studies were scanned for factors common to firms whose new products succeeded on the market. The paper summarizes NPD success factors, suggests metrics that should be used to measure these factors, and proposes tools and techniques to make use of these metrics. This was done for each stage of the NPD process and brought together in a framework that the authors propose should be followed for complex NPD projects.

Keywords: New product development, performance, critical success factors, framework.

802 Video-Based Face Recognition Based On State-Space Model

Authors: Cheng-Chieh Chiang, Yi-Chia Chan, Greg C. Lee

Abstract:

This paper proposes a video-based framework for face recognition that identifies which faces appear in a video sequence. Our basic idea is similar to a tracking task: to track a selection of person candidates over time according to the observed visual features of face images in video frames. Hence, we employ the state-space model to formulate video-based face recognition by dividing the problem into two parts: the likelihood and the transition measures. The likelihood measure recognizes whose face is currently being observed in a video frame, for which two-dimensional linear discriminant analysis (2DLDA) is employed. The transition measure estimates the probability of changing from an incorrect recognition at the previous stage to the correct person at the current stage. Moreover, extra nodes associated with head nodes are incorporated into our proposed state-space model. Experimental results are provided to demonstrate the robustness and efficiency of the proposed approach.

Keywords: 2DLDA, face recognition, state-space model, likelihood measure, transition measure.

801 Quad Tree Decomposition Based Analysis of Compressed Image Data Communication for Lossy and Lossless Using WSN

Authors: N. Muthukumaran, R. Ravi

Abstract:

A Quad Tree Decomposition (QTD) based performance analysis of compressed image data communication, for both lossy and lossless cases, through a wireless sensor network is presented. Images have a considerably higher storage requirement than text. While transmitting multimedia content, there is a chance of packets being dropped due to noise and interference. At the receiver end, the packets that carry valuable information might be damaged or lost due to noise, interference, and congestion. In order to prevent valuable information from being lost, various retransmission schemes have been proposed. In the proposed scheme, QTD is used. QTD is an image segmentation method that divides the image into homogeneous areas. The proposed scheme involves analysis of parameters such as compression ratio, peak signal-to-noise ratio, mean square error, and bits per pixel in the compressed image, as well as analysis of the difficulties encountered during data packet communication in wireless sensor networks. Considering the above, this paper uses QTD to improve the compression ratio as well as the visual quality, and implements the algorithm in MATLAB 7.1 and the NS2 simulator.
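As a rough illustration of the quad tree decomposition step described in this abstract, the following Python sketch recursively splits a square grayscale image into homogeneous blocks. The homogeneity test (intensity range against a fixed threshold) and all parameter values are assumptions for illustration, not the authors' MATLAB/NS2 implementation.

```python
import numpy as np

def qtdecomp(image, threshold=20, min_size=2):
    """Return a list of homogeneous blocks as (row, col, size) tuples.

    A block is split into four quadrants whenever the spread of its
    intensities (max - min) exceeds `threshold`, down to `min_size`.
    Assumes a square image whose side length is a power of two.
    """
    blocks = []

    def split(r, c, size):
        block = image[r:r + size, c:c + size]
        if size <= min_size or block.max() - block.min() <= threshold:
            blocks.append((r, c, size))
            return
        half = size // 2
        for dr in (0, half):
            for dc in (0, half):
                split(r + dr, c + dc, half)

    split(0, 0, image.shape[0])
    return blocks

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    img[:32, :32] = 128  # one perfectly homogeneous quadrant
    leaves = qtdecomp(img, threshold=20)
    print(len(leaves), "homogeneous blocks")
```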

Keywords: Image compression, Compression Ratio, Quad tree decomposition, Wireless sensor networks, NS2 simulator.

800 The Influence of Mobile Phone's Forms in the User Perception

Authors: The Jaya Suteja, Stephany Tedjohartoko

Abstract:

Not all types of mobile phone are successful in entering the market, because some types are perceived negatively by users. Therefore, it is important to understand the influence of a mobile phone's characteristics on local user perception. This research investigates the influence of QWERTY mobile phone forms on the perception of Indonesian users. First, some alternative mobile phone forms are developed based on a certain number of mobile phone models. At the second stage, some word pairs are chosen as design attributes of the mobile phone to represent user perception of the mobile phone. At the final stage, a survey is conducted to investigate the influence of the developed form alternatives on user perception. Based on the research, users perceive the form with a curved top and straight bottom and the form with a slider and antenna as the most negative forms. Meanwhile, the form with curved top and bottom and the form without a slider and antenna are perceived by users as the most positive forms.

Keywords: Influence, mobile phone, form, user perception.

799 Unconstrained Arabic Online Handwritten Words Segmentation using New HMM State Design

Authors: Randa Ibrahim Elanwar, Mohsen Rashwan, Samia Mashali

Abstract:

In this paper, we propose a segmentation system for unconstrained Arabic online handwriting, an essential problem addressed by analytical word recognition systems. The system is composed of two stages: the first is a newly designed hidden Markov model (HMM), and the second is a rule-based stage. In our system, handwritten words are broken up into characters by simultaneous segmentation-recognition using HMMs of a unique design, trained using online features, most of which are novel. The character boundaries output by the HMM represent the proposed segmentation points (PSP), which are then validated by the rule-based post-processing stage, without the help of any contextual information, to resolve different segmentation errors. The HMM has been designed and tested using a self-collected dataset (OHASD) [1]. Most error cases are cured, and remarkable segmentation enhancement is achieved. Very promising word and character segmentation rates are obtained, considering the difficulty of unconstrained Arabic handwriting and the absence of contextual help.
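The simultaneous segmentation-recognition described above relies on decoding the most likely state path through an HMM. The hedged Python sketch below shows a generic Viterbi decoder on a toy discrete HMM, where changes of the decoded state would be read off as proposed segmentation points (PSP); the model sizes and probabilities are placeholders and do not reflect the authors' specially designed HMM or their online features.

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    """Most likely hidden-state path for a discrete HMM (log domain).

    obs   : sequence of observation indices
    start : (S,) initial state probabilities
    trans : (S, S) state transition probabilities
    emit  : (S, V) emission probabilities
    """
    log_start, log_trans, log_emit = (np.log(x + 1e-12) for x in (start, trans, emit))
    delta = log_start + log_emit[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + log_trans          # (prev state, next state)
        back.append(np.argmax(scores, axis=0))
        delta = np.max(scores, axis=0) + log_emit[:, o]
    path = [int(np.argmax(delta))]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    return path[::-1]

if __name__ == "__main__":
    # Two "character" states, three observation symbols; state changes in the
    # decoded path would be interpreted as candidate segmentation points.
    start = np.array([0.6, 0.4])
    trans = np.array([[0.7, 0.3], [0.4, 0.6]])
    emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
    print(viterbi([0, 1, 2, 2, 0], start, trans, emit))
```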

Keywords: Arabic, Hidden Markov Models, online handwriting, word segmentation

798 Primary School Teachers’ Conceptual and Procedural Knowledge of Rational Number and Its Effects on Pupils’ Achievement in Rational Numbers

Authors: R. M. Kashim

Abstract:

The study investigated primary school teachers’ conceptual and procedural knowledge of rational numbers and its effects on pupils’ achievement in rational numbers. Specifically, primary school teachers’ level of conceptual knowledge about rational numbers, primary school teachers’ level of procedural knowledge about rational numbers, and the effects of teachers’ conceptual and procedural knowledge on their pupils’ understanding of rational numbers in primary schools were investigated. The study was carried out in the Bauchi metropolis in Bauchi State, Nigeria. The design of the study was a multi-stage design: the first stage was a descriptive design, and the second stage involved a pre-test, post-test only quasi-experimental design. Two instruments were used for data collection: a Conceptual and Procedural Knowledge Test (CPKT) and a Rational Number Achievement Test (RAT). The population of the study comprised three mathematics teachers holding the Nigerian Certificate in Education (NCE) and teaching primary six, together with the 210 pupils in their intact classes. The data collected were analyzed using means, standard deviations, analysis of variance, analysis of covariance, and t-tests. The findings indicated that pupils taught rational numbers by a teacher with high conceptual and procedural knowledge understand and perform better than pupils taught by a teacher with low conceptual and procedural knowledge of rational numbers. It is, therefore, recommended that teachers in primary schools be encouraged to enrich their conceptual knowledge of rational numbers. Also, teachers’ superior performance in procedural knowledge of rational numbers should not become an obstruction to understanding. Teachers’ conceptual and procedural knowledge of rational numbers should be balanced so that primary school pupils receive better teaching and learning of rational numbers in our contemporary schools.

Keywords: Achievement, conceptual knowledge, procedural knowledge, rational numbers.

797 The Mechanism Underlying Empathy-Related Helping Behavior: An Investigation of Empathy-Attitude- Action Model

Authors: Wan-Ting Liao, Angela K. Tzeng

Abstract:

Empathy has been an important issue in psychology and education, as well as in cognitive neuroscience. Empathy has two major components: cognitive and emotional. The cognitive component refers to the ability to understand others’ perspectives, thoughts, and actions, whereas the emotional component refers to understanding how others feel. Empathy can be induced, attitude can then be changed, and with enough attitude change, helping behavior can occur. This finding leads us to two questions: is attitude change really necessary for prosocial behavior, and what roles do cognitive and affective empathy play? For the second question, participants with different psychopathic personality (PP) traits are critical, because people high in PP were found to suffer only an affective empathy deficit; their cognitive empathy shows no significant difference from the control group. In total, 132 college students voluntarily participated in the current three-stage study. Stage 1 was to collect basic information, including the Interpersonal Reactivity Index (IRI), the Psychopathic Personality Inventory-Revised (PPI-R), an Attitude Scale, a Visual Analogue Scale (VAS), and demographic data. Stage 2 was for empathy induction with three controversial scenarios, namely domestic violence, depression with a suicide attempt, and an ex-offender. Participants read all three stories and then rewrote the stories from one of two perspectives (empathetic vs. objective). They then completed the VAS and Attitude Scale one more time for their post-induction attitude and emotional status. Three IVs were introduced for data analysis: PP (high vs. low), responsibility (whether or not the character is responsible for what happened), and perspective-taking (empathic vs. objective). Stage 3 was for action: participants were instructed to freely use the 17 tokens they received as donations. They were debriefed and interviewed at the end of the experiment. The major findings were that people with higher empathy tend to take more action in helping, and that attitude change is not necessary for prosocial behavior. The controversy of the scenarios and how familiar participants are with the target groups play very important roles. Finally, people high in PP tend to show more public prosocial behavior due to their affective empathy deficit. Pre-existing values and beliefs, as well as recent dramatic social events, seem to have a big impact and may reduce the effect of the independent variables (IVs) in our paradigm.

Keywords: Affective empathy, attitude, cognitive empathy, prosocial behavior, psychopathic traits.

786 Trace Emergence of Ants' Traffic Flow, Based upon Exclusion Process

Authors: Ali Lemouari, Mohamed Benmohamed

Abstract:

Biological evolution has generated a rich variety of successful solutions, and optimized strategies can be drawn from nature. One interesting example is ant colonies, which are able to exhibit collective intelligence even though their dynamics are simple. The emergence of different patterns depends on the pheromone trail left by the foragers, which serves as a positive feedback mechanism for sharing information. In this paper, we use the dynamics of the totally asymmetric simple exclusion process (TASEP) as a model of interaction at a low level of the collective environment in the ants' traffic flow. This work consists of modifying the movement rules of the particles ("ants") belonging to the TASEP model so that they follow the natural movement of ants. Therefore, to respect the constraint of having no more than one particle per site, and in order to avoid collisions within a bidirectional circulation, we suggest two strategies: a decease strategy and a waiting strategy. The third stage of the work is devoted to studying the stability of these two proposed strategies. As a final stage, we apply the first strategy to the whole environment, in order to reach the emergence of traffic flow, which is a way of learning.
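For readers unfamiliar with the underlying model, the following Python sketch simulates a plain totally asymmetric simple exclusion process (TASEP) on a ring with at most one particle per site. The bidirectional circulation and the decease/waiting strategies proposed in the paper are not reproduced, and all parameters are illustrative.

```python
import numpy as np

def tasep_sweep(lattice, hop_prob, rng):
    """One random-sequential sweep of a TASEP on a ring.

    lattice  : 1-D array of 0/1, where 1 means the site holds an "ant";
               the exclusion rule allows at most one particle per site.
    hop_prob : probability that a selected particle hops one site to the right.
    Returns the number of hops across the ring boundary (a current measure).
    """
    n = lattice.size
    boundary_hops = 0
    for _ in range(n):
        i = int(rng.integers(n))
        j = (i + 1) % n
        # A particle moves right only if the target site is empty.
        if lattice[i] == 1 and lattice[j] == 0 and rng.random() < hop_prob:
            lattice[i], lattice[j] = 0, 1
            if j == 0:
                boundary_hops += 1
    return boundary_hops

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    lattice = (rng.random(100) < 0.3).astype(int)      # ~30% density of ants
    current = sum(tasep_sweep(lattice, hop_prob=0.8, rng=rng) for _ in range(2000))
    print("average boundary current per sweep:", current / 2000)
```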

Keywords: Ants system, emergence, exclusion process, pheromone.

795 An Efficient Architecture for Interleaved Modular Multiplication

Authors: Ahmad M. Abdel Fattah, Ayman M. Bahaa El-Din, Hossam M.A. Fahmy

Abstract:

Modular multiplication is the basic operation in most public key cryptosystems, such as RSA, DSA, ECC, and DH key exchange. Unfortunately, very large operands (on the order of 1024 or 2048 bits) must be used to provide sufficient security strength. The use of such big numbers dramatically slows down the whole cipher system, especially when running on embedded processors. So far, customized hardware accelerators - developed on FPGAs or ASICs - have been the best choice for accelerating modular multiplication in embedded environments. On the other hand, many algorithms have been developed to speed up such operations; examples are the Montgomery modular multiplication and the interleaved modular multiplication algorithms. Combining customized hardware with an efficient algorithm is expected to provide a much faster cipher system. This paper introduces an enhanced architecture for computing the modular multiplication of two large numbers X and Y modulo a given modulus M. The proposed design is compared with three previous architectures that depend on carry-save adders and look-up tables. Look-up tables must be loaded with a set of pre-computed values. Our proposed architecture uses the same carry-save addition, but replaces both the look-up tables and pre-computations with an enhanced version of sign detection techniques. The proposed architecture supports higher frequencies than the other architectures and also has a better overall absolute time for a single operation.
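For reference, the interleaved modular multiplication algorithm mentioned above can be expressed in a few lines. The Python sketch below shows the plain shift-add-reduce loop, with reduction by subtraction rather than division; it is only the underlying algorithm, not the carry-save-adder and sign-detection hardware architecture proposed in the paper.

```python
def interleaved_modmul(x, y, m):
    """Interleaved modular multiplication: compute (x * y) mod m.

    Scans the bits of x from most to least significant; after each
    shift-and-add, the partial product is reduced by at most two
    subtractions, so it stays below m at the start of every iteration.
    Assumes 0 <= x, y < m.
    """
    p = 0
    for i in reversed(range(x.bit_length())):
        p <<= 1                      # shift the partial product
        if (x >> i) & 1:
            p += y                   # conditionally add the multiplicand
        if p >= m:                   # reduce by subtraction, never division
            p -= m
        if p >= m:
            p -= m
    return p

if __name__ == "__main__":
    import random
    m = (1 << 1024) - 159            # an arbitrary odd 1024-bit modulus
    x, y = random.randrange(m), random.randrange(m)
    assert interleaved_modmul(x, y, m) == (x * y) % m
    print("OK")
```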

Keywords: Montgomery multiplication, modular multiplication, efficient architecture, FPGA, RSA

794 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. This is particularly evident in highway and railroad tunnel projects, in which a number of tunnels and different professional teams are involved. In this regard, comprehensive software needs to be designed, using accepted methods, to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. Addressing this necessity, an applied software tool has been designed using macro capabilities and Visual Basic for Applications (VBA) in Microsoft Excel. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, the penetration rate, and so forth, can be calculated and reported in a standard format.

Keywords: Engineering geology, rock mass classification, rock mechanic, tunnel.

793 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification

Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian

Abstract:

Image retrieval is one of the most interesting techniques in use in today's digital world. CBIR, commonly expanded as Content Based Image Retrieval, is an image processing technique that identifies relevant images and retrieves them based on patterns extracted from digital images. In this paper, two research works using CBIR are presented. The first work provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by training and testing phases applied to a classifier in order to perform prediction. For feature extraction, image transforms such as the Contourlet, Ridgelet, and Shearlet can be utilized to retrieve texture features from the images. The features extracted are used to train and build a classifier using classification algorithms such as Naïve Bayes, K-Nearest Neighbour, and multi-class Support Vector Machine. The testing phase then performs prediction, classifying a new input image with the trained classifier and labelling it as one of four classes, namely 1) normal brain, 2) benign tumour, 3) malignant tumour, and 4) severe tumour. The second research work develops a tool for tumour stage identification using the best feature extraction method and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage of tumour identified by the system. This paper presents these two approaches as a contribution to the medical field, giving better retrieval performance and tumour stage identification.
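A minimal sketch of the supervised CBIR pipeline described above (feature extraction, training, prediction) is given below in Python with scikit-learn. It substitutes crude intensity statistics and synthetic data for the Contourlet/Ridgelet/Shearlet texture features and real MRI classes used by the authors, so the reported accuracy is meaningless; only the pipeline shape is illustrated.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

CLASSES = ["normal", "benign", "malignant", "severe"]

def texture_features(image, levels=4):
    """Very crude stand-in for transform-domain texture features:
    mean and standard deviation of a few coarse sub-regions."""
    feats = []
    h, w = image.shape
    for k in range(1, levels + 1):
        band = image[: h // k, : w // k]
        feats += [band.mean(), band.std()]
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in data: 200 random "slices" with random labels.
    X = np.array([texture_features(rng.random((128, 128))) for _ in range(200)])
    y = rng.integers(0, len(CLASSES), size=200)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    for name, clf in [("Naive Bayes", GaussianNB()),
                      ("KNN", KNeighborsClassifier(n_neighbors=5)),
                      ("SVM", SVC(kernel="rbf"))]:
        clf.fit(X_tr, y_tr)
        print(name, "accuracy:", round(clf.score(X_te, y_te), 3))
```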

Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.

792 Design of Composite Risers for Minimum Weight

Authors: Chunguang Wang, Krishna Shankar, Evgeny V. Morozov

Abstract:

The use of composite materials in offshore engineering for deep sea oil production riser systems has drawn considerable interest due to the potential weight savings and improvement in durability. The design of composite risers consists of two stages: (1) local design based on critical local load cases, and (2) global analysis of the full-length composite riser under global loads and assessment of critical locations. In the first stage, eight different material combinations were selected and their laminate configurations optimised under local load considerations. The second stage includes a final local stress analysis of the critical sections of the riser under the combined loads determined in the global analysis. This paper describes two design methodologies for the composite riser to provide minimum structural weight and shows that the use of off-angle fibre orientations, in addition to axial and hoop reinforcements, offers substantial weight savings while ensuring the structural capacity.

Keywords: Composite Riser, Composite Tubular, Finite Element Modelling, Global Design, Local Design, Offshore Engineering.

791 Effect of Drought Stress on Nitrogen Components in Corn

Authors: Masoud Rafiee, Fatemeh Abdipoor, Hosain Lari

Abstract:

An attempt was made to study the response of nitrogen components of corn (Zea mays L.) to drought stress. A field experiment was conducted in an RCBD with split-plot arrangement and four replications in Khorramabad, western Iran. Drought stress levels, defined as irrigation regimes after 75 (control), 100, and 120 (stress) mm of cumulative evaporation, were assigned to the main plots, and four corn varieties, namely 500 (medium maturity), 647, 700, and 704 (long maturity), were assigned to the subplots. Soluble protein, nitrate, and the amino acid proline were measured in the shoot and root at the flowering stage, and grain yield was measured at the harvesting stage. As the drought progressed, the amounts of nitrate and proline followed an increasing trend, but soluble protein decreased in shoot and root. The highest amounts of nitrate and proline were observed in the longer-maturity varieties, but the yield decrease of the long-maturity varieties was greater than that of the medium-maturity varieties under drought conditions, because of the longer duration of stress.

Keywords: Nitrate, Proline, Soluble protein, Yield

790 Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images

Authors: Ali Zifan, Panos Liatsis, Panagiotis Kantartzis, Manolis Gavaises, Nicos Karcanias, Demosthenes Katritsis

Abstract:

We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method, based on the information of the Hessian matrix, is introduced for the enhancement of the vascular structure. Hysteresis thresholding using different image quantiles is used to threshold the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines, combined with disparity map information, allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
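The Hessian-based enhancement stage can be sketched as a single-scale, Frangi-style vesselness filter. The Python example below is a generic illustration with arbitrarily chosen parameters, not the authors' multiscale formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness_2d(image, sigma=2.0, beta=0.5, c=15.0):
    """Frangi-style vesselness from the eigenvalues of the Gaussian Hessian.

    Bright tubular structures have one small and one large negative
    eigenvalue; the response combines a blobness ratio and structure energy.
    """
    # Second-order Gaussian derivatives (Hessian entries); axes are (row, col).
    hxx = gaussian_filter(image, sigma, order=(0, 2))
    hyy = gaussian_filter(image, sigma, order=(2, 0))
    hxy = gaussian_filter(image, sigma, order=(1, 1))
    # Eigenvalues of the symmetric 2x2 Hessian, sorted by magnitude (|l1| <= |l2|).
    tmp = np.sqrt(((hxx - hyy) / 2.0) ** 2 + hxy ** 2)
    mu = (hxx + hyy) / 2.0
    l1, l2 = mu + tmp, mu - tmp
    swap = np.abs(l1) > np.abs(l2)
    l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)
    rb = np.abs(l1) / (np.abs(l2) + 1e-10)      # blobness ratio
    s2 = l1 ** 2 + l2 ** 2                       # second-order structureness
    v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s2 / (2 * c ** 2)))
    v[l2 > 0] = 0                                # keep bright-on-dark ridges only
    return v

if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[30:34, :] = 100.0                        # a synthetic horizontal "vessel"
    resp = vesselness_2d(img)
    print("peak response on the vessel:", round(float(resp[31].max()), 3))
```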

Keywords: Vessel enhancement, centerline extraction, symbiotic reconstruction.

789 Green Product Design for Mobile Phones

Authors: İlke Bereketli, Müjde Erol Genevois, H. Ziya Ulukan

Abstract:

Nowadays, manufacturers are facing great challenges with regard to the production of green products due to the emerging issue of hazardous substance management (HSM). In particular, environmental legislation pressures have led to increased risk, manufacturing complexity, and demand for green components. Green principles have expanded to many departments within the organization, including the supply chain, and green supply chain management (GSCM) has emerged in the last few years. This idea covers every stage in manufacturing, from the first to the last stage of the life cycle. From the product life cycle perspective, the cycle starts at the design of a product. QFD is a customer-driven product development tool, considered a structured management approach for efficiently translating customer needs into design requirements and parts deployment, as well as manufacturing plans and controls, in order to achieve higher customer satisfaction. This paper develops an Eco-QFD to provide a framework for designing an eco-mobile phone by integrating life cycle analysis (LCA) into QFD throughout the entire product development process.

Keywords: Eco-design, Eco-QFD, EEE, Environmental New Product Development, Mobile Phone.

788 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters

Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev

Abstract:

Humanity is faced more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain early signals about events that are occurring or may occur, and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the Internet are developed; information in Romanian is of special interest to us. In order to obtain the mentioned tools, several steps are followed, divided into a preparatory stage and a processing stage. Throughout the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, amounting to more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the Petri net formalism has been used; here we deal with the problem of inhabitants’ evacuation in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique are used to determine the dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, have been used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.
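As a minimal illustration of the Petri net formalism used for the evacuation model, the Python sketch below implements the basic token game (transition enabling, firing, and breadth-first reachability) on a toy evacuation net. Timing, stochastic rates, and the PIPE/GSPN analyses of the paper are outside its scope.

```python
from collections import deque

def enabled(marking, pre):
    """Transitions whose input places all hold enough tokens."""
    return [t for t, needs in pre.items()
            if all(marking[p] >= n for p, n in needs.items())]

def fire(marking, t, pre, post):
    """Fire transition t: consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, n in pre[t].items():
        m[p] -= n
    for p, n in post[t].items():
        m[p] = m.get(p, 0) + n
    return m

def reachability(m0, pre, post, limit=1000):
    """Breadth-first enumeration of reachable markings (bounded nets only)."""
    seen = {tuple(sorted(m0.items()))}
    queue = deque([m0])
    while queue and len(seen) < limit:
        m = queue.popleft()
        for t in enabled(m, pre):
            m2 = fire(m, t, pre, post)
            key = tuple(sorted(m2.items()))
            if key not in seen:
                seen.add(key)
                queue.append(m2)
    return seen

if __name__ == "__main__":
    # Toy evacuation net: 3 people in a room move through a corridor to an exit.
    m0 = {"room": 3, "corridor": 0, "outside": 0}
    pre = {"leave_room": {"room": 1}, "reach_exit": {"corridor": 1}}
    post = {"leave_room": {"corridor": 1}, "reach_exit": {"outside": 1}}
    print(len(reachability(m0, pre, post)), "reachable markings")
```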

Keywords: Lexicon of disasters, modelling, Petri nets, text annotation, social disasters.

787 The Multimedia Interactive Theatre by Virtual Means Regarding Computational Intelligence in Space Design as HCI and Samples from Turkey

Authors: Pelin Yildiz

Abstract:

The aim of this study is to emphasize the opportunities in space design, under the aspect of HCI, for performance areas. HCI is a multidisciplinary approach that can be identified in many different areas. The aesthetic reflections of HCI through virtual reality in space design are high-tech solutions in which new innovations provide computational facilities with artistic features. The method of this paper is to treat the subject in three main parts: in the first part, a general approach to and definition of interactivity on the basis of space design; in the second part, the concept of the multimedia interactive theatre, with some chosen samples from around the world, and interactive design aspects; in the third part, samples from Turkey identified by stage design principles. In the results, it can be stated that the multimedia database is the virtual approach to theatre stage design, relying on interactive means and computational facilities according to aesthetic aspects. HCI is mostly identified in theatre stages as computational intelligence under the effect of interactivity.

Keywords: Computational intelligence, interactive space, multimedia theatre, virtual reality.

786 Six Sigma Solutions and its Benefit-Cost Ratio for Quality Improvement

Authors: S. Homrossukon, A. Anurathapunt

Abstract:

This applied research presents the improvement of production quality using six sigma solutions and an analysis of the benefit-cost ratio. The case of interest is the production of concrete tiles. This production had faced the problem of a high rate of nonconforming products caused by inappropriate surface coating, and had low process capability based on the strength property of the tile; surface coating and tile strength are the characteristics most critical to quality for this product. The improvement followed the five stages of the six sigma solution. After the improvement, the production yield was raised to the 80% target required, and the defective products from the coating process were remarkably reduced from 29.40% to 4.09%. The process capability based on strength quality was increased from 0.87 to 1.08, as the customer required. The improvement was able to save material losses of 3.24 million baht, or 0.11 million dollars. The benefits from the improvement were analyzed from (1) the reduction in the number of nonconforming tiles, valued at factory price, for the surface coating improvement, and (2) the materials saved from the increase in process capability. The benefit-cost ratio of the overall improvement was as high as 7.03. The investment was not yet profitable during the define, measure, analyze, and early improve stages, after which the ratio kept increasing. This is because there are no benefits in the define, measure, and analyze stages of six sigma, since these three stages mainly determine the cause of the problem and its effects rather than improve the process. The benefit-cost ratio starts to appear in the improve stage and grows from there. Within each stage, the individual benefit-cost ratio was much higher than the cumulative one, as costs accumulate from the first stage of six sigma. Considering the benefit-cost ratio during the improvement project helps in making cost-saving decisions for similar activities during the improvement and for new projects. In conclusion, determining the behavior of the benefit-cost ratio throughout the six sigma implementation period provides useful data for managing quality improvement with optimal effectiveness. This is an additional outcome beyond the regular results of six sigma.
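The behaviour of the cumulative benefit-cost ratio over the DMAIC stages can be reproduced with simple arithmetic. The short Python script below uses purely hypothetical stage costs and benefits (not the paper's figures) to show why the cumulative ratio stays at zero through define/measure/analyze and only grows once the improve stage delivers savings, and why each stage's individual ratio exceeds the cumulative one.

```python
# Hypothetical DMAIC stage costs and benefits (thousand baht), chosen only to
# illustrate how the cumulative benefit-cost ratio behaves over the project.
stages = ["Define", "Measure", "Analyze", "Improve", "Control"]
costs = [50, 80, 70, 200, 60]        # spending accumulates from the start
benefits = [0, 0, 0, 900, 2300]      # savings appear only from Improve onward

cum_cost = cum_benefit = 0.0
for stage, c, b in zip(stages, costs, benefits):
    cum_cost += c
    cum_benefit += b
    stage_bcr = b / c if c else float("inf")
    print(f"{stage:8s} stage B/C = {stage_bcr:6.2f}   "
          f"cumulative B/C = {cum_benefit / cum_cost:5.2f}")
```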

Keywords: Six Sigma Solutions, Process Improvement, Quality Management, Benefit-Cost Ratio

785 Kant’s Conception of Human Dignity and the Importance of Singularity within Commonality

Authors: Francisco Lobo

Abstract:

Kant’s household theory of human dignity as a common feature of all rational beings is the starting point of any intellectual endeavor to unravel the implications of this normative notion. Yet, it is incomplete, as it neglects considering the importance of the singularity or uniqueness of the individual. In a first, deconstructive stage, this paper describes the Kantian account of human dignity as one among many conceptions of human dignity. It reads carefully into the original wording used by Kant in German and its English translations, as well as the works of modern commentators, to identify its shortcomings. In a second, constructive stage, it then draws on the theories of Aristotle, Alexis de Tocqueville, John Stuart Mill, and Hannah Arendt to try and enhance the Kantian conception, in the sense that these authors give major importance to the singularity of the individual. The Kantian theory can be perfected by including elements from the works of these authors, while at the same time being mindful of the dangers entailed in focusing too much on singularity. The conclusion of this paper is that the Kantian conception of human dignity can be enhanced if it acknowledges that not only morality has dignity, but also the irreplaceable human individual to the extent that she is a narrative, original creature with the potential to act morally.

Keywords: Commonality, dignity, Kant, singularity.

784 Sustainable Development, China’s Emerging Role via One Belt, One Road

Authors: Saeid Rabiei Majd, Motahareh Alvandi, Mehrad Rabiei

Abstract:

The rapid economic and technological development of any country depends on access to cheap sources of energy, and competition for access to petroleum resources is always accompanied by numerous environmental risks. These factors have drawn more attention to environmental issues and sustainable development in petroleum contracts and activities. Nowadays, a sign of a developed country is adherence to the principles and rules of international environmental law and sustainable development in commercial contracts. China has entered this field through the massive One Belt, One Road initiative and is becoming a new emerging power in the world. China's bilateral investment treaties have an impact on environmental rights and sustainable development through regional and international foreign direct investment. The aim of this research is to examine China's key position in promoting and improving environmental principles, international law, and sustainable development in the energy sector worldwide through the One Belt, One Road initiative. Based on this hypothesis, it seems that in the near future China's bilateral investment treaties will become a popular investment model used in global trade, especially in the field of energy and sustainable development, replacing the European and American models. The research method includes a literature review and analytical and descriptive methods.

Keywords: Principles of sustainable development, oil and gas law, China's BITs, One Belt One Road, environmental rights.

783 Optimization of Two-Stage Pretreatment Combined with Microwave Radiation Using Response Surface Methodology

Authors: Jidapa Manaso, Apanee Luengnaruemitchai, Sujitra Wongkasemjit

Abstract:

Pretreatment is an essential step in the conversion of lignocellulosic biomass to the fermentable sugar used for biobutanol production. Among pretreatment processes, microwave pretreatment is considered to improve efficiency due to its high heating efficiency, easy operation, and ease of combination with chemical reactions. The main objectives of this work are to investigate the feasibility of microwave pretreatment to enhance the enzymatic hydrolysis of corncobs and to determine the optimal conditions using response surface methodology. Corncobs were pretreated via a two-stage pretreatment in dilute sodium hydroxide (2%) followed by dilute sulfuric acid (1%). Pretreated corncobs were subjected to enzymatic hydrolysis to produce reducing sugar. A statistical experimental design was used to optimize the pretreatment parameters, including temperature, residence time, and solid-to-liquid ratio, to achieve the highest amount of glucose. The results revealed that the solid-to-liquid ratio and temperature had a significant effect on the amount of glucose.
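Response surface methodology essentially fits a second-order polynomial to the experimental runs and locates its stationary point. The Python sketch below does this for two coded factors on synthetic data; the factor names, ranges, and response values are assumptions for illustration only, not the corncob pretreatment data.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Full second-order RSM model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Synthetic runs: coded "temperature" and "residence time" in [-1, 1], with
    # a made-up "glucose yield" response peaking near (0.3, -0.2).
    X = rng.uniform(-1, 1, size=(20, 2))
    y = (50 - 8 * (X[:, 0] - 0.3) ** 2 - 5 * (X[:, 1] + 0.2) ** 2
         + rng.normal(0, 0.5, 20))
    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    # Stationary point of the fitted surface: solve grad = 0 for [x1, x2].
    b1, b2, b12, b11, b22 = beta[1], beta[2], beta[3], beta[4], beta[5]
    A = np.array([[2 * b11, b12], [b12, 2 * b22]])
    x_opt = np.linalg.solve(A, -np.array([b1, b2]))
    print("fitted optimum (coded units):", np.round(x_opt, 2))
```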

Keywords: Corncobs, Microwave radiation, Pretreatment, Response Surface Methodology.

782 Three-Stage Mining Metals Supply Chain Coordination and Product Quality Improvement with Revenue Sharing Contract

Authors: Hamed Homaei, Iraj Mahdavi, Ali Tajdin

Abstract:

One of the main concerns of miners is to increase the quality level of their products, because the price of mining metals depends on their quality level; however, increasing the quality level of these products carries different costs at different levels of the supply chain, and these costs usually increase beyond the extractor level. This paper studies the coordination of a decentralized three-level supply chain with one supplier (extractor), one mineral processor, and one manufacturer, in which the cost of increasing the product quality level at the processor is higher than at the supplier, and at the manufacturer it is higher than at the processor. We identify the optimal product quality level for each supply chain member by designing a revenue sharing contract. Finally, numerical examples show that the designed contract not only increases the final product quality level but also provides a win-win condition for all supply chain members and increases the whole supply chain profit.

Keywords: Three-stage supply chain, product quality improvement, channel coordination, revenue sharing.

781 Finite Element Simulation of Multi-Stage Deep Drawing Processes and Comparison with Experimental Results

Authors: A. Pourkamali Anaraki, M. Shahabizadeh, B. Babaee

Abstract:

The plastic forming of sheet metal occupies an important place in metal forming. The traditional tool design techniques for sheet forming operations used in industry are experimental and expensive. Prediction of the forming results and determination of the punch force, blank holder forces, and thickness distribution of the sheet metal will decrease the production cost and time of the material to be formed. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. The entire production sequence, with additional operations such as intermediate annealing and springback, has been simulated with ABAQUS under axisymmetric conditions. Simulation results such as the sheet thickness distribution, punch force, and residual stresses were extracted at each stage, and the sheet thickness distribution was compared with experimental results. Through comparison of the results, the FE model was found to be in close agreement with experiment.

Keywords: Deep drawing, Finite element method, Simulation.

780 A 24-Bit, 8.1-MS/s D/A Converter for Audio Baseband Channel Applications

Authors: N. Ben Ameur, M. Loulou

Abstract:

This paper studies the high-level modelling and design of delta-sigma (ΔΣ) noise shapers for an audio digital-to-analog converter (DAC), so as to eliminate the in-band signal-to-noise ratio (SNR) degradation that accompanies channel mismatch in the audio signal. The converter combines cascaded digital signal interpolation with a single-loop noise-shaping delta-sigma modulator using a 5-bit quantizer in the final stage. To reduce sensitivity to the nonlinearities of the last-stage DAC, a high-pass second-order data weighted averaging (R2DWA) scheme is introduced. This paper presents a MATLAB modelling approach for the proposed DAC architecture, with low-distortion and swing-suppression integrator designs. The ΔΣ modulator can be configured as third-order and allows 24-bit PCM at a sampling rate of 64 kHz for Digital Video Disc (DVD) audio applications. The modelling approach provides 139.38 dB of dynamic range over a 32 kHz signal band at a -1.6 dBFS input signal level.
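A heavily simplified illustration of single-loop, multibit delta-sigma noise shaping is given below in Python: a first-order loop with a 32-level (5-bit-like) quantizer rather than the third-order modulator, interpolation chain, and R2DWA of the paper. It only demonstrates that the quantization error energy is pushed out of the signal band.

```python
import numpy as np

def first_order_dsm(x, levels=32):
    """First-order single-loop delta-sigma modulator with a multibit quantizer.

    x      : input samples, roughly in [-1, 1]
    levels : number of quantizer levels (32 corresponds to a 5-bit quantizer)
    The quantization error is noise-shaped: the feedback loop pushes it
    towards high frequencies, out of the signal band.
    """
    step = 2.0 / (levels - 1)
    integ, prev_out = 0.0, 0.0
    y = np.empty_like(x)
    for n, xn in enumerate(x):
        integ += xn - prev_out                       # integrate the loop error
        code = np.clip(np.rint(integ / step), -(levels // 2), levels // 2 - 1)
        y[n] = prev_out = step * float(code)         # multibit DAC feedback
    return y

if __name__ == "__main__":
    fs, f0, n = 64_000, 1_000, 8192                  # oversampled 1 kHz tone
    t = np.arange(n) / fs
    x = 0.5 * np.sin(2 * np.pi * f0 * t)
    err = first_order_dsm(x) - x
    spec = np.abs(np.fft.rfft(err)) ** 2
    freqs = np.fft.rfftfreq(n, 1 / fs)
    inband = freqs <= 4_000                          # a narrow "signal band"
    print("fraction of error energy inside the band:",
          round(float(spec[inband].sum() / spec.sum()), 4))
```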

Keywords: DVD-audio, DAC, Interpolator and Interpolation Filter, Single-Loop ΔΣ Modulation, R2DWA, Clock Jitter

779 The Effect of Complementary Irrigation in Different Growth Stages on Yield, Qualitative and Quantitative Indices of the Two Wheat (Triticum aestivum L.) Cultivars in Mazandaran

Authors: Abbas Ghanbari-Malidarreh

Abstract:

In most moderate wheat-growing regions, and especially in the climate of northern Iran, grain filling is affected by several physical and abiotic stresses. In this region, grain filling often occurs when temperatures are increasing and moisture supply is decreasing. The experiment was designed in an RCBD with split-plot arrangement and four replications. Four irrigation treatments were placed in the main plots: (I0) no irrigation (check); (I1) one irrigation (50 mm) at the heading stage; (I2) two irrigations (100 mm) at the heading and anthesis stages; and (I3) three irrigations (150 mm) at the heading, anthesis, and early grain filling stages. Two wheat cultivars (Milan and Shanghai) were grown in the subplots. Total rainfall was 453 mm during the growing season. The results indicated that biological yield, grain yield, and harvest index were significantly affected by the irrigation levels. The I3 treatment produced more tillers per m2, more fertile tillers per m2, and higher harvest index and biological yield. Milan produced more tillers and fertile tillers per m2, while Shanghai produced heavier tillers and higher 1000-grain weight. Plant height differed significantly between the wheat varieties but not among the irrigation levels. Milan produced higher grain yield, harvest index, and biological yield. Grain yields showed that I1, I2, and I3 produced increases to 5228 (21%), 5460 (27%), and 5670 (29%) kg ha-1, respectively. There was an interaction of irrigation and cultivar on grain yield. The absence of irrigation reduced the 1000-grain weight from 45 to 40 g and reduced soil moisture extraction during the grain filling stage. Current assimilation, as a source of carbon for grain filling, depends on the light-intercepting viable green surfaces of the plant after anthesis, which decline due to natural senescence and the effect of various stresses, at the same time as the demand by the growing grain is increasing. It is concluded from this work that, under irrigation, the Milan cultivar could increase grain yield in comparison with the Shanghai cultivar, although the grain yield of Shanghai under irrigation was only slightly lower than that of Milan. Grain yield was also related to weather conditions, sowing date, plant density, location, and fertilizer management, because there was no significant difference in biological and straw yield. The best result was produced by the I1 treatment; I2 and I3 were not significantly different from I1. The grain yield of I1 indicated that the wheat was under soil moisture deficiency; therefore, I1 irrigation was better than I0.

Keywords: anthesis, grain yield, irrigation, supplementary, Wheat.

778 Embedded Semi-Fragile Signature Based Scheme for Ownership Identification and Color Image Authentication with Recovery

Authors: M. Hamad Hassan, S.A.M. Gilani

Abstract:

In this paper, a novel scheme is proposed for ownership identification and color image authentication by deploying cryptography and digital watermarking. The color image is first transformed from RGB to the YST color space, exclusively designed for watermarking. Following the color space transformation, each channel is divided into 4×4 non-overlapping blocks, with selection of the central 2×2 sub-blocks. Depending upon the channel selected, two to three LSBs of each central 2×2 sub-block are set to zero to hold the ownership, authentication, and recovery information. The size and position of the sub-block are important for correct localization, enhanced security, and fast computation. As YS ⊥ T, it is suitable to embed the recovery information apart from the ownership and authentication information; therefore, the 4×4 block of the T channel, along with the ownership information, is processed by SHA-160 to compute a content-based hash that is unique and invulnerable to birthday attacks or hash collisions, instead of using MD5, which may raise the condition H(m) = H(m'). For recovery, the intensity mean of the 4×4 block of each channel is computed and encoded in up to eight bits. For watermark embedding, key-based mapping of blocks is performed using a 2D torus automorphism. Our scheme is oblivious, generates highly imperceptible images with correct localization of tampering within reasonable time, and has the ability to recover the original work with probability near one.
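The hash-then-embed idea can be illustrated on a single 4×4 block: the Python sketch below computes SHA-1 (SHA-160) over the block with its two LSBs masked out plus an owner string, writes eight signature bits into the two LSBs of the central 2×2 sub-block, and verifies them. The YST transform, recovery payload, and 2D torus key mapping of the full scheme are deliberately omitted, and all sizes are illustrative.

```python
import hashlib
import numpy as np

def block_signature(block, owner_id, n_bits=8):
    """SHA-1 over the block content (two LSBs cleared) plus the owner ID.

    Clearing the LSBs first means the signature can be re-derived after
    embedding, which is what makes later authentication possible.
    """
    payload = (block & 0xFC).tobytes() + owner_id.encode()
    digest = hashlib.sha1(payload).digest()
    return np.unpackbits(np.frombuffer(digest, dtype=np.uint8))[:n_bits]

def embed_block(block, owner_id):
    """Write 8 signature bits into the 2 LSBs of the central 2x2 sub-block."""
    out = block.copy()
    bits = block_signature(block, owner_id)
    centre = out[1:3, 1:3].reshape(-1)           # four central pixels (copy)
    for k, px in enumerate(centre):
        two_bits = (bits[2 * k] << 1) | bits[2 * k + 1]
        centre[k] = (px & 0xFC) | two_bits
    out[1:3, 1:3] = centre.reshape(2, 2)
    return out

def verify_block(block, owner_id):
    """Re-derive the signature and compare it with the embedded bits."""
    centre = block[1:3, 1:3].reshape(-1)
    embedded = np.concatenate([[(px >> 1) & 1, px & 1] for px in centre])
    return bool(np.array_equal(embedded, block_signature(block, owner_id)))

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    blk = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
    marked = embed_block(blk, owner_id="owner-42")
    print("authentic:", verify_block(marked, "owner-42"))
    marked[0, 0] ^= 0x10                         # tamper with one pixel
    print("after tampering:", verify_block(marked, "owner-42"))
```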

Keywords: Hash Collision, LSB, MD5, PSNR, SHA160

777 Emergency Response Plan Establishment and Computerization through the Analysis of the Disasters Occurring on Long-Span Bridges by Type

Authors: Sungnam Hong, Sun-Kyu Park, Dooyong Cho, Jinwoong Choi

Abstract:

In this paper, a strategy for long-span bridge disaster response was developed, divided into risk analysis, business impact analysis, and an emergency response plan. At the risk analysis stage, the critical risk was estimated; the critical risk was "car accident." The critical process for each critical-risk classification was assessed at the business impact analysis stage; the critical process was the task related to road conditions and traffic safety. Based on the results of the preceding analyses, an emergency response plan was established. By making the order of the standard operating procedures clear, an effective plan for dealing with disaster was formulated. Finally, prototype software was developed based on the research findings. This study laid the foundation of an information-technology-based disaster response guideline and is significant in that it computerized the disaster response plan to improve the plan's accessibility.

Keywords: Emergency response; Long-span bridge; Disaster management; Standard operating procedure; Ubiquitous.

776 Multi Switched Split Vector Quantization of Narrowband Speech Signals

Authors: M. Satya Sai Ram, P. Siddaiah, M. Madhavi Latha

Abstract:

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called multi switched split vector quantization (MSSVQ), which is a hybrid of multi-stage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ), and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity, and lower memory requirements than all of the above-mentioned product code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.
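A bare-bones split vector quantizer, the building block that MSSVQ extends with switching and multiple stages, can be sketched as below in Python using k-means codebooks per sub-vector. The split sizes, bit allocations, and synthetic stand-in "LSF" data are assumptions; real LSF vectors would come from LPC analysis, and spectral distortion would replace the plain squared error shown here.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_split_vq(vectors, splits, bits_per_part):
    """Split VQ: partition each vector into sub-vectors and train one
    codebook (2**bits entries) per part with k-means."""
    codebooks, start = [], 0
    for size, bits in zip(splits, bits_per_part):
        part = vectors[:, start:start + size]
        km = KMeans(n_clusters=2 ** bits, n_init=4, random_state=0).fit(part)
        codebooks.append((start, size, km.cluster_centers_))
        start += size
    return codebooks

def quantize(vector, codebooks):
    """Encode/decode a vector with the per-part codebooks."""
    out = np.empty_like(vector)
    for start, size, centers in codebooks:
        part = vector[start:start + size]
        idx = np.argmin(np.linalg.norm(centers - part, axis=1))
        out[start:start + size] = centers[idx]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    # Synthetic stand-in for 10-dimensional, monotonically ordered LSF vectors.
    train = np.sort(rng.random((2000, 10)), axis=1)
    test = np.sort(rng.random((50, 10)), axis=1)
    cbs = train_split_vq(train, splits=[3, 3, 4], bits_per_part=[6, 6, 6])
    err = np.mean([np.mean((v - quantize(v, cbs)) ** 2) for v in test])
    print("mean squared LSF error:", round(float(err), 5))
```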

Keywords: Linear Predictive Coding, Multi Stage Vector Quantization, Switched Split Vector Quantization, Split Vector Quantization, Line Spectral Frequencies (LSF).

775 A Hybrid Classification Method using Artificial Neural Network Based Decision Tree for Automatic Sleep Scoring

Authors: Haoyu Ma, Bin Hu, Mike Jackson, Jingzhi Yan, Wen Zhao

Abstract:

In this paper, we propose a new classification method for automatic sleep scoring using an artificial neural network based decision tree. It treats the sleep scoring process as a series of two-class problems and solves them with a decision tree made up of a group of neural network classifiers, each of which uses a special feature set and is aimed at only one specific sleep stage in order to maximize the classification effect. A single electroencephalogram (EEG) signal is used for our analysis rather than multiple biological signals, which greatly simplifies the data acquisition process. Experimental results demonstrate that the average epoch-by-epoch agreement between the visual and the proposed method in separating 30 s wakefulness+S1, REM, S2, and SWS epochs was 88.83%. This study shows that the proposed method performed well in all four stages and can effectively limit error propagation at the same time. It could, therefore, be an efficient method for automatic sleep scoring. Additionally, since it requires only a small volume of data, it could be suited to pervasive applications.
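The decision-tree-of-binary-classifiers idea can be sketched as a cascade in which each node peels off one sleep stage. The Python example below uses small scikit-learn MLPs on synthetic per-epoch features; the stage ordering, feature sets, and network sizes are placeholders rather than the authors' design.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

STAGES = ["W+S1", "REM", "S2", "SWS"]

class SleepStageCascade:
    """Decision tree of binary neural-network classifiers: each node decides
    one stage, and remaining epochs are passed to the next node."""

    def __init__(self):
        self.nodes = [(stage, MLPClassifier(hidden_layer_sizes=(16,),
                                            max_iter=2000, random_state=0))
                      for stage in STAGES[:-1]]      # the last stage is the fallback

    def fit(self, X, y):
        mask = np.ones(len(y), dtype=bool)
        for stage, clf in self.nodes:
            clf.fit(X[mask], (y[mask] == stage).astype(int))
            mask &= (y != stage)                     # later nodes never see this stage
        return self

    def predict(self, X):
        labels = np.full(len(X), STAGES[-1], dtype=object)
        undecided = np.ones(len(X), dtype=bool)
        for stage, clf in self.nodes:
            if not undecided.any():
                break
            hit = undecided.copy()
            hit[undecided] = clf.predict(X[undecided]) == 1
            labels[hit] = stage
            undecided &= ~hit
        return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic per-epoch EEG features; real features would be computed
    # from 30 s EEG epochs (e.g., band powers).
    X = rng.random((400, 6))
    y = rng.choice(STAGES, size=400)
    model = SleepStageCascade().fit(X[:300], y[:300])
    acc = np.mean(model.predict(X[300:]) == y[300:])
    print("epoch-by-epoch agreement on synthetic data:", round(float(acc), 3))
```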

Keywords: Sleep, Sleep stage, Automatic sleep scoring, Electroencephalography, Decision tree, Artificial neural network

774 Motion Prediction and Motion Vector Cost Reduction during Fast Block Motion Estimation in MCTF

Authors: Karunakar A K, Manohara Pai M M

Abstract:

In the 3D-wavelet video coding framework, temporal filtering is done along the trajectory of motion using motion compensated temporal filtering (MCTF). Hence, a computationally efficient motion estimation technique is needed for MCTF. In this paper, a predictive technique is proposed to reduce the computational complexity of the MCTF framework by exploiting the high correlation among the frames in a group of pictures (GOP). The proposed technique applies the coarse and fine searches of any fast block-based motion estimation only to the first pair of frames in a GOP. The generated motion vectors are supplied to the subsequent frames, even at subsequent temporal levels, and only a fine search is carried out around these predicted motion vectors. Hence, the coarse search is skipped for all motion estimation in a GOP except for the first pair of frames. The technique has been tested with different fast block-based motion estimation algorithms on different standard test sequences using MC-EZBC, a state-of-the-art scalable video coder. The simulation results reveal a substantial reduction (20.75% to 38.24%) in the number of search points during motion estimation, without compromising the quality of the reconstructed video compared to non-predictive techniques. Since the motion vectors of all pairs of frames in a GOP except the first pair lie within ±1 of the motion vectors of the previous pair of frames, the number of bits required for motion vectors is also reduced by 50%.
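A toy version of the predictive scheme is sketched below in Python: an exhaustive SAD block search is run only for the first frame pair of the GOP, and every later pair merely refines the inherited motion vectors within ±1. The block size, search radii, and synthetic drifting frames are illustrative; the actual MCTF lifting and the specific fast search algorithms are not modelled.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return float(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def block_search(cur, ref, r0, c0, bs, center, radius):
    """Best motion vector for one block by SAD within +/- radius of `center`."""
    best, best_mv = None, (0, 0)
    block = cur[r0:r0 + bs, c0:c0 + bs]
    for dr in range(center[0] - radius, center[0] + radius + 1):
        for dc in range(center[1] - radius, center[1] + radius + 1):
            r, c = r0 + dr, c0 + dc
            if 0 <= r <= ref.shape[0] - bs and 0 <= c <= ref.shape[1] - bs:
                cost = sad(block, ref[r:r + bs, c:c + bs])
                if best is None or cost < best:
                    best, best_mv = cost, (dr, dc)
    return best_mv

def gop_motion_estimation(frames, bs=8, coarse_radius=7, fine_radius=1):
    """Coarse+fine search only for the first frame pair of the GOP; every
    later pair refines the inherited motion vectors with a +/-1 fine search."""
    h, w = frames[0].shape
    mv_fields, prev_field = [], {}
    for k in range(1, len(frames)):
        radius = coarse_radius if k == 1 else fine_radius
        field = {}
        for r0 in range(0, h - bs + 1, bs):
            for c0 in range(0, w - bs + 1, bs):
                center = prev_field.get((r0, c0), (0, 0))
                field[(r0, c0)] = block_search(frames[k], frames[k - 1],
                                               r0, c0, bs, center, radius)
        mv_fields.append(field)
        prev_field = field
    return mv_fields

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    # A GOP whose content drifts one pixel to the right per frame.
    frames = [np.roll(base, shift=k, axis=1) for k in range(4)]
    fields = gop_motion_estimation(frames)
    print("a typical motion vector in the last pair:", fields[-1][(24, 24)])
```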

Keywords: Motion Compensated Temporal Filtering, predictive motion estimation, lifted wavelet transform, motion vector
