Search results for: priority ratio

1282 Taguchi Robust Design for Optimal Setting of Process Wastes Parameters in an Automotive Parts Manufacturing Company

Authors: Charles Chikwendu Okpala, Christopher Chukwutoo Ihueze

Abstract:

As a technique that reduces variation in a product by lessening the sensitivity of the design to sources of variation, rather than by controlling those sources, Taguchi Robust Design entails the design of ideal goods by developing a product that has minimal variance in its characteristics and also meets the desired exact performance. This paper examined the concept of the manufacturing approach and its application to a brake pad product of an automotive parts manufacturing company. Although the firm claimed that defects, excess inventory, and over-production were the only wastes that grossly affect its productivity and profitability, a careful study and analysis of its manufacturing processes with the application of the Single Minute Exchange of Dies (SMED) tool showed that the waste of waiting is a fourth waste that bedevils the firm. The selection of the Taguchi L9 orthogonal array, based on the four parameters and the three levels of variation for each parameter, revealed that, with a range of 2.17, waiting is the major waste that the company must reduce in order to remain viable. Also, to enhance the company's throughput and profitability, the wastes of over-production, excess inventory, and defects, with ranges of 2.01, 1.46, and 0.82 (ranking second, third, and fourth respectively), must also be reduced to the barest minimum. After proposing -33.84 as the highest optimum signal-to-noise ratio to be maintained for the waste of waiting, the paper advocated the adoption of all the tools and techniques of the Lean Production System (LPS) and Continuous Improvement (CI), and concluded by recommending SMED in order to drastically reduce setup time, which leads to unnecessary waiting.
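
For illustration, the following minimal Python sketch (not the paper's actual data) shows how a smaller-the-better signal-to-noise ratio and a factor's range (delta) are typically computed in a Taguchi analysis; the response values are hypothetical waiting times.

    import numpy as np

    def sn_smaller_the_better(y):
        """Smaller-the-better signal-to-noise ratio: SN = -10*log10(mean(y^2))."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    # Hypothetical responses (waiting time per run, arbitrary units) for one factor
    # at its three levels; real values would come from the company's process data.
    levels = {
        1: [48.0, 52.0, 50.0],
        2: [55.0, 58.0, 53.0],
        3: [60.0, 62.0, 59.0],
    }

    sn_by_level = {lvl: sn_smaller_the_better(obs) for lvl, obs in levels.items()}
    delta = max(sn_by_level.values()) - min(sn_by_level.values())

    for lvl, sn in sn_by_level.items():
        print(f"level {lvl}: S/N = {sn:.2f} dB")
    print(f"range (delta) for this factor = {delta:.2f} dB")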

Keywords: Taguchi Robust Design, signal to noise ratio, Single Minute Exchange of Dies, lean production system, waste.

1281 Temperature-Based Detection of Initial Yielding Point in Loading of Tensile Specimens Made of Structural Steel

Authors: Aqsa Jamil, Hiroshi Tamura, Hiroshi Katsuchi, Jiaqi Wang

Abstract:

The yield point represents the upper limit of force that can be applied to a specimen without causing any permanent deformation. After yielding, the behavior of the specimen changes suddenly, including the possibility of cracking or buckling, so the accumulation of damage or the type of fracture changes depending on this condition. As it is difficult to accurately detect the yield points of the several stress concentration points in structural steel specimens, an effort has been made in this research work to develop a convenient technique using thermography (temperature-based detection) during tensile tests for the precise detection of yield point initiation. To verify the applicability of the thermography camera, tests were conducted under different loading conditions, with deformation measured by installing various strain gauges and the surface temperature monitored with the help of the thermography camera. The yield point of the specimens was estimated with the help of the temperature dip which occurs, due to the thermoelastic effect, at the onset of plastic deformation. The scattering of the data was checked by performing a repeatability analysis. The effect of temperature imperfection and of the light source was checked by carrying out the tests in the daytime as well as at midnight; by calculating the signal-to-noise ratio (SNR) of the noised data from the infrared thermography camera, it can be concluded that the camera is independent of the testing time and of the presence of a visible light source. Furthermore, a fully coupled thermal-stress analysis was performed using the Abaqus/Standard exact implementation technique to validate the temperature profiles obtained from the thermography camera and to check the feasibility of numerical simulation for predicting the results extracted with the help of the thermographic technique.
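
As an illustration of the temperature-dip idea, the following Python sketch locates the minimum of a synthetic surface-temperature history and estimates a simple signal-to-noise ratio; the signal model, smoothing window and noise level are assumptions, not the authors' data or processing chain.

    import numpy as np

    # Synthetic surface-temperature history (deg C): thermoelastic cooling during
    # elastic loading, a dip near yield, then heating from plastic dissipation.
    t = np.linspace(0.0, 60.0, 601)                  # time, s
    temp = 25.0 - 0.01 * t                           # elastic cooling
    temp += 0.02 * np.clip(t - 30.0, 0.0, None)      # plastic heating after ~30 s
    temp += np.random.normal(0.0, 0.005, t.size)     # sensor noise

    dip_index = int(np.argmin(temp))                 # temperature dip ~ yield onset
    print(f"estimated yield initiation at t = {t[dip_index]:.1f} s")

    # Simple SNR estimate (dB): power of the smoothed trace vs. residual noise.
    kernel = np.ones(15) / 15.0
    smooth = np.convolve(temp, kernel, mode="same")
    noise = temp - smooth
    snr_db = 10.0 * np.log10(np.var(smooth) / np.var(noise))
    print(f"SNR ~ {snr_db:.1f} dB")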

Keywords: Signal to noise ratio, thermoelastic effect, thermography, yield point.

1280 A Case Study on Performance of Isolated Bridges under Near-Fault Ground Motion

Authors: Daniele Losanno, H. A. Hadad, Giorgio Serino

Abstract:

This paper presents a numerical investigation of the seismic performance of a benchmark bridge with different optimal isolation systems under near-fault ground motion. Usually, very large displacements make seismic isolation an unfeasible solution due to boundary conditions, especially in the case of existing bridges or high-risk seismic regions. Moreover, near-fault ground motions are most likely to affect either structures with a long natural period, like isolated structures, or structures sensitive to velocity content, such as viscously damped structures. The work is aimed at analyzing the seismic performance of a three-span continuous bridge designed with different isolation systems having different levels of damping. The case study was analyzed in different configurations: (a) simply supported, (b) isolated with lead rubber bearings (LRBs), (c) isolated with rubber isolators and 10% classical damping (HDLRBs), and (d) isolated with rubber isolators and a 70% supplemental damping ratio. Case (d) represents an alternative control strategy that combines the effect of seismic isolation with additional supplemental damping, seeking to take advantage of both solutions. The bridge is modeled in SAP2000 and solved by time-history direct-integration analyses under a set of six recorded near-fault ground motions. In addition, a set of analyses under the seismic action provided by the Italian code is also conducted, in order to evaluate the effectiveness of the suggested optimal control strategies under far-field seismic action. Results of the analysis demonstrate that an isolated bridge equipped with HDLRBs and a total equivalent damping ratio of 70% represents a very effective design solution for both mitigation of the displacement demand at the isolation level and base shear reduction in the piers, even in the case of near-fault ground motion.
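
The paper's analyses are carried out on a full bridge model in SAP2000; purely to illustrate how supplemental damping changes displacement demand, the following Python sketch integrates a single-degree-of-freedom isolator (assumed 3 s period) under an idealised acceleration pulse for damping ratios of 10% and 70%.

    import numpy as np
    from scipy.integrate import solve_ivp

    T = 3.0                          # assumed isolation period, s
    omega = 2.0 * np.pi / T

    def ag(t, tp=1.0, a0=4.0):
        """Idealised pulse-like ground acceleration (m/s^2): one sine cycle."""
        return a0 * np.sin(2.0 * np.pi * t / tp) if t <= tp else 0.0

    def peak_displacement(zeta):
        """Peak relative displacement of a linear SDOF isolator with damping ratio zeta."""
        def rhs(t, y):
            u, v = y
            return [v, -2.0 * zeta * omega * v - omega ** 2 * u - ag(t)]
        sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], max_step=0.005)
        return np.max(np.abs(sol.y[0]))

    for zeta in (0.10, 0.70):
        print(f"zeta = {zeta:.0%}: peak isolator displacement = {peak_displacement(zeta):.3f} m")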

Keywords: Isolated bridges, optimal design, near-fault motion, supplemental damping.

1279 Changing Patterns of Colorectal Cancer in Hail Region

Authors: Laila Salah Seada, Ashraf Ibrahim, Fawaz Al Rashid, Ihab Abdo, Hassan Kasim, Waleed Al Mansi, Saud Al Shabli

Abstract:

Background and Objectives: Colorectal carcinoma is increasing among both men and women worldwide. It has a multifactorial etiology including genetic factors, environmental factors, and inflammatory conditions of the digestive tract. A clinicopathologic assessment of colorectal carcinoma in the Hail region was carried out, considering any changing patterns between two 5-year periods, 2005-2009 (A) and 2012-2017 (B). All data were retrieved from the histopathology files of King Khalid Hospital, Hail. Results: During period (A), 75 cases were diagnosed as colorectal carcinoma. Male patients comprised 56/75 (74.7%) of the study, with a mean age of 58.4 (36-97) years, while females were 19/75 (25.3%), with a mean age of 50.3 (30-85) years, and the difference was significant (p = 0.05). The M:F ratio was 2.9:1. The most common histological type was adenocarcinoma, in 68/75 (90.7%) patients, mostly well differentiated (44/68, 64.7%). Mucinous neoplasms comprised only 7/75 (9.3%) of cases and tended to have a higher stage (p = 0.04). During period (B), 115 cases were diagnosed, an increase of 53.3% over period (A). The male to female ratio also decreased to 1.35:1, females being 44.83% more affected. Adenocarcinoma remained the prevalent type (93.9%), while the mucinous type was still rare (5.2%). No distant metastases were found at the time of presentation. Localization of tumors was rectosigmoid in group (A) in 41.4% of cases, which increased to 56.6% in group (B), an increase of 15.2%. The ileocecal location also decreased from 8% to 3.5%, being 56.25% less. Other proximal areas of the colon decreased by 25.75%, from 53.9% in group (A) to 40% in group (B). Conclusion: Colorectal carcinoma in the Hail region has increased by 53.3% in the past 5 years, with more females being diagnosed. Localization has also shifted distally by 15.2%. These findings differ from Western world patterns, which have seen a decrease in incidence and a proximal shift of colon cancer localization. This might be due to better diagnostic tools and population awareness of the disease, as well as changes in lifestyle and/or food habits in the region.

Keywords: Colorectal cancer, Hail Region, changing pattern, distal shift.

1278 Modern Seismic Design Approach for Buildings with Hysteretic Dampers

Authors: Vanessa A. Segovia, Sonia E. Ruiz

Abstract:

The use of energy dissipation systems for seismic applications has increased worldwide, and it is therefore necessary to develop practical and modern criteria for their optimal design. Here, a direct displacement-based seismic design approach for frame buildings with hysteretic energy dissipation systems (HEDS) is applied. The building is constituted by two individual structural systems: 1) a main elastic structural frame designed for service loads; and 2) a secondary system, corresponding to the HEDS, that controls the effects of lateral loads. The procedure involves controlling two design parameters: a) the stiffness ratio (α = Kframe/Ktotal system), and b) the strength ratio (γ = Vdamper/Vtotal system). The proposed damage-controlled approach contributes to the design of a more sustainable and resilient building because the structural damage is concentrated on the HEDS. The reduction of the design displacement spectrum is done by means of a recently published damping factor for elastic structural systems with HEDS located in Mexico City. Two limit states are verified: serviceability and near collapse. Instead of the traditional trial-and-error approach, a procedure that allows the designer to establish the preliminary sizes of the structural elements of both systems is proposed. The design methodology is applied to an 8-story steel building with buckling-restrained braces, located in soft soil of Mexico City. With the aim of choosing the optimal design parameters, a parametric study is developed considering different values of α and γ. The simplified methodology is intended for preliminary sizing, design, and evaluation of the effectiveness of HEDS, and it constitutes a modern and practical tool that enables the structural designer to select the best design parameters.

Keywords: Damage-controlled buildings, direct displacement-based seismic design, optimal hysteretic energy dissipation systems.

1277 Further the Effectiveness of Software Testability Measure

Authors: Liang Zhao, Feng Wang, Bo Deng, Bo Yang

Abstract:

Software testability has been proposed to address the problems of the increasing cost of testing and of software quality. A testability measure provides a quantified way to denote the testability of software. Since the 1990s, many testability measure models have been proposed to address this problem. By discussing the contradiction between domain testability and the domain-range ratio (DRR), a new testability measure, semantic fault distance, is proposed, and its validity is discussed.
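
As a reminder of the quantity being discussed, the domain-range ratio (DRR) of a function is commonly taken as the cardinality of its input domain divided by that of its output range; the following Python sketch computes it for a small, exhaustively enumerable example (the function and domain are hypothetical).

    from itertools import product

    def drr(func, domain):
        """Domain-range ratio: |domain| / |range| for an enumerable domain."""
        outputs = {func(*args) for args in domain}
        return len(domain) / len(outputs)

    # Integer division over a small domain loses information (high DRR), the classic
    # argument that functions with a large DRR tend to hide faults and are harder to test.
    domain = list(product(range(0, 100), range(1, 10)))
    print(f"DRR of integer division on this domain: {drr(lambda a, b: a // b, domain):.2f}")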

Keywords: Software testability, DRR, Domain testability.

1276 Enhancing Performance of Bluetooth Piconets Using Priority Scheduling and Exponential Back-Off Mechanism

Authors: Dharmendra Chourishi “Maitraya”, Sridevi Seshadri

Abstract:

Bluetooth is a personal wireless communication technology being applied in many scenarios. It is an emerging standard for short-range, low-cost, low-power wireless access technology. Existing MAC (Medium Access Control) scheduling schemes only provide best-effort service for all master-slave connections. It is very challenging to provide QoS (Quality of Service) support for different connections due to the Master-Driven TDD (Time Division Duplex) feature, and there is no solution available that supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses the issue of how to enhance QoS support in a Bluetooth piconet. The Bluetooth specification proposes a Round Robin scheduler as a possible solution for scheduling transmissions in a Bluetooth piconet. We propose an algorithm which reduces bandwidth waste and enhances the efficiency of the network. We define token counters to estimate the traffic of real-time slaves. To increase bandwidth utilization, a back-off mechanism is then presented for best-effort slaves to decrease the frequency of polling idle slaves. Simulation results demonstrate that our scheme achieves better performance than Round Robin scheduling.
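
The following Python sketch is not the authors' algorithm but illustrates the two ingredients described above: token counters that give real-time slaves polling priority, and an exponential back-off that reduces the polling frequency of idle best-effort slaves; the traffic model and constants are placeholders.

    import random

    class Slave:
        def __init__(self, name, real_time=False, rate=0.0):
            self.name = name
            self.real_time = real_time
            self.tokens = 0.0      # traffic credit estimated for real-time slaves
            self.rate = rate       # tokens accumulated per polling cycle
            self.backoff = 1       # polling interval (cycles) for best-effort slaves
            self.wait = 0

        def has_data(self):
            return random.random() < 0.3   # placeholder traffic model

    def poll_cycle(slaves):
        for s in slaves:                   # real-time slaves first, by token credit
            if s.real_time:
                s.tokens += s.rate
                if s.tokens >= 1.0:
                    s.tokens -= 1.0
                    print(f"poll {s.name} (real-time)")
        for s in slaves:                   # best-effort slaves with exponential back-off
            if s.real_time:
                continue
            if s.wait > 0:
                s.wait -= 1
                continue
            if s.has_data():
                s.backoff = 1              # reset back-off after a useful poll
                print(f"poll {s.name} (best-effort)")
            else:
                s.backoff = min(s.backoff * 2, 32)   # double the interval when idle
            s.wait = s.backoff - 1

    piconet = [Slave("S1", real_time=True, rate=0.5), Slave("S2"), Slave("S3")]
    for _ in range(10):
        poll_cycle(piconet)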

Keywords: Piconet, Medium Access Control, Polling algorithm, Scheduling, QoS, Time Division Duplex (TDD).

1275 Study of the Azo Hydrazone Tautomerism in the 4-(9-Anthrylazo) Phenol

Authors: Ramadan Ali Bawa, Ebtisam Mohammed Alzaraide

Abstract:

The spectroscopic study of 4-(9-anthrylazo)phenol has revealed that the azo dye under study exists in two tautomeric forms, the azo phenol and the hydrazo keto forms, in a ratio of almost 1:1. The azo hydrazone tautomerism was confirmed by the use of IR spectroscopy and 1H NMR, in which the characteristic absorption bands and chemical shifts for both tautomers were assigned.

Keywords: Spectroscopic, tautomeric forms, azo hydrazone tautomerism, IR spectroscopy, HNMR.

1274 Market Feasibility for New Brand Coffee House: The Case Study of Thailand

Authors: Pongsiri K.

Abstract:

This research aimed to study the market feasibility for a new brand coffee house, taking Thailand as a case study. The study is mixed-methods research combining quantitative and qualitative approaches. For primary data, 350 questionnaires were distributed, and 320 high-quality completed questionnaires were returned. The research samples were identified as customers of high-end department stores in Thailand. The secondary data were critically selected from highly reliable sources, both from the public and private sectors. The results were used to classify the customers into two main groups, those younger than 25 and those older than 25 years old. The younger group gives priority to the coffee house and its services dimension above the others, followed by the branding dimension and the product dimension, respectively. The older group, on the other hand, gives a different result, rating branding first, then the coffee house and its services, and then the product, respectively. Coffee consumption is not just a trend; it has become part of people's lifestyle, and new cultures are also created by wise business people. Coffee has long been produced and consumed in Thailand, yet, surprisingly, the high-end brand coffee houses in the Thai market are mostly imported brands. The business possibility of a Thai-brand coffee house in the Thai market is discussed in the paper.

Keywords: Coffee house, café, coffee consuming, new entry branding, market feasibility.

1273 An Algorithm of Finite Capacity Material Requirement Planning System for Multi-stage Assembly Flow Shop

Authors: T. Wuttipornpun, U. Wangrakdiskul, W. Songserm

Abstract:

This paper aims to develop an algorithm for a finite capacity material requirement planning (FCMRP) system for a multi-stage assembly flow shop. The developed FCMRP system has two main stages. The first stage allocates operations to the first- and second-priority work centers and also determines the sequence of the operations on each work center. The second stage determines the optimal start time of each operation by using a linear programming model. Real data from a factory are used to analyze and evaluate the effectiveness of the proposed FCMRP system and also to guarantee a practical solution for the user. There are five performance measures, namely, the total tardiness, the number of tardy orders, the total earliness, the number of early orders, and the average flow time. The proposed FCMRP system offers an adjustable solution, which is a compromise among the conflicting performance measures. The user can adjust the weight of each performance measure to obtain the desired performance. The results show that the combination of FCMRP NP3 and EDD outperforms the other combinations in terms of the overall performance index. The calculation time for the proposed FCMRP system is about 10 minutes, which is practical for the planners of the factory.
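
As an illustration of how an adjustable, weighted overall performance index can compare FCMRP variants, the following Python sketch combines the five measures with user-chosen weights; the variant names, measure values and weights are hypothetical, not the paper's results.

    # Hypothetical performance measures for two FCMRP variants (lower is better).
    variants = {
        "NP3 + EDD": {"tardiness": 120, "tardy": 4, "earliness": 300, "early": 9, "flow_time": 52},
        "NP1 + SPT": {"tardiness": 200, "tardy": 7, "earliness": 180, "early": 6, "flow_time": 47},
    }
    # User-adjustable weights, one per performance measure (assumed values).
    weights = {"tardiness": 0.4, "tardy": 0.2, "earliness": 0.1, "early": 0.1, "flow_time": 0.2}

    def overall_index(measures):
        """Weighted sum of measures, each normalised by its worst value across variants."""
        worst = {k: max(v[k] for v in variants.values()) for k in weights}
        return sum(weights[k] * measures[k] / worst[k] for k in weights)

    for name, m in variants.items():
        print(f"{name}: overall performance index = {overall_index(m):.3f}")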

Keywords: Material requirement planning, Finite capacity, Linear programming, Permutation, Application in industry.

1272 Distribution Sampling of Vector Variance without Duplications

Authors: Erna T. Herdiani, Maman A. Djauhari

Abstract:

In recent years, the use of vector variance as a measure of multivariate variability has received much attention in a wide range of statistics. This paper deals with a more economic measure of multivariate variability, defined as the vector variance minus all duplicated elements. For high-dimensional data, this increases the computational efficiency by almost 50% compared to the original vector variance. Its sampling distribution is investigated to make its applications possible.
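
One common definition takes the vector variance as the sum of squares of all elements of the covariance matrix; under that reading, removing the duplications amounts to keeping only one triangle of the (symmetric) matrix. The following Python sketch illustrates this interpretation and is not taken from the paper.

    import numpy as np

    def vector_variance(cov):
        """Classical vector variance: sum of squares of all covariance-matrix elements."""
        return float(np.sum(cov ** 2))

    def vector_variance_no_dup(cov):
        """Vector variance without duplications: lower triangle (incl. diagonal) only."""
        return float(np.sum(np.tril(cov) ** 2))

    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 5))
    cov = np.cov(x, rowvar=False)
    print(vector_variance(cov), vector_variance_no_dup(cov))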

Keywords: Asymptotic distribution, covariance matrix, likelihood ratio test, vector variance.

1271 Magnetization of Thin-Film Permalloy Ellipses used for Programmable Motion of Magnetic Particles

Authors: P. Warnicke

Abstract:

Simulations of the magnetic microstructure in elliptical Permalloy elements used for the controlled motion of magnetic particles are discussed. The saturating field of the elliptical elements was studied with respect to the lateral dimensions for one-vortex, cross-tie, diamond, and double-diamond states as initial zero-field domain configurations. With an aspect ratio of 1:3, the short axis was varied from 125 nm to 1000 nm, whereas the thickness was kept constant at 50 nm.

Keywords: Domain structure, magnetization, micromagnetics, Permalloy.

1270 Many-Sided Self Risk Analysis Model for Information Asset to Secure Stability of the Information and Communication Service

Authors: Jin-Tae Lee, Jung-Hoon Suh, Sang-Soo Jang, Jae-Il Lee

Abstract:

Information and communication service providers (ICSP) that are significant in size and provide Internet-based services take administrative, technical, and physical protection measures via the information security check service (ISCS). These protection measures are the minimum action necessary to secure the stability and continuity of the information and communication services (ICS) that they provide. Thus, information assets are essential to providing ICS, and deciding the relative importance of target assets for protection is a critical procedure. The risk analysis model designed to decide the relative importance of information assets, which is described in this study, evaluates information assets from many angles, in order to choose which ones should be given priority when it comes to protection. Many-sided risk analysis (MSRS) grades the importance of information assets, based on evaluation of major security check items, evaluation of the dependency on the information and communication facility (ICF) and influence on potential incidents, and evaluation of major items according to their service classification, in order to identify the ISCS target. MSRS could be an efficient risk analysis model to help ICSPs to identify their core information assets and take information protection measures first, so that stability of the ICS can be ensured.

Keywords: Information asset, information communication facility (ICF), ISCS (Information Security Check Service), evaluation, grade.

1269 Long-term Flexural Behavior of HSC Beams

Authors: Andreea Muntean, Cornelia Măgureanu

Abstract:

This article presents an analysis of experimental values regarding the cracking pattern, specific strains, and deformability of reinforced high strength concrete beams. The beams are of concrete class C80/95 and have longitudinal reinforcement ratios of 2.01% and 3.39%, respectively. The elements were subjected to flexure under static short-term and long-term loading. The experimental values are compared with values calculated using the design relationships of Eurocode 2.

Keywords: High strength concrete, beams, flexure.

1268 New SUZ-4 Zeolite Membrane from Sol-Gel Technique

Authors: P. Worathanakul, P. Kongkachuichay

Abstract:

A new SUZ-4 zeolite membrane, with tetraethylammonium hydroxide as the template, was fabricated on a mullite tube via hydrothermal sol-gel synthesis in a rotating autoclave reactor. The suitable synthesis condition was a SiO2:Al2O3 ratio of 21.2, with crystallization for 4 days at 155 °C under autogenous pressure. The obtained SUZ-4 possessed a high BET surface area of 396.4 m2/g, a total pore volume of 2.611 cm3/g, and a narrow pore size distribution, with needle-shaped crystals of 97 nm mean diameter and 760 nm length. The SUZ-4 layer obtained from seeded crystallization was thicker than that obtained without seeds (in situ crystallization).

Keywords: Membrane, seeding, sol-gel, SUZ-4 Zeolite.

1267 Hiding Data in Images Using PCP

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media need to be transmitted conveniently over the network. Attacks, misuse, or unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data-hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. It is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver with no one except the authenticated receiver knowing of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of the gray-scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, then finding the 8-neighborhood of each selected pixel and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is made to determine whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
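
The following Python sketch illustrates the general idea described above, embedding message bits into the least significant bits of the 8-neighborhood of pixels chosen by a simple key-based selection function; the selection function and boundary handling are hypothetical stand-ins, not the authors' exact mapping.

    import numpy as np

    def embed(image, bits, key=7):
        """Sketch: pick pixel coordinates with a key-based function and write one
        message bit into the LSB of each of the pixel's 8 neighbours."""
        stego = image.copy()
        h, w = stego.shape
        neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
        bit_idx, n = 0, 0
        while bit_idx < len(bits):
            # Hypothetical selection function driven by the shared key.
            r = (key * (n + 1) ** 2) % h
            c = (key * 3 * (n + 1)) % w
            n += 1
            # Skip selections whose neighbourhood would leave the image boundary.
            if r in (0, h - 1) or c in (0, w - 1):
                continue
            for dr, dc in neighbours:
                if bit_idx >= len(bits):
                    break
                stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | bits[bit_idx]
                bit_idx += 1
        return stego

    cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    stego = embed(cover, [1, 0, 1, 1, 0, 0, 1, 0])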

Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.

1266 Heart Rate Variability Analysis for Early Stage Prediction of Sudden Cardiac Death

Authors: Reeta Devi, Hitender Kumar Tyagi, Dinesh Kumar

Abstract:

In the present scenario, cardiovascular problems are a growing challenge for researchers and physiologists. As heart disease has no geography-, gender- or socioeconomic-specific causes, detecting cardiac irregularities at an early stage, followed by quick and correct treatment, is very important. The electrocardiogram is the finest tool for continuous monitoring of heart activity. Heart rate variability (HRV) is used to measure the naturally occurring oscillations between consecutive cardiac cycles. Analysis of this variability is carried out using time-domain, frequency-domain and non-linear parameters. This paper presents an HRV analysis of an online dataset for normal sinus rhythm (taken as healthy subjects) and sudden cardiac death (SCD subjects) using all three methods, computing values for parameters such as the standard deviation of normal-to-normal intervals (SDNN), the root mean square of successive differences between adjacent RR intervals (RMSSD) and the mean of RR intervals (mean RR) in the time domain; very low frequency (VLF), low frequency (LF), high frequency (HF) and the ratio of low to high frequency (LF/HF ratio) in the frequency domain; and the Poincaré plot for non-linear analysis. To differentiate the HRV of healthy subjects from subjects who died of SCD, a k-nearest neighbor (k-NN) classifier has been used because of its high accuracy. Results show highly reduced values for all stated parameters for SCD subjects as compared to healthy ones. As the dataset used for SCD patients is a recording of their ECG signal one hour prior to death, it is therefore verified, with an accuracy of 95%, that the proposed algorithm can identify the mortality risk of a patient one hour before death. The identification of a patient's mortality risk at such an early stage may prevent sudden death if timely and correct treatment is given by the doctor.
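
As an illustration of the time-domain parameters mentioned above, the following Python sketch computes mean RR, SDNN and RMSSD from a short, hypothetical RR-interval series; the frequency-domain and Poincaré analyses are not reproduced here.

    import numpy as np

    def time_domain_hrv(rr_ms):
        """Time-domain HRV parameters from a series of RR intervals in milliseconds."""
        rr = np.asarray(rr_ms, dtype=float)
        diffs = np.diff(rr)
        return {
            "mean_RR": rr.mean(),                   # ms
            "SDNN": rr.std(ddof=1),                 # ms
            "RMSSD": np.sqrt(np.mean(diffs ** 2)),  # ms
        }

    # Hypothetical RR series (ms); real data would come from an annotated ECG record.
    print(time_domain_hrv([812, 830, 795, 805, 828, 840, 818, 802, 810, 825]))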

Keywords: Early stage prediction, heart rate variability, linear and non linear analysis, sudden cardiac death.

1265 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single factor of safety. Although it is known that geotechnical parameters can show great dispersion, in such analyses they are treated as fixed and known. The probabilistic methods, in turn, incorporate the variability of the key input parameters (random variables), resulting in a range of values of the safety factor and thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of the risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (First-Order, Second-Moment), Rosenblueth (Point Estimates) and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (FOSM, Rosenblueth and Monte Carlo) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and to perform the consequent risk analysis, which is used to calculate the risk and to study mitigation and control solutions. It can be observed that the results obtained by the three probabilistic methods were quite close. It should be noted that the calculation of the risk makes it possible to prioritize the implementation of mitigation measures. Therefore, it is recommended to carry out a good assessment of the geological-geotechnical model, incorporating the uncertainty in feasibility, design, construction, operation and closure by means of risk management.
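
As a minimal illustration of the Monte Carlo route to a probability of failure, the following Python sketch samples cohesion and friction angle for a simple infinite-slope factor-of-safety model; the geometry, distributions and parameter values are assumptions unrelated to the paper's hypothetical slope.

    import numpy as np

    def factor_of_safety(c, phi_deg, gamma=18.0, H=10.0, beta_deg=30.0):
        """Dry infinite-slope factor of safety, used only as an illustrative model."""
        beta, phi = np.radians(beta_deg), np.radians(phi_deg)
        return (c + gamma * H * np.cos(beta) ** 2 * np.tan(phi)) / (gamma * H * np.sin(beta) * np.cos(beta))

    rng = np.random.default_rng(1)
    n = 100_000
    c = rng.normal(15.0, 3.0, n)      # cohesion, kPa (assumed distribution)
    phi = rng.normal(28.0, 2.5, n)    # friction angle, degrees (assumed distribution)

    fs = factor_of_safety(c, phi)
    print(f"mean FS = {fs.mean():.2f}, probability of failure = {np.mean(fs < 1.0):.4f}")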

Keywords: Probabilistic methods, risk assessment, risk management, slope stability.

1264 A Study of Combined Mechanical and Chemical Stabilisation of Fine Grained Dredge Soil of River Jhelum

Authors: Adnan F. Sheikh, Fayaz A. Mir

Abstract:

After the recent devastating flood in Kashmir in 2014, dredging of the local water bodies, especially the Jhelum River, has become a priority for the government. Under the 'Comprehensive Flood Management Programme', the local government plans to increase the discharge of existing flood channels through removal of encroachments, acquisition of additional land, dredging and other works on the water bodies. The total quantity of soil to be dredged will be 16.15 lac cumecs. Dredged soil is a major by-product of the project and requires disposal/utilization. This study analyses the effect of cement and sand on the engineering properties of the soil. The tests were conducted with variable additions of sand (10%, 20% and 30%), whereas cement was added at 12%. Samples with the following compositions were also tested: soil-cement (12%) and soil-sand (30%). Laboratory experiments were conducted to determine the engineering characteristics of the soil, i.e., compaction, strength, and CBR characteristics. The strength characteristics of the soil were determined by the unconfined compressive strength test and the direct shear test. The unconfined compressive strength of the soil was tested both immediately and after a curing period of seven days. The CBR test was performed on unsoaked, soaked (worst condition, 4 days) and cured (4 days) samples.

Keywords: Comprehensive flood management programme, dredge soil, strength characteristics, flood.

1263 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In the smart environment, when multiple users share the living space, if different service requirements from different users arise, then the context-aware system will have conflicting situations for making decisions about providing services. Therefore, the purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is to resolve those service conflicts among users. This study proposes developing a decision-making methodology that uses “Event Agents” as its core. When the sensor system receives information, it evaluates a user’s current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. Moreover, when the event is not single but overlaps with another, conflicts arise. This study adopts the “Multiple Events Correlation Matrix” in order to calculate the degree values of incidents and support values for each object. The matrix uses these values as the basis for making inferences for system service, and to further determine appropriate services when there is a conflict.

Keywords: Internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity.

1262 A Study on Brushless DC Motor for High Torque Density

Authors: Jung-Moo Seo, Jung-Hwan Kim, Se-Hyun Rhyu, Jun-Hyuk Choi, In-Soung Jung

Abstract:

A brushless DC motor with high torque density and a slim topology for easy loading into a robot system is proposed and manufactured. The electromagnetic design is carried out using an equivalent magnetic circuit model and numerical analysis. The manufactured motor is tested, and its characteristics are verified by comparison with a conventional BLDC motor.

Keywords: Brushless DC motor, Robot joint module, Torque density, Pole/slot ratio

1261 Reclaiming Pedestrian Space from Car Dominated Neighborhoods

Authors: Andreas L. Savvides

Abstract:

For a long time, as a result of accommodating car traffic, planning ideologies put a low priority on public space, pedestrianism and the role of city space as a meeting place for urban dwellers. In addition, according to authors such as Jan Gehl, market forces and changing architectural perceptions began to shift the focus of planning practice from the integration of public space in various pockets around the contemporary city to individual buildings. Eventually, these buildings have become increasingly isolated and introverted and have turned their backs on the realm of the public space adjoining them. As a result of this practice, the traditional function of public space as a social forum for city dwellers has in many cases been reduced or even phased out. Author Jane Jacobs published her seminal book “The Death and Life of Great American Cities" more than fifty years ago, but her observations and predictions at the time still ring true today, where she pointed out how the dramatic increase in car traffic and its accommodation by the urban planning ideology brought about by the Modern movement has prompted a separation of the uses of the city. At the same time it emphasizes free-standing buildings that threaten urban space and city life and result in underutilized and lifeless urban cores. In this context, the aim of this paper is to showcase a reversal of just such a situation in the case of the Dasoupolis neighborhood in Strovolos, Cyprus, where enlightened urban design practice has seen the reclamation of pedestrian space in a car-dominated area.

Keywords: Urban Design, Public Space, Right to the City, Accessibility, Mobility

1260 Best Combination of Design Parameters for Buildings with Buckling-Restrained Braces

Authors: Ángel de J. López-Pérez, Sonia E. Ruiz, Vanessa A. Segovia

Abstract:

Buildings' vulnerability due to seismic activity has been studied extensively since the middle of the last century. As a solution to the structural and non-structural damage caused by intense ground motions, several seismic energy dissipating devices, such as buckling-restrained braces (BRB), have been proposed. BRB have been shown to be effective in concentrating a large portion of the energy transmitted to the structure by the seismic ground motion. A design approach for buildings with BRB elements, based on a seismic displacement-based formulation, has recently been proposed by the coauthors of this paper. It is a practical and easy design method which simplifies the work of structural engineers. The method is used here for the design of the structure-BRB damper system. The objective of the present study is to extend and apply a methodology to find the best combination of design parameters in multiple-degree-of-freedom (MDOF) structural frame-BRB systems, taking into account simultaneously: 1) initial costs, and 2) an adequate engineering demand parameter. The design parameters considered here are the stiffness ratio (α = Kframe/Ktotal) and the strength ratio (γ = Vdamper/Vtotal), where K represents structural stiffness and V structural strength, and the subscripts "frame", "damper" and "total" represent the structure without dampers, the BRB dampers and the total frame-damper system, respectively. The selection of the best combination of design parameters α and γ is based on an initial cost analysis and on the structural dynamic response of the structural frame-damper system. The methodology is applied to a 12-story 5-bay steel building with BRB located on the intermediate soil of Mexico City. The best combination of design parameters α and γ is found for the building with BRB under study.
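
The following Python sketch shows, in the simplest possible terms, how a best (α, γ) pair could be selected once a cost estimate and an engineering demand estimate are available for each candidate combination; the cost and demand functions and the equal weighting are placeholders, not the paper's models.

    # Candidate design parameters for the frame-BRB system (hypothetical grids).
    alphas = [0.4, 0.5, 0.6, 0.7]     # stiffness ratio K_frame / K_total
    gammas = [0.3, 0.4, 0.5, 0.6]     # strength ratio V_damper / V_total

    def initial_cost(alpha, gamma):
        """Placeholder cost model: stiffer frames and stronger dampers both cost more."""
        return 1.0 + 0.8 * alpha + 0.5 * gamma

    def demand(alpha, gamma):
        """Placeholder engineering demand parameter (e.g. normalised peak drift)."""
        return 1.5 - 0.6 * gamma - 0.3 * (1.0 - alpha)

    best = min(
        ((a, g) for a in alphas for g in gammas),
        key=lambda p: 0.5 * initial_cost(*p) + 0.5 * demand(*p),   # equal weights assumed
    )
    print(f"best combination: alpha = {best[0]}, gamma = {best[1]}")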

Keywords: Best combination of design parameters, BRB, buildings with energy dissipating devices, buckling-restrained braces, initial costs.

1259 Effect of Core Stability Exercises on Trunk Muscle Balance in Healthy Adult Individuals

Authors: Amira A. A. Abdallah, Amir A. Beltagi

Abstract:

Background: Core stability training has recently attracted attention for improving muscle balance and optimizing performance in healthy and unhealthy individuals. Purpose: This study investigated the effect of beginners' core stability exercises on the trunk flexors'/extensors' peak torque ratio and the trunk flexors' and extensors' peak torques. Methods: Thirty-five healthy individuals participated in the study. They were randomly assigned to two groups: experimental "group I, n=20" and control "group II, n=15". Their mean age, weight and height were 20.7±2.4 vs. 20.3±0.61 years, 66.5±12.1 vs. 68.57±12.2 kg, and 166.7±7.8 vs. 164.28±7.59 cm for group I vs. group II. Data were collected using the Biodex isokinetic system. The participants were tested twice, before and after a 6-week period during which group I performed a core stability training program. Results: The 2x2 mixed-design ANOVA revealed that there were no significant differences (p>0.025) in the trunk flexors'/extensors' peak torque ratio between the pre-test and post-test conditions for either group. Moreover, there were no significant differences (p>0.025) in the trunk flexion/extension ratios between the groups at either condition. However, the 2x2 mixed-design MANOVA revealed significant increases (p<0.025) in the trunk flexors' and extensors' peak torques in the post-test condition compared with the pre-test in group I, with no significant differences (p>0.025) in group II. Moreover, there was a significant increase (p<0.025) in the trunk flexors' peak torque only in group I compared with group II in the post-test condition, with no significant differences in the other conditions. Interpretation/Conclusion: The improvement in muscle performance indicated by the increase in the trunk flexors' and extensors' peak torques in the experimental group recommends including core stability training in exercise programs that aim to improve muscle performance.

Keywords: Core Stability, Isokinetic, Trunk Muscles.

1258 A Novel Multiplex Real-Time PCR Assay Using TaqMan MGB Probes for Rapid Detection of Trisomy 21

Authors: Mehrdad Hashemi, Mitra Behrooz Aghdam, Reza Mahdian, Ahmad Reza Kamyab

Abstract:

Cytogenetic analysis still remains the gold standard method for prenatal diagnosis of trisomy 21 (Down syndrome, DS). Nevertheless, conventional cytogenetic analysis needs live cultured cells and is too time-consuming for clinical application. In contrast, molecular methods such as FISH, QF-PCR, MLPA, and quantitative real-time PCR are rapid assays with results available within 24 h. In the present study, we have successfully used a novel MGB TaqMan probe-based real-time PCR assay for rapid diagnosis of trisomy 21 status in Down syndrome samples. We have also compared the results of this molecular method with the corresponding results obtained by cytogenetic analysis. Blood samples obtained from DS patients (n=25) and normal controls (n=20) were tested by quantitative real-time PCR in parallel to standard G-banding analysis. Genomic DNA was extracted from peripheral blood lymphocytes. A high-precision TaqMan probe quantitative real-time PCR assay was developed to determine the gene dosage of DSCAM (target gene on 21q22.2) relative to PMP22 (reference gene on 17p11.2). The DSCAM/PMP22 ratio was calculated according to the formula ratio = 2^(-ΔΔCt). The quantitative real-time PCR was able to distinguish between trisomy 21 samples and normal controls, with gene ratios of 1.49±0.13 and 1.03±0.04 respectively (p value < 0.001). These results represent the presence of 3 copies of the target gene in DS samples vs. 2 copies in normal controls. The results of quantitative real-time PCR were in complete agreement with the results of cytogenetic analysis. This study confirms previous reports regarding the successful implementation of quantitative real-time PCR for detection of trisomy 21. However, the assay has been improved by using MGB probes and more accurate data analysis. This assay, in particular when performed in combination with another molecular assay such as QF-PCR or MLPA, can be used as a reliable technique for rapid prenatal diagnosis of trisomy 21.
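
For readers unfamiliar with the relative quantification step, the following Python sketch applies the 2^(-ΔΔCt) formula to hypothetical Ct values for the DSCAM/PMP22 pair; the numbers are illustrative only.

    def gene_dosage_ratio(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
        """2^(-ddCt) method: dCt = Ct(target) - Ct(reference); ddCt = dCt(sample) - dCt(control)."""
        dd_ct = (ct_target_sample - ct_ref_sample) - (ct_target_control - ct_ref_control)
        return 2.0 ** (-dd_ct)

    # Hypothetical Ct values: DSCAM (target, 21q22.2) vs. PMP22 (reference, 17p11.2).
    ratio = gene_dosage_ratio(24.1, 25.0, 24.9, 25.2)
    print(f"DSCAM/PMP22 dosage ratio = {ratio:.2f}")   # a ratio near 1.5 suggests three copies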

Keywords: Trisomy 21, Real-time PCR, MGB-TaqMan Probes, Gene Dosage.

1257 Mitigation of Radiation Levels for Base Transceiver Stations based on ITU-T Recommendation K.70

Authors: Reyes C., Ramos B.

Abstract:

This paper presents applicable methods, based on ITU-T recommendation K.70, to reduce human exposure levels in the area around base transceiver stations in an environment with multiple sources. An example is presented to explain the mitigation techniques and their results, and also to show how they can be applied, especially in developing countries where there is not much research on non-ionizing radiation.
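
ITU-T K.70 assesses sites with multiple sources through a cumulative exposure ratio, i.e. the sum of each source's contribution relative to its frequency-dependent limit; the following Python sketch computes such a ratio from assumed power densities and limits, which are placeholders rather than the actual K.70 reference levels.

    # Hypothetical sources near a BTS site: measured power density vs. the applicable
    # limit at each frequency (all values are placeholders, W/m^2).
    sources = [
        {"name": "GSM 900 sector", "S": 0.80, "S_limit": 4.5},
        {"name": "GSM 1800 sector", "S": 1.10, "S_limit": 9.0},
        {"name": "FM broadcast", "S": 0.05, "S_limit": 2.0},
    ]

    total_exposure_ratio = sum(s["S"] / s["S_limit"] for s in sources)
    print(f"cumulative exposure ratio = {total_exposure_ratio:.3f}")
    print("compliant" if total_exposure_ratio <= 1.0 else "mitigation required")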

Keywords: Electromagnetic fields (EMF), human exposure limits, intentional radiator, cumulative exposure ratio, base transceiver station (BTS), radiation levels.

1256 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach for maintaining the security of communication systems relies on physical layer security mechanisms. However, in wireless time division duplex systems, where the uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, whose aim is to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each of which belongs to a constellation shifted from the original N-PSK symbols by a certain number of degrees. In this paper, the legitimate pilots' offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than on their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations for the shifted constellations makes such a type of attack difficult to mount successfully. For this reason, the number of optimal shift value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from signals in other cells should also be taken into account. Therefore, the inter-cell interference impact on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK decreases inversely with the signal-to-interference-plus-noise ratio.

Keywords: Channel estimation, inter-cell interference, pilot contamination attacks, wireless communications.

1255 Shear Strength Characteristics of Sand-Particulate Rubber Mixture

Authors: Firas Daghistani, Hossam Abuel Naga

Abstract:

Waste tyres are an ongoing global problem with a negative effect on the environment. Waste tyres are discarded in stockpiles, where they harm the environment in many ways. Finding applications for these materials can help in reducing this global problem. One of these applications is recycling these waste materials and using them in geotechnical engineering. Recycled waste tyre particulates can be mixed with sand to form a lightweight material with varying shear strength characteristics. This research further investigates whether the addition of particulate rubber to sand increases or decreases the shear strength characteristics of the mixture. For the experiment, a series of direct shear tests was performed on a poorly graded sand with a mean particle size of 0.32 mm mixed with recycled, poorly graded particulate rubber with a mean particle size of 0.51 mm. The shear tests were performed at four normal stresses (30, 55, 105, and 200 kPa) at a shear rate of 1 mm/minute. Different percentages of particulate rubber content were used in the mixture, i.e., 10%, 20%, 30% and 50% of the sand's dry weight, at three density states, namely loose, slightly dense, and dense. The size ratio of the mixture, which is the mean particle size of the particulate rubber divided by the mean particle size of the sand, was 1.59. The results identified multiple parameters that can influence the shear strength of the mixture: normal stress, particulate rubber content, mixture gradation, mixture size ratio, and the mixture's density. The addition of particulate rubber to sand showed a decrease in the internal friction angle and an increase in the apparent cohesion. Overall, the inclusion of particulate rubber did not have a significant influence on the shear strength of the mixture. For all the dense states at the low normal stresses of 30 and 55 kPa, the inclusion of particulate rubber showed a slight increase in the shear strength, with the peak at 20-30% rubber content of the sand's dry weight. On the other hand, at the high normal stresses of 105 and 200 kPa, there was a slight decrease in the shear strength.
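
As a small worked illustration of how the reported quantities relate, the following Python sketch fits a Mohr-Coulomb envelope (apparent cohesion and friction angle) to peak shear stresses at the four normal stresses used in the tests, and recomputes the size ratio from the two mean particle sizes; the shear-stress values are placeholders, not the measured data.

    import numpy as np

    # Placeholder peak shear stresses (kPa) at the four normal stresses (kPa).
    normal_stress = np.array([30.0, 55.0, 105.0, 200.0])
    shear_stress = np.array([24.0, 41.0, 74.0, 136.0])

    # Linear Mohr-Coulomb fit: tau = c + sigma_n * tan(phi)
    slope, intercept = np.polyfit(normal_stress, shear_stress, 1)
    phi = np.degrees(np.arctan(slope))
    print(f"apparent cohesion c = {intercept:.1f} kPa, friction angle phi = {phi:.1f} deg")

    # Size ratio of the mixture: D50 of rubber / D50 of sand (from the abstract).
    print(f"size ratio = {0.51 / 0.32:.2f}")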

Keywords: Direct shear, granular material, sand-rubber mixture, shear strength, waste material.

1254 Union is Strength in Lossy Image Compression

Authors: Mario Mastriani

Abstract:

In this work, we present a comparison between different techniques of image compression. First, the image is divided into blocks, which are organized according to a certain scan. Then, several compression techniques are applied, combined or alone. Such techniques are: wavelets (Haar's basis), the Karhunen-Loève Transform, etc. Simulations show that the combined versions are the best, with lower Mean Squared Error (MSE), higher Peak Signal-to-Noise Ratio (PSNR) and better image quality, even in the presence of noise.
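
For reference, the following Python sketch shows how the two quality metrics quoted above, MSE and PSNR, are computed for an 8-bit image and a distorted copy; the random test image is used only to keep the snippet self-contained.

    import numpy as np

    def mse(original, reconstructed):
        """Mean squared error between two images of the same shape."""
        diff = original.astype(float) - reconstructed.astype(float)
        return float(np.mean(diff ** 2))

    def psnr(original, reconstructed, peak=255.0):
        """Peak signal-to-noise ratio in dB for 8-bit images."""
        e = mse(original, reconstructed)
        return float("inf") if e == 0 else 10.0 * np.log10(peak ** 2 / e)

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (64, 64))
    noisy = np.clip(img + rng.normal(0.0, 5.0, img.shape), 0, 255)
    print(f"MSE = {mse(img, noisy):.2f}, PSNR = {psnr(img, noisy):.2f} dB")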

Keywords: Haar's basis, Image compression, Karhunen-Loève Transform, Morton's scan, row-rafter scan.

1253 The New Relative Efficiency Based on the Least Eigenvalue in Generalized Linear Model

Authors: Chao Yuan, Bao Guang Tian

Abstract:

A new relative efficiency, comparing the LSE and the BLUE in the generalized linear model, is defined. The relative efficiency is based on the ratio of the least eigenvalues. In this paper, we discuss its lower bound and the relationship between it and the generalized relative coefficient. Finally, the paper proves that, to some degree, the new estimator is better under the Stein function and a special condition.

Keywords: Generalized linear model, generalized relative coefficient, least eigenvalue, relative efficiency.
