Search results for: empirical orthogonal functions
4672 Dual-Rail Logic Unit in Double Pass Transistor Logic
Authors: Hamdi Belgacem, Fradi Aymen
Abstract:
In this paper we present a low power, low cost differential logic unit (LU). The proposed LU receives dual-rail inputs and generates dual-rail outputs. The proposed circuit can be used in the Arithmetic and Logic Unit (ALU) of a processor. It can also be dedicated to self-checking applications based on the dual duplication code. Four logic functions as well as their inverses are implemented within a single Logic Unit. The hardware overhead for the implementation of the proposed LU is lower than that required for a standard LU implemented in the standard CMOS logic style. This new implementation is attractive as fewer transistors are required to implement important logic functions. The proposed differential logic unit can perform 8 Boolean logical operations using only 16 transistors. Spice simulations using a 32 nm technology were used to evaluate the performance of the proposed circuit and to prove its acceptable electrical behaviour.
Keywords: differential logic unit, double pass transistor logic, low power CMOS design, low cost CMOS design
Procedia PDF Downloads 452

4671 Safety Effect of Smart Right-Turn Design at Intersections
Authors: Upal Barua
Abstract:
The risk of severe crashes at high-speed right turns at intersections is a major safety concern these days. The application of smart right-turns at intersections is increasing day by day to address this issue. The ‘Smart Right-turn’ design consists of a narrow angle of channelization at approximately 70°. This design increases the cone of vision of right-turning drivers towards crossing pedestrians as well as traffic on the cross-road. As part of the Safety Improvement Program in the Austin Transportation Department, several smart right-turns were constructed at high-crash intersections where high-speed right-turns were found to be a contributing factor. This paper features the state-of-the-art techniques applied in the planning, engineering, design and construction of these smart right-turns, the key factors driving their success, and lessons learned in the process. This paper also presents the significant crash reductions achieved from the application of this smart right-turn design, estimated using the Empirical Bayes method. The results showed that smart right-turns can reduce overall right-turn crashes by 43% and severe right-turn crashes by 70%.
Keywords: smart right-turn, intersection, cone of vision, empirical Bayes method
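The Empirical Bayes before-after method cited in this abstract weights a site's observed crash count against the count predicted by a safety performance function (SPF), which guards against regression-to-the-mean bias. The sketch below shows the standard weighting; the function names and the overdispersion value in the test are illustrative assumptions, not the paper's actual calculation:

```python
def eb_expected(spf_pred, observed, overdispersion):
    """Empirical Bayes estimate of a site's expected crash count.

    spf_pred: crashes per period predicted by the safety performance function
    observed: crashes recorded at the site in the before period
    overdispersion: negative-binomial overdispersion parameter of the SPF
    """
    w = 1.0 / (1.0 + spf_pred / overdispersion)  # weight given to the SPF
    return w * spf_pred + (1.0 - w) * observed

def percent_reduction(expected_before, observed_after):
    """Crash reduction relative to the EB-expected count."""
    return 100.0 * (expected_before - observed_after) / expected_before
```

With this weighting, a site whose observed count happens to equal the SPF prediction keeps that value, while an unusually high observed count is pulled back toward the SPF mean before the before/after comparison is made.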
Procedia PDF Downloads 266

4670 Flocking Swarm of Robots Using Artificial Innate Immune System
Authors: Muneeb Ahmad, Ali Raza
Abstract:
A computational method inspired by the immune system (IS) is presented, leveraging the robustness, fault tolerance, scalability, and adaptability that it shares with swarm intelligence. This method aims to showcase flocking behaviors in a swarm of robots (SR). The innate part of the IS offers a variety of reactive and probabilistic cell functions alongside its self-regulation mechanism, which have been translated to enable swarming behaviors. Although the research focuses specifically on flocking behaviors in a variety of simulated environments using e-puck robots in a physics-based simulator (CoppeliaSim), the artificial innate immune system (AIIS) can exhibit other swarm behaviors as well. The effectiveness of the immuno-inspired approach has been established with extensive experimentation, for scalability and adaptability, using standard swarm benchmarks as well as the immunological regulatory functions (i.e., Dendritic Cells’ Maturity and Inflammation). The AIIS-based approach has proved to be a scalable and adaptive solution for emulating the flocking behavior of SR.
Keywords: artificial innate immune system, flocking swarm, immune system, swarm intelligence
Procedia PDF Downloads 106

4669 The Effectiveness of Cash Flow Management by SMEs in the Mafikeng Local Municipality of South Africa
Authors: Ateba Benedict Belobo, Faan Pelser, Ambe Marcus
Abstract:
Aims: This study arises from repeated complaints, received via electronic mail, about the underperformance of Mafikeng small and medium-sized enterprises after the global financial crisis. The authors were of the view that this poor performance could be a result of negative effects on the cash flow of these businesses due to volatilities in the business environment in general prior to the global crisis. Thus, the paper was mainly aimed at determining the shortcomings experienced by these SMEs with regard to cash flow management. It was also aimed at suggesting possible measures to improve the cash flow management of these SMEs in this tough time. Methods: A case study was conducted on 3 beverage suppliers, 27 bottle stores, the 3 largest fast-moving consumer goods supermarkets and 7 automobile enterprises in the Mafikeng local municipality. A mixed-method research design was employed, and purposive sampling was used in selecting the SMEs that participated. The views and experiences of participants were captured through in-depth interviews. Data from the empirical investigation were interpreted using open coding and a simple percentage formula. Results: Findings from the empirical research reflected that the majority of Mafikeng SMEs suffered poor operational performance prior to the global financial crisis, primarily as a result of poor cash flow management. However, the empirical outcome also indicated other secondary factors contributing to this poor operational performance. Conclusion: Finally, the authors proposed possible measures that could be used to improve cash flow management and to address other factors affecting the operational performance of SMEs in the Mafikeng local municipality in order to achieve better business performance.
Keywords: cash flow, business performance, global financial crisis, SMEs
Procedia PDF Downloads 439

4668 Impact of the Currency Devaluation on Contractors in Egypt
Authors: Mariam Zahwy, Waleed El Nemr, A. Samer Ezeldin
Abstract:
In 2016, the depreciation of the Egyptian pound (EGP) had a substantial impact on Egypt's construction industry. Studies assessing this influence, though, are scarce. This study measures the impact of devaluation on contractors using empirical data. The difficulties contractors face as a result of rising import material costs, limited financing alternatives, and inflationary pressures are determined by analyzing survey responses from contractors and industry experts. The approaches contractors utilize to lessen the impact of devaluation are also examined. The survey results show how currency depreciation directly affects contractors in the Egyptian construction industry in terms of financial consequences. Inflationary pressures, fewer financing alternatives, and rising expenses have all affected contractors. To minimize losses, contractors have nonetheless put a number of tactics into practice. These findings highlight the importance of understanding and managing the impact of devaluation on the construction industry to ensure its resilience and development.
Keywords: construction, devaluation, contractors, material costs, inflationary pressures, empirical data, quantitative research
Procedia PDF Downloads 25

4667 The Revenue Management Implementation and Its Complexity in the Airline Industry: An Empirical Study on the Egyptian Airline Industry
Authors: Amr Sultan, Sara Elgazzar, Breksal Elmiligy
Abstract:
The airline industry nowadays is a growing industry facing severe competition. In this context, utilizing the revenue management (RM) concept and practice to develop pricing strategy is an influential issue. RM is needed to assist airlines and their associates in reducing costs and recovering revenue, which in turn will boost airline industry performance. The complexity of RM imposes enormous challenges on the airline industry. Several studies have addressed RM adoption in the airline industry, while there is limited work on implementing RM and its complexity in developing countries such as Egypt. This research presents a schema for the implementation of RM in the Egyptian airline industry. The research aims at investigating and demonstrating the complexities faced in implementing RM in the airline industry, upon which the research provides a comprehensive understanding of how to overcome these complexities while adopting RM in the Egyptian airline industry. An empirical study was conducted on the Egyptian airline sector based on a sample of four airlines (Egyptair, Britishair, KLM, and Lufthansa). The empirical study was conducted using a mix of qualitative and quantitative approaches. First, in-depth interviews were carried out to analyze the status of the Egyptian airline sector and the main challenges faced by the airlines. Then, a structured survey of the three different parties of the airline industry (airlines, airfreight forwarders, and passengers) was conducted in order to investigate the main complexity factors from the different parties' points of view. Finally, a focus group was conducted to develop a best-practice framework for overcoming the complexities facing RM adoption in the Egyptian airline industry.
The research provides an original contribution to knowledge by creating a framework to overcome the complexities and challenges in adopting RM in the airline industry generally and the Egyptian airline industry particularly. The framework can be used as an RM tool to increase the effectiveness and efficiency of the Egyptian airline industry's performance.
Keywords: revenue management, airline industry, revenue management complexity, Egyptian airline industry
Procedia PDF Downloads 405

4666 A Study of Evolutional Control Systems
Authors: Ti-Jun Xiao, Zhe Xu
Abstract:
Controllability is one of the fundamental issues in control systems. In this paper, we study the controllability of second order evolutional control systems in Hilbert spaces with memory and boundary controls, which model dynamic behaviors of some viscoelastic materials. Transferring the control problem into a moment problem and showing the Riesz property of a family of functions related to Cauchy problems for some integrodifferential equations, we obtain a general boundary controllability theorem for these second order evolutional control systems. This controllability theorem is applicable to various concrete 1D viscoelastic systems and recovers some previous related results. It is worth noting that Riesz sequences can be used for numerical computations of the control functions, and the identification of new Riesz sequences is of independent interest for basis-function theory. Moreover, using the Riesz sequences, we obtain the existence and uniqueness of (weak) solutions to these second order evolutional control systems in Hilbert spaces. Finally, we derive the exact boundary controllability of a viscoelastic beam equation, as an application of our abstract theorem.
Keywords: evolutional control system, controllability, boundary control, existence and uniqueness
Procedia PDF Downloads 224

4665 Finding the Elastic Field in an Arbitrary Anisotropic Media by Implementing Accurate Generalized Gaussian Quadrature Solution
Authors: Hossein Kabir, Amir Hossein Hassanpour Mati-Kolaie
Abstract:
In the current study, the elastic field in an anisotropic elastic medium is determined by implementing a general semi-analytical method. In this methodology, the displacement field is computed as a sum of finite functions with unknown coefficients. These functions satisfy exactly both the homogeneous and inhomogeneous boundary conditions in the proposed medium. The unknown coefficients are determined by applying the principle of minimum potential energy. The numerical integration is performed by employing the Generalized Gaussian Quadrature solution. Furthermore, with the aid of the calculated unknown coefficients, the displacement field, as well as the other parameters of the elastic field, is obtainable. Finally, a comparison of the previous analytical method with the current semi-analytical method demonstrates the efficacy of the present methodology.
Keywords: anisotropic elastic media, semi-analytical method, elastic field, generalized Gaussian quadrature solution
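As background for the numerical integration step, the idea behind a standard Gauss-Legendre rule can be sketched as follows (the generalized Gaussian quadrature used in the paper extends this idea to broader classes of integrands; the helper name and interval mapping here are illustrative):

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n):
    """Approximate the integral of f over [a, b] with an n-point
    Gauss-Legendre rule; exact for polynomials of degree <= 2n - 1."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    # affine map of the reference nodes from [-1, 1] onto [a, b]
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
    return 0.5 * (b - a) * np.sum(weights * f(x))
```

With only a handful of nodes, such a rule integrates smooth integrands to near machine precision, which is why quadrature of this family is attractive for energy-minimization formulations.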
Procedia PDF Downloads 321

4664 Experimental Investigation on Over-Cut in Ultrasonic Machining of WC-Co Composite
Authors: Ravinder Kataria, Jatinder Kumar, B. S. Pabla
Abstract:
Ultrasonic machining is one of the most widely used non-traditional machining processes for materials that are relatively brittle, hard, and fragile, such as advanced ceramics, refractories, crystals, quartz, etc. The present article investigates the impact of different experimental conditions (power rating, cobalt content, tool material, thickness of work piece, tool geometry, and abrasive grit size) on over-cut in ultrasonic drilling of WC-Co composite material. Taguchi’s L-36 orthogonal array has been employed for conducting the experiments. Significant factors have been identified using the analysis of variance (ANOVA) test. The experimental results revealed that abrasive grit size and tool material are the most significant factors for over-cut.
Keywords: ANOVA, abrasive grit size, Taguchi, WC-Co, ultrasonic machining
Procedia PDF Downloads 398

4663 Simplified Stress Gradient Method for Stress-Intensity Factor Determination
Authors: Jeries J. Abou-Hanna
Abstract:
Several techniques exist for determining stress-intensity factors in linear elastic fracture mechanics analysis. These techniques are based on analytical, numerical, and empirical approaches that have been well documented in the literature and engineering handbooks. However, not all techniques share the same merit. Overly conservative results, numerical methods that require extensive computational effort, and methods requiring copious user parameters all hinder practicing engineers from efficiently evaluating stress-intensity factors. This paper investigates the prospects of reducing the complexity and the number of required variables for determining stress-intensity factors through the utilization of the stress gradient and a weighting function. The heart of this work resides in the understanding that fracture emanating from stress concentration locations cannot be explained by a single maximum stress value, but requires use of a critical volume in which the crack exists. To assess the effectiveness of this technique, this study investigated components of different notch geometry and varying levels of stress gradient. Two forms of weighting functions were employed to determine stress-intensity factors, and results were compared to exact analytical methods. The results indicated that the “exponential” weighting function was superior to the “absolute” weighting function. An error band of +/-10% was met for cases ranging from the steep stress gradient of a sharp v-notch to the less severe stress transitions of a large circular notch. The incorporation of the proposed method has been shown to be a worthwhile consideration.
Keywords: fracture mechanics, finite element method, stress intensity factor, stress gradient
Procedia PDF Downloads 135

4662 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping
Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa
Abstract:
The artificial neural network is one of the interesting techniques that have been advantageously used to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping, which relies on increasing the number of neuron units in the last layer. Accordingly, to show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the neuron units in the last layer allows finding the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation process, which avoids the use of computers with high memory usage.
Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories
Procedia PDF Downloads 283

4661 Labview-Based System for Fiber Links Events Detection
Authors: Bo Liu, Qingshan Kong, Weiqing Huang
Abstract:
With the rapid development of modern communication, diagnosing fiber-optic quality and faults in real time has attracted wide attention. In this paper, a Labview-based system is proposed for fiber-optic fault detection. The wavelet threshold denoising method, combined with Empirical Mode Decomposition (EMD), is applied to denoise the optical time domain reflectometer (OTDR) signal. Then a method based on the Gabor representation is used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average, compared with using the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy. The maximum detectable fiber length of the proposed Labview-based system is 65 km.
Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising
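To illustrate the wavelet-threshold idea behind the denoising stage, here is a minimal one-level Haar soft-thresholding sketch in plain NumPy. The paper's system presumably uses deeper multi-level decompositions combined with EMD; the function name and threshold choice here are illustrative only:

```python
import numpy as np

def haar_soft_denoise(signal, threshold):
    """One-level Haar wavelet decomposition, soft-threshold the detail
    (high-frequency) coefficients where most noise lives, then invert."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    if n % 2:                      # pad odd-length input to even length
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass band
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass band
    # soft thresholding: shrink coefficients toward zero by `threshold`
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty(2 * len(approx))
    out[0::2] = (approx + detail) / np.sqrt(2.0)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out[:n]
```

A zero threshold reproduces the input exactly (the transform is orthogonal), while a threshold above the noise level in the detail band suppresses high-frequency noise without touching the slowly varying backscatter trace.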
Procedia PDF Downloads 123

4660 The Adoption of Process Management for Accounting Information Systems in Thailand
Authors: Manirath Wongsim
Abstract:
Information Quality (IQ) has become a critical, strategic issue in Accounting Information Systems (AIS) adoption. In order to implement AIS adoption successfully, it is important to consider the quality of information use throughout the adoption process, which seriously impacts the effectiveness of AIS adoption practice and the optimization of AIS adoption decisions. There is a growing need for research to provide insights into issues and solutions related to IQ in AIS adoption. The need for an integrated approach to improve IQ in AIS adoption, as well as the unique characteristics of accounting data, demands an AIS-adoption-specific IQ framework. This research explores ways of managing information quality and AIS adoption in order to investigate the relationship between IQ issues and the AIS adoption process. The study has led to the development of a framework for understanding IQ management in AIS adoption. The research was conducted with 44 respondents from ten manufacturing organisations in Thailand. The empirical evidence suggests that IQ dimensions in AIS adoption provide assistance in all processes of decision making. This research provides empirical evidence that the information quality of AIS adoption affects decision making and suggests that these variables should be considered when adopting AIS in order to improve its effectiveness.
Keywords: information quality, information quality dimensions, accounting information systems, accounting information system adoption
Procedia PDF Downloads 467

4659 Non-Standard Monetary Policy Measures and Their Consequences
Authors: Aleksandra Nocoń (Szunke)
Abstract:
The study is a review of the literature concerning the consequences of the non-standard monetary policy that central banks use during unconventional periods, when instability threatens the banking sector. In particular, attention is paid to the effects of non-standard monetary policy tools on financial markets. However, the empirical evidence about their effects and real consequences for financial markets is still not conclusive. The main aim of the study is to survey the consequences of standard and non-standard monetary policy instruments implemented during the global financial crisis in the United States, the United Kingdom and Euroland, with particular attention to the results for the stabilization of global financial markets. The study analyses the consequences for short- and long-term market interest rates, interbank interest rates and the LIBOR-OIS spread. The study consists mainly of an empirical review indicating the impact of the implementation of these tools on financial markets. The following research methods were used: literature studies, including domestic and foreign literature, cause-and-effect analysis and statistical analysis.
Keywords: asset purchase facility, consequences of monetary policy instruments, non-standard monetary policy, quantitative easing
Procedia PDF Downloads 332

4658 Secure Cryptographic Operations on SIM Card for Mobile Financial Services
Authors: Kerem Ok, Serafettin Senturk, Serdar Aktas, Cem Cevikbas
Abstract:
Mobile technology is very popular nowadays, and it provides a digital world where users can experience many value-added services. Service providers are also eager to offer diverse value-added services to users, such as digital identity, mobile financial services and so on. In this context, the security of data storage in smartphones and the security of communication between the smartphone and the service provider are critical for the success of these services. In order to provide the required security functions, the SIM card is one acceptable alternative. Since SIM cards include a Secure Element, they are able to store sensitive data, create cryptographically secure keys, and encrypt and decrypt data. In this paper, we design and implement a SIM and smartphone framework that uses a SIM card for secure key generation, key storage, data encryption, data decryption and digital signing for mobile financial services. Our framework shows that the SIM card can be used as a controlled Secure Element to provide the required security functions for popular e-services such as mobile financial services.
Keywords: SIM card, mobile financial services, cryptography, secure data storage
Procedia PDF Downloads 312

4657 Fast Algorithm to Determine Initial Tsunami Wave Shape at Source
Authors: Alexander P. Vazhenin, Mikhail M. Lavrentiev, Alexey A. Romanenko, Pavel V. Tatarintsev
Abstract:
One of the problems obstructing effective tsunami modelling is the lack of information about the initial wave shape at the source. The existing methods (geological surveys, sea radars, satellite images) contain an important element of uncertainty. Therefore, direct measurements of tsunami waves obtained by deep-water bottom pressure recorders are also used. In this paper we propose a new method to reconstruct the initial sea surface displacement at the tsunami source by approximating the measured signal (marigram) with a linear combination of synthetic marigrams from a selected set of unit sources, calculated in advance. This method has demonstrated good precision and very high performance. The mathematical model and the results of numerical tests are described here.
Keywords: numerical tests, orthogonal decomposition, tsunami initial sea surface displacement
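The approximation step described above, expressing the measured marigram as a linear combination of precomputed unit-source marigrams, reduces to a linear least-squares problem. A minimal sketch follows; the function name and the synthetic signals in the test are illustrative, and the paper's actual unit-source set comes from precomputed tsunami simulations:

```python
import numpy as np

def fit_unit_sources(unit_marigrams, measured):
    """Least-squares coefficients c minimizing ||G c - m||, where each
    column of G is a synthetic marigram from one unit source and m is
    the measured marigram; c then weights the unit sources in the
    reconstructed initial sea surface displacement."""
    G = np.column_stack(unit_marigrams)
    c, *_ = np.linalg.lstsq(G, np.asarray(measured, dtype=float), rcond=None)
    return c
```

Because the synthetic marigrams are computed in advance, the only run-time cost is this small linear solve, which is what makes the approach fast enough for operational use.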
Procedia PDF Downloads 469

4656 Artificial Intelligent Methodology for Liquid Propellant Engine Design Optimization
Authors: Hassan Naseh, Javad Roozgard
Abstract:
This paper presents a methodology based on Artificial Intelligence (AI) applied to Liquid Propellant Engine (LPE) optimization. The AI methodology utilizes the Adaptive Neuro-Fuzzy Inference System (ANFIS). In this methodology, the objective is to achieve maximum performance (specific impulse). The independent design variables in the ANFIS modeling are combustion chamber pressure and temperature and the oxidizer-to-fuel ratio; the output of the modeling is the specific impulse, which can be combined with other objective functions in LPE design optimization. To this end, the LPE parameters have been modeled in the ANFIS methodology by generating fuzzy inference system structures using grid partitioning, subtractive clustering and Fuzzy C-Means (FCM) clustering, for both inference types (Mamdani and Sugeno) and various types of membership functions. The final comparison of optimization results shows the accuracy and processing run time of the Gaussian ANFIS methodology against all other methods.
Keywords: ANFIS methodology, artificial intelligence, liquid propellant engine, optimization
Procedia PDF Downloads 590

4655 Dynamic Reroute Modeling for Emergency Evacuation: Case Study of Brunswick City, Germany
Authors: Yun-Pang Flötteröd, Jakob Erdmann
Abstract:
Human behaviors during evacuations are quite complex. One of the critical behaviors affecting the efficiency of an evacuation is route choice. Therefore, the respective simulation modeling needs to function properly. In this paper, the current dynamic route modeling during evacuation in Simulation of Urban Mobility (SUMO), i.e. the rerouting functions, is examined with a real case study, and the consistency of the simulation results with reality is checked as well. Four influence factors, (1) time to get information, (2) probability of canceling a trip, (3) probability of using navigation equipment, and (4) rerouting and information updating period, are considered in order to analyze possible traffic impacts during the evacuation and to examine the rerouting functions in SUMO. Furthermore, some behavioral characteristics of the case study are analyzed with use of the corresponding detector data and applied in the simulation. The experimental results show that the dynamic route modeling in SUMO can deal with the proposed scenarios properly. Some issues and functional needs related to route choice are discussed, and further improvements are suggested.
Keywords: evacuation, microscopic traffic simulation, rerouting, SUMO
Procedia PDF Downloads 194

4654 Technological Enhancements in Supply Chain Management Post COVID-19
Authors: Miran Ismail
Abstract:
COVID-19 has caused widespread disruption in all economic sectors and industries around the world. The COVID-19 lockdown measures have resulted in production halts, restrictions on the movement of persons and goods, border closures, logistical constraints, and a slowdown in trade and economic activity. The main subject of this paper is leveraging technology to manage the supply chain effectively and efficiently through the use of artificial intelligence. The research methodology is based on empirical data collected through a questionnaire survey. One of the approaches utilized is a case study of industrial organizations that face obstacles such as high operational costs, large inventory levels, a lack of well-established supplier relationships, human behavior, and system issues. The main contribution of this research to the body of knowledge is its empirical insights into supply chain sustainability performance measurement. The results provide guidelines for the selection of advanced technologies to support supply chain processes and for the design of sustainable performance measurement systems.
Keywords: information technology, artificial intelligence, supply chain management, industrial organizations
Procedia PDF Downloads 125

4653 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach
Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier
Abstract:
The main purpose of time series analysis is to learn about the dynamics behind some time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observation data sequences. The first concerns time series decomposition, an important analysis step in which patterns and behaviors are extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components that are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space by mapping the time series into a set of Rᵐ lag vectors using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-banded components. The method originates from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to constitute a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We will first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method will then be described in detail and applied to experimental time series of total ozone column measurements.
The results obtained will be compared with those provided by the previously mentioned decomposition methods. We will also compare the reconstruction quality of the observed dynamics obtained from the SSD and MOD methods.
Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis
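The embedding-plus-decomposition step that SSD shares with SSA can be sketched in a few lines: build a trajectory (Hankel) matrix from lagged copies of the series, take its SVD, and map each rank-one term back to a series by anti-diagonal averaging. The window length and component count below are illustrative choices, not the paper's settings:

```python
import numpy as np

def ssa_components(series, window, n_components):
    """Elementary singular spectrum analysis decomposition of a 1-D series."""
    x = np.asarray(series, dtype=float)
    k = len(x) - window + 1
    # trajectory matrix: column j holds the lag vector x[j : j + window]
    traj = np.column_stack([x[j:j + window] for j in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for i in range(n_components):
        elem = s[i] * np.outer(u[:, i], vt[i])     # rank-one elementary matrix
        # anti-diagonal averaging (Hankelization) maps it back to a series
        comp = np.array([elem[::-1].diagonal(t - window + 1).mean()
                         for t in range(len(x))])
        comps.append(comp)
    return comps
```

Since the trajectory matrix is exactly Hankel and the averaging is linear, keeping all components reconstructs the original series; keeping only the leading ones yields the narrow-banded components that SSD then refines adaptively.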
Procedia PDF Downloads 106

4652 Pragmatic Discoursal Study of Hedging Constructions in English Language
Authors: Mohammed Hussein Ahmed, Bahar Mohammed Kareem
Abstract:
This study is concerned with a pragmatic discoursal study of hedging constructions in the English language. A hedge is a mitigating word used to lessen the impact of an utterance. Hedges can be adverbs, adjectives, or verbs, and sometimes consist of whole clauses. The study aims at finding out the extent to which speakers and participants in discourse use hedging constructions during their conversations. It also aims at finding out whether or not there are any significant differences in the frequency of the types and functions of hedging constructions employed by males and females. It is hypothesized that hedging constructions are more frequent in English discourse than in other languages due to its formality, and that the frequency of the types and functions is influenced by the gender of the participants. To achieve the aims of the study, two types of procedures have been followed: theoretical and practical. The theoretical procedure consists of presenting a theoretical background of the topic of hedging, including its definitions, etymology and theories. The practical procedure consists of selecting a sample of texts and analyzing them according to an adopted model. A number of conclusions are drawn based on the findings of the study.
Keywords: hedging, pragmatics, politeness, theoretical
Procedia PDF Downloads 588

4651 Social Media Diffusion And Implications For Opinion Leadership In Northcentral Nigeria
Authors: Chuks Odiegwu-Enwerem
Abstract:
The classical notion of opinion leadership presupposes that the media is at the center of effective and successful opinion leadership. Under this idea, an opinion leader is an active media user who consumes, understands, digests and interprets messages for the understanding and acceptance/adoption of lower-end media users, whose access to and understanding of media content are supposedly low. Because of their unique access to, and presumed understanding of, media functions and content, opinion leaders are typically esteemed by those who look forward to and accept their opinions. Lazarsfeld and Katz’s two-step flow of communication theory is the basis of opinion leadership, propelled by limited access to the media. With the emergence and spread of social media and its unlimited access by all and sundry, however, this study interrogates the relevance and application of opinion leaders and, by implication, the two-step flow communication theory in Nigeria’s Northcentral region. It seeks to determine whether opinion leaders still exist in the picture and whether they still exert considerable influence, especially in matters of political conversation and decision-making, among the citizens of this area. It further explores whether the diffusion of social media is a reality, how the ‘low-end’ media users react to their new-found freedom of access to media, how they use it to inform their decisions on important matters, and whether they are still glued to their opinion leaders. The study explores the empirical dimensions of the two-step flow hypothesis in relation to the activities of social media to determine if a change has occurred and in what direction, using the mixed methods of a survey and in-depth interviews. Our understanding of, and belief in, some theoretical assumptions may be enhanced or challenged by the study's outcome.
Keywords: opinion leadership, active media user, two-step flow, social media, Northcentral Nigeria
Procedia PDF Downloads 73
4650 Research on Straightening Process Model Based on Iteration and Self-Learning
Authors: Hong Lu, Xiong Xiao
Abstract:
Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, and they must then be straightened to meet straightness requirements. In the press straightening process, a good straightening stroke algorithm determines the precision and efficiency of straightening. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Using this model, the straightening stroke is solved by an iterative method. Compared to the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise because it can adapt to changes in material performance parameters. Considering that this straightening method is widely used in the mass production of shaft parts, a knowledge base is used to store data from the straightening process, and a straightening stroke algorithm based on empirical data is set up. This paper establishes a straightening process control model that combines the iterative straightening stroke method with the empirical-data-based stroke algorithm. Finally, an experiment is designed to verify the straightening process control model.
Keywords: straightness, straightening stroke, deflection, shaft parts
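The iterative stroke solution described in this abstract can be sketched as follows. This is a minimal illustration only, assuming a hypothetical linear springback model: the elastic-limit stroke `s_elastic` and plasticity ratio `k` are invented for the example, not taken from the paper.

```python
def plastic_correction(stroke, s_elastic=0.5, k=0.8):
    # Hypothetical springback model: below the elastic-limit stroke the
    # shaft springs back fully; beyond it, a fraction k of the extra
    # stroke becomes a permanent (plastic) correction.
    return max(0.0, k * (stroke - s_elastic))

def solve_stroke(deflection, k=0.8, tol=1e-6, max_iter=100):
    # Iteratively adjust the press stroke until the predicted permanent
    # correction matches the measured deflection of the shaft.
    stroke = deflection  # initial guess
    for _ in range(max_iter):
        residual = deflection - plastic_correction(stroke)
        if abs(residual) < tol:
            break
        stroke += residual / k  # Newton-like update using the model slope
    return stroke
```

With these assumed parameters, a measured deflection of 1.0 mm converges to a press stroke of 1.75 mm within two iterations.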
Procedia PDF Downloads 330
4649 Estimation of Scour Using a Coupled Computational Fluid Dynamics and Discrete Element Model
Authors: Zeinab Yazdanfar, Dilan Robert, Daniel Lester, S. Setunge
Abstract:
Scour has been identified as the most common threat to bridge stability worldwide. Traditionally, scour around bridge piers is calculated using empirical approaches that have considerable limitations and are difficult to generalize. The multi-physics nature of scouring, which involves turbulent flow, soil mechanics and solid-fluid interactions, cannot be captured by simple empirical equations developed from limited laboratory data. These limitations can be overcome by direct numerical modeling of the coupled hydro-mechanical scour process, which provides a robust prediction of bridge scour and valuable insights into the scour process. Several numerical models have been proposed in the literature for bridge scour estimation, including Eulerian flow models and coupled Euler-Lagrange models incorporating an empirical sediment transport description. However, the contact forces between particles and the flow-particle interaction have not been taken into consideration. Incorporating collisional and frictional forces between soil particles, as well as the effect of flow-driven forces on particles, facilitates accurate modeling of the complex nature of scour. In this study, a coupled Computational Fluid Dynamics and Discrete Element Model (CFD-DEM) has been developed to simulate the scour process by directly modeling the hydro-mechanical interactions between the sediment particles and the flowing water. This approach obviates the need for an empirical description, as the fundamental fluid-particle and particle-particle interactions are fully resolved. The sediment bed is simulated as a dense pack of particles, and the frictional and collisional forces between particles are calculated, whilst the turbulent fluid flow is modeled using a Reynolds-Averaged Navier-Stokes (RANS) approach. The CFD-DEM model is validated against experimental data in order to assess its reliability.
The modeling results reveal the criticality of particle impact in the assessment of scour depth, which, to the authors' best knowledge, has not been considered in previous studies. The results of this study open new perspectives on the assessment of scour depth and scour time, which is key to managing the failure risk of bridge infrastructure.
Keywords: bridge scour, discrete element method, CFD-DEM model, multi-phase model
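The two particle-scale ingredients the abstract names can be illustrated with a toy sketch: a linear spring-dashpot contact law for particle-particle collisions and a quadratic drag term for the flow-driven force. The stiffness, damping and drag coefficients below are placeholders, not values from the study.

```python
import numpy as np

def contact_force(overlap, rel_vel, kn=1e4, cn=5.0):
    # Linear spring-dashpot normal contact between two particles:
    # elastic repulsion proportional to overlap, viscous damping
    # proportional to the relative normal velocity (placeholder kn, cn).
    if overlap <= 0.0:
        return 0.0  # no contact, no force
    return kn * overlap - cn * rel_vel

def drag_force(fluid_vel, particle_vel, diameter, rho_f=1000.0, cd=0.44):
    # Quadratic fluid drag on a sphere (assumed constant drag
    # coefficient cd); the sign follows the relative velocity.
    area = np.pi * (diameter / 2.0) ** 2
    rel = fluid_vel - particle_vel
    return 0.5 * rho_f * cd * area * abs(rel) * rel
```

In a full CFD-DEM code these forces are summed per particle each time step and fed into Newton's second law, while the RANS solver supplies the local fluid velocity.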
Procedia PDF Downloads 131
4648 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia
Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman
Abstract:
Progress in pavement design has produced a method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG). The road and highway network in Saudi Arabia is evolving as a result of increasing traffic volume, and the MEPDG is currently implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementing the MEPDG for local pavement design requires calibration of its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for calibration of the MEPDG in central Saudi Arabia. The first goal is therefore to collect data for flexible pavement design from the local conditions of the Riyadh region. Since the collected data must be converted into model inputs, the main goal of this paper is the analysis of the collected data. The analysis covers truck classification, the traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axles of each type (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are given in this paper, providing an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on these findings.
Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh
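Several of the traffic inputs listed above reduce to simple arithmetic. The following sketch, with invented counts, shows the usual form of the monthly adjustment factors, vehicle class distribution, and compound traffic growth projection; it is an illustration of the standard definitions, not the authors' processing code.

```python
def monthly_adjustment_factors(monthly_adtt):
    # MAF_i = (average daily truck traffic in month i) / (annual average),
    # so the twelve factors average to 1.0 over the year.
    annual_avg = sum(monthly_adtt) / 12.0
    return [m / annual_avg for m in monthly_adtt]

def vehicle_class_distribution(class_counts):
    # Percentage of trucks in each FHWA vehicle class (classes 4-13).
    total = sum(class_counts.values())
    return {cls: 100.0 * n / total for cls, n in class_counts.items()}

def projected_aadtt(base_aadtt, growth_rate, years):
    # Compound traffic growth: AADTT in the design year.
    return base_aadtt * (1.0 + growth_rate) ** years
```

For example, uniform monthly counts yield MAF = 1.0 for every month, and a 4% compound growth rate roughly doubles AADTT over 18 years.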
Procedia PDF Downloads 226
4647 Investigating a Deterrence Function for Work Trips for Perth Metropolitan Area
Authors: Ali Raouli, Amin Chegenizadeh, Hamid Nikraz
Abstract:
The Perth metropolitan area and its surrounding regions have been expanding rapidly in recent decades, and this growth is expected to continue in the years to come. With this rapid growth and the resulting increase in population, consideration should be given to strategic planning and modelling for the future expansion of Perth. The accurate estimation of projected traffic volumes has always been a major concern for transport modellers and planners. The development of a reliable strategic transport model depends significantly on the input data and on the calibrated parameters that reflect the existing situation. Trip distribution is the second step in four-step modelling (FSM) and is complex due to its behavioural nature. The gravity model is the most common method for trip distribution. The spatial separation between origin and destination (OD) zones is reflected in the gravity model by applying deterrence functions, which provide an opportunity to include people's behaviour in choosing their destinations based on the distance, time and cost of their journeys. Deterrence functions play an important role in the distribution of trips within a study area and simulate trip distances; they should therefore be calibrated for any particular strategic transport model to correctly reflect trip behaviour within the modelling area. This paper reviews the most common deterrence functions and proposes a calibrated deterrence function for work trips within the Perth metropolitan area, based on information obtained from the latest available household data and Perth and Region Travel Survey (PARTS) data. As part of this study, a four-step transport model using EMME software has been developed for the Perth metropolitan area to assist with the analysis and findings.
Keywords: deterrence function, four-step modelling, origin destination, transport model
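As an illustration of the mechanism described above (not the paper's calibrated function), a doubly-constrained gravity model with a negative-exponential deterrence function can be sketched as follows; the beta value and zone totals are hypothetical.

```python
import numpy as np

def deterrence(cost, beta=0.1):
    # Negative-exponential deterrence: willingness to travel decays with
    # generalised cost; beta is the parameter to be calibrated.
    return np.exp(-beta * cost)

def gravity_distribution(O, D, cost, beta=0.1, iters=100):
    # Doubly-constrained gravity model balanced by the Furness procedure:
    # T_ij = A_i * O_i * B_j * D_j * f(c_ij), with balancing factors A, B
    # updated alternately until row sums match origins O and column sums
    # match destinations D.
    f = deterrence(cost, beta)
    A = np.ones(len(O))
    B = np.ones(len(D))
    for _ in range(iters):
        A = 1.0 / (f @ (B * D))
        B = 1.0 / (f.T @ (A * O))
    return (A * O)[:, None] * (B * D)[None, :] * f
```

Calibration then amounts to choosing beta (or a more general function shape) so that the modelled trip-length distribution matches the one observed in the household survey data.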
Procedia PDF Downloads 168
4646 Effects of Oral L-Carnitine on Liver Functions after Transarterial Chemoembolization in Hepatocellular Carcinoma Patients
Authors: Ali Kassem, Aly Taha, Abeer Hassan, Kazuhide Higuchi
Abstract:
Introduction: Transarterial chemoembolization (TACE) for hepatocellular carcinoma (HCC) is usually followed by hepatic dysfunction that limits its efficacy. L-carnitine has recently been studied as a hepatoprotective agent. Our aim is to evaluate the effects of L-carnitine against the deterioration of liver functions after TACE. Method: 53 patients with intermediate-stage HCC were assigned to two groups: an L-carnitine group (26 patients) who received a 300 mg L-carnitine tablet twice daily from 2 weeks before to 12 weeks after TACE, and a control group (27 patients) without L-carnitine therapy. 28 of the studied patients received branched-chain amino acid (BCAA) granules. Results: There were significant differences between the L-carnitine and control groups in mean serum albumin change from baseline to 1 week and 4 weeks after TACE (p < 0.05). L-carnitine maintained the Child-Pugh score at 1 week after TACE and exhibited improvement at 4 weeks after TACE (p < 0.01 vs. 1 week after TACE). The control group showed significant Child-Pugh score deterioration from baseline to 1 week after TACE (p < 0.05) and 12 weeks after TACE (p < 0.05). There were significant differences between the L-carnitine and control groups in mean Child-Pugh score change from baseline to 4 weeks (p < 0.05) and 12 weeks after TACE (p < 0.05). The L-carnitine group displayed improvement in prothrombin time (PT) from baseline to 1 week, 4 weeks (p < 0.05) and 12 weeks after TACE, while PT in the control group remained below baseline at all follow-up intervals. Total bilirubin in the L-carnitine group decreased at 1 week post TACE, while in the control group it significantly increased at 1 week (p = 0.01). ALT and C-reactive protein elevations were suppressed at 1 week after TACE in the L-carnitine group. The hepatoprotective effects of L-carnitine were enhanced by concomitant use of branched-chain amino acids.
Conclusion: L-carnitine and BCAA combination therapy offers a novel supportive strategy after TACE in HCC patients.
Keywords: hepatocellular carcinoma, L-carnitine, liver functions, trans-arterial embolization
Procedia PDF Downloads 155
4645 Analyzing the Effects of Real Income and Biomass Energy Consumption on Carbon Dioxide (CO2) Emissions: Empirical Evidence from the Panel of Biomass-Consuming Countries
Authors: Eyup Dogan
Abstract:
This empirical study analyzes the impacts of real income and biomass energy consumption on the level of emissions in the EKC (environmental Kuznets curve) model for a panel of biomass-consuming countries over the period 1980-2011. Because the analyzed data exhibit cross-sectional dependence and heterogeneity across countries, panel estimation methods robust to both are used. The CADF and CIPS panel unit root tests indicate that carbon emissions, real income and biomass energy consumption are stationary at first differences. The LM bootstrap panel cointegration test shows that the analyzed variables are cointegrated. Results from the panel group-mean DOLS and group-mean FMOLS estimators show that an increase in biomass energy consumption decreases CO2 emissions and that the EKC hypothesis is validated. Countries are therefore advised to boost production and increase the use of biomass energy to achieve lower emissions.
Keywords: biomass energy, CO2 emissions, EKC model, heterogeneity, cross-sectional dependence
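The EKC hypothesis tested above is, in reduced form, a quadratic relationship between emissions and income whose turning point is where emissions begin to fall as income rises. A minimal sketch of that idea on synthetic data (a plain OLS fit, not the panel DOLS/FMOLS estimators the paper uses) looks like this:

```python
import numpy as np

def fit_ekc(income, emissions):
    # OLS fit of emissions on a constant, income and income^2
    # (reduced-form EKC specification).
    X = np.column_stack([np.ones_like(income), income, income ** 2])
    coef, *_ = np.linalg.lstsq(X, emissions, rcond=None)
    return coef

def turning_point(coef):
    # Income level where emissions peak: dE/dY = b1 + 2*b2*Y = 0,
    # so Y* = -b1 / (2*b2); the EKC requires b1 > 0 and b2 < 0.
    b0, b1, b2 = coef
    return -b1 / (2.0 * b2)
```

A validated EKC corresponds to a positive linear and negative quadratic coefficient, giving an inverted-U emissions-income curve.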
Procedia PDF Downloads 297
4644 Modified Weibull Approach for Bridge Deterioration Modelling
Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight
Abstract:
State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be derived directly from a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM output, namely the Modified Weibull approach, which consists of a set of appropriate functions describing the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and Metropolis-Hastings algorithm (MHA) based Markov Chain Monte Carlo (MCMC) simulation to quantify the uncertainty in the model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. Inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests on a test data set. Results show that the proposed model not only predicts network-level conditions accurately but also captures the model uncertainties within a given confidence interval.
Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models
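A minimal random-walk Metropolis-Hastings sampler over the two parameters of a Weibull model, in the spirit of the MCMC step described above, might look like the sketch below. The flat prior, starting point, step size and stand-in Weibull likelihood are all illustrative assumptions; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def weibull_loglik(shape, scale, data):
    # Log-likelihood of a two-parameter Weibull distribution, used here
    # as a stand-in for the paper's condition-state functions.
    if shape <= 0 or scale <= 0:
        return -np.inf  # reject invalid parameter proposals
    z = data / scale
    return np.sum(np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape)

def metropolis_hastings(data, n_samples=2000, step=0.1):
    # Random-walk Metropolis-Hastings over theta = (shape, scale) with a
    # flat prior: accept a proposal with probability min(1, L'/L).
    theta = np.array([1.0, 1.0])  # illustrative starting point
    loglik = weibull_loglik(theta[0], theta[1], data)
    samples = []
    for _ in range(n_samples):
        prop = theta + rng.normal(0.0, step, size=2)
        loglik_prop = weibull_loglik(prop[0], prop[1], data)
        if np.log(rng.random()) < loglik_prop - loglik:
            theta, loglik = prop, loglik_prop
        samples.append(theta.copy())
    return np.array(samples)
```

The retained samples (after discarding a burn-in period) approximate the posterior over the parameters, from which credible intervals for the condition predictions can be read off.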
Procedia PDF Downloads 729
4643 An Empirical Study on Growth, Trade, Foreign Direct Investment and Environment in India
Authors: Shilpi Tripathi
Abstract:
India adopted the policy of economic reforms (globalization, liberalization, and privatization) in 1991, which reduced trade barriers and investment restrictions and further increased the economy's international trade, foreign direct investment (FDI) inflows and Gross Domestic Product (GDP) growth. This paper empirically studies the relationship between India's international trade, GDP, FDI and environment during 1978-2012. The first part of the paper focuses on the background and trends of FDI, GDP, trade, and environment (CO2 emissions). The second part reviews the literature on the relationships among these variables. The last part examines the results of the empirical analysis, namely cointegration and Granger causality tests between foreign trade, FDI inflows, GDP and CO2 since 1978. The findings reveal only one unidirectional causality, between GDP and trade; the direction of causality shows that international trade is one of the major contributors to economic growth (GDP). No causality was found between GDP and FDI, between FDI and CO2, or between international trade and CO2. The paper concludes with policy recommendations to ensure environmentally friendly trade, investment and growth in India in the future.
Keywords: international trade, foreign direct investment, GDP, CO2, cointegration, Granger causality test
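The Granger causality test used above compares a restricted autoregression of the dependent variable on its own lags with an unrestricted one that adds lags of the candidate cause, via an F-test on the reduction in residual sum of squares. A self-contained sketch on synthetic data (the lag length and series are illustrative, and production work would use a dedicated econometrics library):

```python
import numpy as np

def granger_f_test(x, y, lags=2):
    # Does x Granger-cause y?  Regress y[t] on a constant and its own
    # lags (restricted), then add lags of x (unrestricted), and form the
    # F-statistic from the drop in residual sum of squares.
    n = len(y)
    rows = []
    for t in range(lags, n):
        rows.append(np.r_[1.0, y[t - lags:t][::-1], x[t - lags:t][::-1]])
    X = np.array(rows)
    target = y[lags:]

    def rss(cols):
        beta, *_ = np.linalg.lstsq(X[:, cols], target, rcond=None)
        resid = target - X[:, cols] @ beta
        return resid @ resid

    restricted = rss(list(range(1 + lags)))        # constant + y lags
    unrestricted = rss(list(range(1 + 2 * lags)))  # ... + x lags
    df = len(target) - (1 + 2 * lags)
    return ((restricted - unrestricted) / lags) / (unrestricted / df)
```

A large F-statistic (compared against the F distribution with `lags` and `df` degrees of freedom) rejects the null of no Granger causality, which is how a unidirectional trade-to-GDP link would show up.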
Procedia PDF Downloads 440