Search results for: open procedure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5150

3500 Characterization of Crustin from Litopenaeus vannamei

Authors: Suchao Donpudsa, Anchalee Tassanakajon, Vichien Rimphanitchayakit

Abstract:

A crustin gene, LV-SWD1, previously found in the hemocyte cDNA library of Litopenaeus vannamei, contains an open reading frame of 288 bp encoding a putative protein of 96 amino acid residues. The putative signal peptide of LV-SWD1 was identified using the online SignalP 3.0, with a predicted cleavage site between Ala24 and Val25, resulting in a 72-residue mature protein with a calculated molecular mass of 7.4 kDa and a predicted pI of 8.5. This crustin contains an Arg-Pro-rich region at the amino-terminus and a single whey acidic protein (WAP) domain at the carboxyl-terminus. In order to characterize its properties and biological activities, the recombinant crustin protein was produced in the Escherichia coli expression system. Antimicrobial assays showed that the growth of Bacillus subtilis was inhibited by this recombinant crustin with a MIC of about 25-50 µM.

Keywords: crustin, single whey acidic protein, Litopenaeus vannamei, antimicrobial activity

Procedia PDF Downloads 239
3499 The Boundary Element Method in Excel for Teaching Vector Calculus and Simulation

Authors: Stephen Kirkup

Abstract:

This paper discusses the implementation of the boundary element method (BEM) on an Excel spreadsheet and how it can be used in teaching vector calculus and simulation. There are two separate spreadsheets, within which the Laplace equation is solved by the BEM in two dimensions (LIBEM2) and in axisymmetric three dimensions (LBEMA). The main algorithms are implemented in the programming language associated with Excel, Visual Basic for Applications (VBA). The BEM only requires a boundary mesh and hence is a relatively accessible method. The BEM in the open spreadsheet environment is demonstrated to be useful as an aid to teaching and learning. The application of the BEM implemented on a spreadsheet for educational purposes in introductory vector calculus and simulation is explored. The development of assignment work is discussed, and sample results from student work are given. The spreadsheets were found to be useful tools in developing the students’ understanding of vector calculus and in simulating heat conduction.

Keywords: boundary element method, Laplace’s equation, vector calculus, simulation, education

Procedia PDF Downloads 157
3498 Approach-Avoidance Conflict in the T-Maze: Behavioral Validation for Frontal EEG Activity Asymmetries

Authors: Eva Masson, Andrea Kübler

Abstract:

Anxiety disorders (AD) are the most prevalent psychological disorders. However, only a fraction of affected individuals are diagnosed and receive treatment. This gap is probably due to the diagnostic criteria, which rely on symptoms (according to the DSM-5 definition) with no objective biomarker. Approach-avoidance conflict tasks are a common way to simulate such disorders in a lab setting, with most paradigms focusing on the relationship between behavior and neurophysiology. These tasks typically place participants in a situation where they have to make a decision that leads to both positive and negative outcomes, thereby sending conflicting signals that trigger the Behavioral Inhibition System (BIS). Furthermore, behavioral validation of such paradigms adds credibility to the tasks: with overt conflict behavior, it is safer to assume that the task actually induced a conflict. Some of these tasks have linked asymmetrical frontal brain activity to induced conflicts and the BIS. However, there is currently no consensus on the direction of the frontal activation. The authors present here a modified version of the T-Maze paradigm, a motivational conflict desktop task, in which behavior is recorded simultaneously with high-density EEG (HD-EEG). Methods: In this within-subject design, HD-EEG and behavior of 35 healthy participants were recorded. EEG data were collected with a 128-channel sponge-based system. The motivational conflict desktop task consisted of three blocks of repeated trials. Each block was designed to record a slightly different behavioral pattern, to increase the chances of eliciting conflict. These behavioral patterns were nevertheless similar enough to allow comparison of the number of trials categorized as ‘overt conflict’ between the blocks. Results: Overt conflict behavior was exhibited in all blocks, but on average in under 10% of the trials in each block.
However, changing the order of the paradigms successfully introduced a ‘reset’ of the conflict process, thereby providing more trials for analysis. As for the EEG correlates, the authors expect a different pattern for trials categorized as conflict compared to the others. More specifically, we expect elevated alpha frequency power in the left frontal electrodes at around 200 ms post-cueing compared to the right (relatively higher right frontal activity), followed by an inversion around 600 ms later. Conclusion: With this comprehensive approach to a psychological mechanism, new evidence would be brought to the frontal asymmetry discussion and its relationship with the BIS. Furthermore, with the present task focusing on a very particular type of motivational approach-avoidance conflict, it opens the door to further variations of the paradigm that introduce the different kinds of conflict involved in AD. Even though its application as a potential biomarker seems difficult, because of the individual reliability of both the task and peak frequency in the alpha range, we hope to open the discussion on task robustness for future neuromodulation and neurofeedback applications.

Keywords: anxiety, approach-avoidance conflict, behavioral inhibition system, EEG

Procedia PDF Downloads 34
3497 Urban Planning Compilation Problems in China and the Corresponding Optimization Ideas under the Vision of the Hyper-Cycle Theory

Authors: Hong Dongchen, Chen Qiuxiao, Wu Shuang

Abstract:

Systematic science reveals the complex nonlinear mechanisms of behaviour in urban systems. In China, however, most city planners still apply simple linear thinking to what is an open, complex giant system. Based on an analysis of why current urban planning has failed, this paper introduces the hyper-cycle theory, one of the basic theories of systematic science, and proposes that urban planning compilation should change: from controlling quantities to managing relationships, from blueprint planning to progressive planning based on nonlinear characteristics, and from management control to dynamic monitoring and feedback.

Keywords: systematic science, hyper-cycle theory, urban planning, urban management

Procedia PDF Downloads 398
3496 Hardware-in-the-Loop Test for Automatic Voltage Regulator of Synchronous Condenser

Authors: Ha Thi Nguyen, Guangya Yang, Arne Hejde Nielsen, Peter Højgaard Jensen

Abstract:

The automatic voltage regulator (AVR) plays an important role in the volt/var control of a synchronous condenser (SC) in power systems. Testing AVR performance under steady-state and dynamic conditions in a real grid is expensive, inefficient, and hard to achieve. To address this issue, we implement a hardware-in-the-loop (HiL) test for the AVR of an SC to evaluate its steady-state and dynamic performance under different operating conditions. The startup procedure of the system and voltage set point changes are studied to evaluate the AVR hardware response. Overexcitation, underexcitation, and AVR set point loss are tested to compare the performance of the SC with the AVR hardware against that of simulation. The comparative results demonstrate how the AVR will work in a real system. The results show that the HiL test is an effective approach for testing devices before deployment and is able to parameterize the controller with lower cost, higher efficiency, and more flexibility.

Keywords: automatic voltage regulator, hardware-in-the-loop, synchronous condenser, real time digital simulator

Procedia PDF Downloads 247
3495 Hand Gestures Based Emotion Identification Using Flex Sensors

Authors: S. Ali, R. Yunus, A. Arif, Y. Ayaz, M. Baber Sial, R. Asif, N. Naseer, M. Jawad Khan

Abstract:

In this study, we have proposed a gesture-to-emotion recognition method using flex sensors mounted on the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove, and the data from the glove are sent to a PC over Wi-Fi. Four gestures: finger pointing, thumbs up, fist open, and fist close, are performed by five subjects. Each gesture is categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand gesture. Seventeen inspectors observed the emotions and hand gestures of the five subjects. The emotional states based on the inspectors’ assessment and on the acquired movement speed data were compared. Overall, we achieved 77% accuracy. Therefore, the proposed design can be used for emotional state detection applications.

Keywords: emotion identification, emotion models, gesture recognition, user perception

Procedia PDF Downloads 281
3494 Urban Design as a Tool in Disaster Resilience and Urban Hazard Mitigation: Case of Cochin, Kerala, India

Authors: Vinu Elias Jacob, Manoj Kumar Kini

Abstract:

Disasters of all types are occurring more frequently and are becoming more costly than ever due to various man-made factors, including climate change. Better use of governance and management within disaster risk reduction is therefore of utmost importance. There is a need to explore the role of pre- and post-disaster public policies, and the role of urban planning/design in shaping the opportunities of households, individuals, and, collectively, settlements for achieving recovery. Governance strategies that can better support the integration of disaster risk reduction and management have to be examined. The main aim is thereby to build the resilience of individuals and communities, and thus of states too. Resilience is a term usually linked to the fields of disaster management and mitigation, but it has today become an integral part of the planning and design of cities. Disaster resilience broadly describes the ability of an individual or community to 'bounce back' from disaster impacts through improved mitigation, preparedness, response, and recovery. The growing population of the world has increased the inflow and use of resources, creating pressure on natural systems and inequity in the distribution of resources. This makes cities vulnerable to both natural and man-made disasters. Each urban area needs elaborate studies, and strategies based on them, to proceed in this direction. Cochin in Kerala is the fastest-growing and largest city, with a population of more than 26 lakh. The main concern addressed in this paper is making cities resilient by designing a framework of strategies, based on urban design principles, for an immediate response system, focusing on the city of Cochin, Kerala, India. The paper discusses the spatial transformations due to disasters and the role of spatial planning in the context of significant disasters.
The paper also aims at developing a model, taking into consideration various factors such as land use, open spaces, transportation networks, physical and social infrastructure, building design, density, and ecology, that can be implemented in any city in any context. Guidelines are made, using the tool of urban design, for the smooth evacuation of people through hassle-free transport networks, protecting vulnerable areas in the city, providing adequate open spaces for shelters and gatherings, making basic amenities available to the affected population within reachable distance, etc. Strategies at the city and neighbourhood levels have been developed with inferences from vulnerability analysis and case studies.

Keywords: disaster management, resilience, spatial planning, spatial transformations

Procedia PDF Downloads 290
3493 Performance-Based Quality Evaluation of Database Conceptual Schemas

Authors: Janusz Getta, Zhaoxi Pan

Abstract:

Performance-based quality evaluation of database conceptual schemas is an important aspect of the database design process. It is evident that different conceptual schemas provide different logical schemas, and the performance of user applications strongly depends on the logical and physical database structures. This work presents the entire process of performance-based quality evaluation of conceptual schemas. The paper proposes a new specification of object algebra for the representation of conceptual-level database applications. Transformation of conceptual schemas and object algebra expressions into an implementation schema, and implementation in a particular database system, allow for precise estimation of the processing costs of database applications and, as a consequence, for precise evaluation of the performance-based quality of conceptual schemas. We then describe an experiment as a proof of concept for the evaluation procedure presented in the paper.

Keywords: conceptual schema, implementation schema, logical schema, object algebra, performance evaluation, query processing

Procedia PDF Downloads 287
3492 Assessment and Evaluation of Football Performance

Authors: Bulus Kpame, Mukhtar Mohammed Alhaji, Garba Jibril

Abstract:

In any team sport, the most important variables that should be used to measure performance are physical condition and technical and tactical performance. In a complex game like football, it is extremely difficult to measure the relative importance of each of these variables. Physical fitness itself has been shown to consist of several components, such as endurance, strength, flexibility, agility, coordination, and speed, and each of these components consists of several subcomponents. This paper attempts to describe a test battery to assess and evaluate physical performance in football players. The battery comprises a functional, structured training session of about 2.5 hours. It consists of a quality rating of the warm-up procedure and tests of flexibility, football skills, power, speed, and endurance. Acceptable performance values are also presented under each test. It is hoped that this battery of tests will be helpful to the coach in determining the effect of a specific training program. It should also be helpful to the team physician and trainer in monitoring progress during rehabilitation after injury.

Keywords: assessment, evaluation, performance, programs

Procedia PDF Downloads 402
3491 FPGA Implementation of Novel Triangular Systolic Array Based Architecture for Determining the Eigenvalues of Matrix

Authors: Soumitr Sanjay Dubey, Shubhajit Roy Chowdhury, Rahul Shrestha

Abstract:

In this paper, we present a novel approach to calculating the eigenvalues of any matrix, for the first time on a Field Programmable Gate Array (FPGA), using a Triangular Systolic Array (TSA) architecture. Conventionally, an additional computation unit is required in architectures compliant with eigenvalue-determination algorithms, which in turn increases delay and power consumption. Moreover, recently reported works are dedicated only to symmetric matrices or other specific cases. This work presents an architecture to calculate the eigenvalues of any matrix, based on the QR algorithm, which is fully implementable on FPGA. For the implementation of the QR algorithm, we have used the TSA architecture, which further utilises the CORDIC (CO-ordinate Rotation DIgital Computer) algorithm to calculate the various trigonometric and arithmetic functions involved in the procedure. The proposed architecture gives an error in the range of 10⁻⁴. Power consumption of the design is 0.598 W, and it can operate at a frequency of 900 MHz.
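The QR algorithm on which this architecture is based can be sketched in software. The following is a minimal unshifted QR iteration in Python, an illustration of the underlying numerics only, not the authors' FPGA/CORDIC design:

```python
import numpy as np

def qr_eigenvalues(A, iters=200):
    """Approximate eigenvalues via the unshifted QR iteration.

    Each step factors A = QR and forms RQ, which is similar to A;
    for many matrices the iterates converge toward (quasi-)triangular
    form with the eigenvalues appearing on the diagonal.
    """
    Ak = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return np.sort(np.diag(Ak))[::-1]

evals = qr_eigenvalues([[2.0, 1.0], [1.0, 2.0]])
print(evals)  # approximately [3. 1.]
```

A hardware realization replaces the factorization with Givens rotations evaluated by CORDIC, but the iteration structure is the same.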

Keywords: coordinate rotation digital computer, three angle complex rotation, triangular systolic array, QR algorithm

Procedia PDF Downloads 405
3490 The Effect of Spent Mushroom Substrate on Blood Metabolites in Kurdish Male Lambs

Authors: Alireza Vakili, Shahab Ehtesham, Mohsen Danesh Mesgaran

Abstract:

The objective of this study was to use different levels of spent mushroom substrate as a substitute for wheat straw in the ration of male lambs. Twenty male lambs, aged 90 days and with an initial average weight of 33 ± 1.7 kg, were used. The animals were housed in individual boxes and assigned to four treatments (control, 15% spent mushroom substrate, 25% spent mushroom substrate, and 35% spent mushroom substrate) with five replications. The experiment period was 114 days, comprising 14 days of adaptation and 90 days of breeding. On days 36 and 94, blood samples were taken from the jugular vein. The 20 male lambs received the four experimental diets in a completely randomized design. The statistical analyses were carried out using the GLM procedure of SAS 9.1, and means among treatments were compared by the Tukey test. The results showed no significant differences between the serum biochemical and hematological contents of the lambs in the four treatments (p>0.05). It was concluded that spent mushroom substrate consumption has no harmful effect on the blood parameters of Kurdish male lambs.
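The treatment comparison described (the GLM procedure followed by Tukey's test) rests on a one-way analysis of variance. A minimal sketch of the F statistic, using hypothetical metabolite values rather than the study's data, is:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA, the core of a GLM-style
    comparison of treatment means.

    F = (between-group mean square) / (within-group mean square).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of treatment means.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread inside each treatment.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical serum metabolite values for three treatments.
f = one_way_anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
print(f)  # 3.0
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) would justify the post-hoc Tukey comparison; here the differences are not significant, matching the study's conclusion in spirit.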

Keywords: alternative food, nutrition, sheep performance, spent mushroom substrate

Procedia PDF Downloads 343
3489 eTransformation Framework for the Cognitive Systems

Authors: Ana Hol

Abstract:

Digital systems are in the cognitive wave of eTransformation and are now extensively aimed at meeting individuals’ demands, both of customers requiring services and of service providers. It is also apparent that successful future systems will not simply open doors for traditional owners/users to offer and receive services, as Uber does today, but will require more customized and cognitively enabled infrastructures that are responsive to system users’ needs. To identify what is required for such systems, this research reviews the historical and current effects of the eTransformation process by studying: 1. eTransitions of company websites and mobile applications, 2. the emergence of new shared-economy business models such as Uber, and 3. new requirements for demand-driven, cognitive systems capable of learning and just-in-time decision making. Based on the analysis, this study proposes a Cognitive eTransformation Framework capable of guiding implementations of new responsive and user-aware systems.

Keywords: system implementations, AI supported systems, cognitive systems, eTransformation

Procedia PDF Downloads 235
3488 The Anti-Allergic Activity of Prasaprohyai Preparation Extract after Accelerated Stability Testing

Authors: Sunita Makchuchit, Arunporn Itharat

Abstract:

Prasaprohyai, a Thai traditional medicine preparation listed in the Thai National List of Essential Medicines, is commonly used for the treatment of fever and colds. The Prasaprohyai preparation consists of 21 different plants, with Kaempferia galanga (50% w/w) as the main ingredient. The objective of this study was to investigate the anti-allergic activity of the crude extract of Prasaprohyai after an accelerated stability test procedure. The extraction method was maceration in 95% ethanol, and the crude extract was kept under accelerated conditions at 40 ± 2 °C and 75 ± 5% relative humidity (RH) for six months. Crude samples at various storage times (0, 15, 30, 45, 60, 90, 120, 150, and 180 days) were investigated for anti-allergic activity using IgE-sensitized RBL-2H3 cell lines. The results showed that accelerated testing had no significant effect on the anti-allergic activity of the crude ethanolic extract of Prasaprohyai compared with day 0, suggesting that the ethanolic extract can be stored for two years at room temperature without loss of activity.

Keywords: accelerated stability, anti-allergy, prasaprohyai, RBL-2H3 cell lines

Procedia PDF Downloads 483
3487 Experimental Study on Dehumidification Performance of Supersonic Nozzle

Authors: Esam Jassim

Abstract:

Supersonic nozzles are commonly used to purify natural gas in gas processing technology. As an innovative technology, the supersonic nozzle is employed to overcome the deficits of traditional methods, drawing on gas dynamics, thermodynamics, and fluid dynamics theory. An indoor test rig was built to study the dehumidification of a moist fluid, with humid air chosen for the study. The working fluid circulated in an open loop, which had provision for filtering, metering, and humidifying. A stainless steel supersonic separator was constructed together with the converging-diverging (C-D) nozzle system. The results show that dehumidification improves as the nozzle pressure ratio (NPR) increases. This is due to the high-intensity turbulence caused by shock formation in the divergent section; such disturbance strengthens the centrifugal force, pushing more particles toward the near-wall region. In turn, the pressure recovery factor, defined as the ratio of the outlet static pressure of the fluid to its inlet value, decreases with NPR.

Keywords: supersonic nozzle, dehumidification, particle separation, nozzle geometry

Procedia PDF Downloads 333
3486 Frequent-Pattern Tree Algorithm Application to S&P and Equity Indexes

Authors: E. Younsi, H. Andriamboavonjy, A. David, S. Dokou, B. Lemrabet

Abstract:

Software and time optimization are very important factors in financial markets, which are highly competitive fields, and the emergence of new computing tools further stresses this challenge. In this context, any improvement to the technical indicators that generate buy or sell signals is a major issue, and many tools have been created to make them more effective. This concern for efficiency leads, in the present paper, to a search for the most effective (and most innovative) way of achieving the largest improvement in these indicators. The approach consists of attaching a signature to frequent market configurations by applying a frequent-pattern extraction method, which is the most appropriate here for optimizing investment strategies. The goal of the proposed trading algorithm is to find the most accurate signatures, using a back-testing procedure applied to technical indicators, in order to improve their performance. The problem is then to determine the signatures which, combined with an indicator, outperform that indicator alone. To do this, the FP-Tree algorithm has been preferred, as it appears to be the most efficient algorithm for this task.
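The frequent-pattern extraction step can be illustrated with a brute-force itemset count, a toy stand-in for FP-growth that produces the same output on small inputs; the market "signal" names below are hypothetical:

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Count all itemsets meeting min_support by exhaustive enumeration.

    Real FP-tree mining avoids this exponential enumeration by sharing
    prefixes in a tree, but the frequent itemsets found are identical.
    """
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for r in range(1, len(items) + 1):
            for combo in combinations(items, r):
                counts[combo] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

# Hypothetical market "configurations": each transaction is the set of
# technical signals observed in one trading period.
periods = [("ma_cross", "rsi_low"),
           ("ma_cross", "rsi_low", "vol_spike"),
           ("rsi_low",),
           ("ma_cross", "rsi_low")]
print(frequent_itemsets(periods, min_support=3))
```

Itemsets surviving the support threshold (here, co-occurring signals such as `("ma_cross", "rsi_low")`) would serve as the "signatures" attached to market configurations during back-testing.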

Keywords: quantitative analysis, back-testing, computational models, apriori algorithm, pattern recognition, data mining, FP-tree

Procedia PDF Downloads 357
3485 Global Analysis in a Growth Economic Model with Perfect-Substitution Technologies

Authors: Paolo Russu

Abstract:

The purpose of the present paper is to highlight some features of an economic growth model with environmental negative externalities, giving rise to a three-dimensional dynamic system. In particular, we show that the economy, which is based on a perfect-substitution-technologies production function, has neither indeterminacy nor a poverty trap. This implies that the equilibrium selected by the economy depends on its history (the initial values of the state variables) rather than on the expectations of economic agents. Moreover, we prove that the basins of attraction of locally stable equilibrium points may be very large, as they can extend up to the boundary of the system’s phase space. The infinite-horizon optimal control problem has the purpose of maximizing the representative agent’s instantaneous utility function, which depends on leisure and consumption.

Keywords: Hopf bifurcation, open-access natural resources, optimal control, perfect-substitution technologies, Poincarè compactification

Procedia PDF Downloads 166
3484 A Formal Verification Approach for Linux Kernel Designing

Authors: Zi Wang, Xinlei He, Jianghua Lv, Yuqing Lan

Abstract:

The Linux kernel, though widely used, is complicated, and errors caused by bugs are often costly. Statistically, more than half of the mistakes occur in the design phase. Thus, we introduce a modeling method, KMVM (Linux Kernel Modeling and Verification Method), based on type theory, for the proper design and correct exploitation of the kernel. In the model, the kernel is separated into six levels: subsystem, dentry, file, struct, func, and base. Each level is treated as a type, and the types are specified in terms of their structure and relationships. At the same time, we use a demanding path to express the function to be implemented. The correctness of the design is verified by recursively checking the type relationships and type existence. The method has been applied to verify the open operation of the VFS (virtual file system) in the Linux kernel. We have also designed and developed a set of secure communication mechanisms in the kernel with verification.

Keywords: formal approach, type theory, Linux Kernel, software program

Procedia PDF Downloads 125
3483 On Strengthening Program of Sixty Years Old Dome Using Carbon Fiber

Authors: Humayun R. H. Kabir

Abstract:

A reinforced concrete dome of circular shape, 30 m in diameter and built 60 years ago, was in a distressed condition due to adverse weathering effects, such as high temperature and wind, and poor maintenance. It was decided to restore the dome to its full strength for future use. Full material strength and durability checks, including a petrography test, were conducted. The concrete strength was found to be in the acceptable range, while the bars were corroded by more than 40% relative to their original configuration, and widespread cracks appeared in almost every square meter. A strengthening program consisting of crack filling by the injection method and carbon fiber layup and wrapping was adopted. An Ultrasonic Pulse Velocity (UPV) test was conducted to observe crack depth, and a Ground Penetrating Radar (GPR) test was conducted to observe internal bar conditions and internal cracks. Finally, a load test was conducted to certify the effectiveness of the carbon fiber, the injection method procedure, and the overall behavior of the dome.

Keywords: dome, strengthening program, carbon fiber, load test

Procedia PDF Downloads 248
3482 Barriers to Entry: The Pitfall of Charter School Accountability

Authors: Ian Kingsbury

Abstract:

The rapid expansion of charter schools (public schools that receive government funding but do not face the same regulations as traditional public schools) over the preceding two decades has raised concerns over the potential for graft and fraud. These concerns are largely justified: incidents of financial crime and mismanagement are not unheard of, and the charter sector has become a darling of hedge fund managers. In response, several states have strengthened their charter school regulatory regimes. Imposing regulations and attempting to increase accountability seem like sensible measures, and perhaps they are necessary. However, increased regulation may come at the cost of imposing barriers to entry. Specifically, increased regulation often demands evidence of a high likelihood of fiscal solvency, which theoretically entails access to capital in the short term and may systematically preclude Black or Hispanic applicants from opening charter schools. Moreover, increased regulation necessarily entails more red tape. The institutional wherewithal and the number of hours required to complete an application to open a charter school might favor those who have partnered with an education service provider (ESP), specifically a charter management organization (CMO) or education management organization (EMO). These potential barriers to entry pose a significant policy concern. Just as policymakers hope to increase the share of minority teachers and principals, they should sensibly care whether the individuals who open charter schools look like the students in those schools. Moreover, they might be concerned if successful applications in states with stringent regulations are overwhelmingly affiliated with education service providers. One of the original missions of charter schools was to serve as laboratories of innovation.
Approving only those applications affiliated with education service providers (in effect establishing a parallel network of schools rather than a diverse marketplace of schools) undermines that mission. Data and methods: The analysis examines more than 2,000 charter school applications from 15 states. It compares the outcomes of applications from states with a strong regulatory environment (those with high scores from NACSA, the National Association of Charter School Authorizers) to applications from states with a weak regulatory environment (those with a low NACSA score). If the hypothesis is correct, applicants not affiliated with an education service provider (ESP) are more likely to be rejected in high-regulation states than those affiliated with an ESP, and minority candidates not affiliated with an ESP are particularly likely to be rejected. Initial returns indicate that the hypothesis holds. More applications in low-NACSA-scoring Arizona come from individuals not associated with an ESP, and those individuals are as likely to be accepted as those affiliated with an ESP. On the other hand, applicants in high-NACSA-scoring Indiana and Ohio are more than 20 percentage points more likely to be accepted if they are affiliated with an ESP, and the effect is particularly pronounced for minority candidates. These findings should spur policymakers to consider the drawbacks of charter school accountability and to consider accountability regimes that do not impose barriers to entry.

Keywords: accountability, barriers to entry, charter schools, choice

Procedia PDF Downloads 152
3481 Modelling Railway Noise Over Large Areas, Assisted by GIS

Authors: Conrad Weber

Abstract:

The modelling of railway noise over large project areas can be very time-consuming in terms of preparing the noise models and calculation time. An open-source GIS program was utilised to assist with the modelling of operational noise levels for 675 km of railway corridor. A range of GIS algorithms was used to break the noise model area up into manageable calculation sizes. GIS was used to prepare and filter a range of noise modelling inputs, including building files, land uses, and ground terrain. A spreadsheet was used to manage the accuracy of key input parameters, including train speeds, train types, curve corrections, bridge corrections, and engine notch settings. Finally, GIS was used to present the noise modelling results. This paper explains the noise modelling process and how the spreadsheet and GIS were used to model this massive project accurately and efficiently.
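Breaking a model extent into manageable calculation tiles, as described above, can be sketched as follows; this is a simplified illustration with hypothetical dimensions, not the project's actual GIS algorithms:

```python
def tile_extent(xmin, ymin, xmax, ymax, tile):
    """Split a rectangular model extent into calculation tiles.

    Returns (xmin, ymin, xmax, ymax) tuples; edge tiles are clipped
    to the extent boundary so the tiling covers it exactly.
    """
    tiles = []
    y = ymin
    while y < ymax:
        x = xmin
        while x < xmax:
            tiles.append((x, y, min(x + tile, xmax), min(y + tile, ymax)))
            x += tile
        y += tile
    return tiles

# Hypothetical 10 km x 5 km corridor segment, 2.5 km tiles.
tiles = tile_extent(0, 0, 10_000, 5_000, 2_500)
print(len(tiles))  # 8
```

Each tile can then be modelled independently, which keeps per-run calculation sizes bounded and allows runs to proceed in parallel.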

Keywords: noise, modeling, GIS, rail

Procedia PDF Downloads 118
3480 Test of Capital Account Monetary Model of Floating Exchange Rate Determination: Further Evidence from Selected African Countries

Authors: Oloyede John Adebayo

Abstract:

This paper tested a variant of the monetary model of exchange rate determination, Frankel’s Capital Account Monetary Model (CAAM) based on the real interest rate differential, on the floating exchange rate experiences of three developing countries of Africa: Ghana, Nigeria, and the Gambia. The study adopted the autoregressive instrumental variable (AIV) and Almon polynomial lag procedures of regression analysis, based on the assumption that the coefficients follow a third-order polynomial with a zero-end constraint. The results found some support for the CAAM hypothesis that the exchange rate responds proportionately to changes in money supply, inversely to income, and positively to interest rate and expected inflation differentials. On this basis, the study points the attention of monetary authorities and researchers to the relevance and usefulness of the CAAM as an appropriate tool and useful benchmark for analyzing the exchange rate behaviour of most developing countries.
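The Almon polynomial lag structure assumed in the estimation can be illustrated as follows; the coefficients are hypothetical, chosen only so that the third-order polynomial satisfies a zero-end constraint:

```python
def almon_weights(coeffs, n_lags):
    """Distributed-lag weights implied by an Almon polynomial:
    w_i = sum_k coeffs[k] * i**k for lags i = 0..n_lags.

    A zero-end constraint forces the polynomial to zero at the
    last lag, i = n_lags.
    """
    return [sum(c * i ** k for k, c in enumerate(coeffs))
            for i in range(n_lags + 1)]

# Hypothetical polynomial w(i) = 4i - i^2 (cubic coefficient zero),
# which vanishes at the last lag i = 4, i.e. the zero-end constraint.
weights = almon_weights([0, 4, -1, 0], n_lags=4)
print(weights)  # [0, 3, 4, 3, 0]
```

Restricting the lag weights to lie on a low-order polynomial reduces the regression to estimating the few polynomial coefficients instead of every lag coefficient separately.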

Keywords: exchange rate, monetary model, interest differentials, capital account

Procedia PDF Downloads 405
3479 Estimating the Value of Statistical Life under the Subsidization and Cultural Effects

Authors: Mohammad A. Alolayan, John S. Evans, James K. Hammitt

Abstract:

The value of statistical life (VSL) has been estimated for a Middle Eastern country with a heavily subsidized economy. In this study, in-person interviews were conducted on a stratified random sample to estimate the value of mortality risk. Double-bounded dichotomous choice questions, followed by an open-ended question, were used in the interviews to elicit respondents' willingness to pay for mortality risk reduction. High willingness to pay was found to be associated with high income and education, and females were found to have lower willingness to pay than males. The estimated value of statistical life is larger than those estimated for Western countries, where taxation systems exist. This estimate provides decision makers with a baseline for monetizing the health benefits of a proposed policy or program in an Eastern country. The value of statistical life for other countries in the region can also be extrapolated from this estimate using the benefit transfer method.

Keywords: mortality, risk, VSL, willingness-to-pay

Procedia PDF Downloads 310
3478 Evaluation of Dynamic Log Files for Different Dose Rates in IMRT Plans

Authors: Saad Bin Saeed, Fayzan Ahmed, Shahbaz Ahmed, Amjad Hussain

Abstract:

The aim of this study is to evaluate dynamic log files (dynalogs) at different dose rates using dose-volume histograms (DVH), and to assess their use as a quality assurance (QA) procedure for IMRT. Seven phase-one head and neck cancer patients with similar organs at risk (OARs) were selected randomly. For each patient, reference plans at dose rates of 300 and 600 MU/min, with a prescribed dose of 50 Gy in 25 fractions, were made. Dynalogs produced by delivery of the reference plans were processed by an in-house MATLAB program, which produces new field files containing the actual positions of the multi-leaf collimators (MLCs) instead of the planned positions in the reference plans. Copies of the reference plans were used to import the new field files generated by the MATLAB program and were renamed Dyn.plans. After dose calculation of the Dyn.plans for the different dose rates, DVH and multiple linear regression tools were used to evaluate the reference and Dyn.plans. The results indicate good correlation between the different dose rate plans. The maximum dose differences for the PTV and OARs were found to be less than 5% and 9%, respectively. The study indicates the potential of dynalogs to be used for patient-specific QA of IMRT at different dose rates.
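The dynalog-to-field-file step can be illustrated with a short sketch. The data layout below (per-control-point lists of planned and actual leaf positions) is hypothetical: real Varian dynalogs have a much richer format, and the study's conversion was implemented in MATLAB.

```python
def dynalog_to_actual_fields(control_points):
    """Replace planned MLC positions with the actually delivered ones,
    analogous to the in-house MATLAB step described above, and report
    the worst planned-vs-actual leaf deviation (same units as input)."""
    fields, max_dev = [], 0.0
    for cp in control_points:
        fields.append({"mlc": list(cp["actual"])})
        for planned, actual in zip(cp["planned"], cp["actual"]):
            max_dev = max(max_dev, abs(planned - actual))
    return fields, max_dev
```

Recalculating dose on the "actual" fields and comparing DVHs against the reference plan is then what turns the log files into a QA check.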

Keywords: IMRT, dynalogs, dose rate, DVH

Procedia PDF Downloads 527
3477 Obstacles to Innovation for SMEs: Evidence from Germany

Authors: Natalia Strobel, Jan Kratzer

Abstract:

Achieving effective innovation is a complex task, and during this process firms (especially SMEs) often face obstacles. However, research into obstacles to innovation focusing on SMEs is very scarce. In this study, we propose a theoretical framework for describing these obstacles to innovation and investigate their influence on the innovative performance of SMEs. Data were collected in 2013 through face-to-face interviews with executives of 49 technology SMEs from Germany. The semi-structured interviews were designed on the basis of scales for measuring innovativeness, financial/competitive performance and obstacles to innovation, alongside purely open questions. We find that the internal obstacles of lacking know-how, capacity overload, and unclear roles and tasks, as well as the external obstacle of governmental bureaucracy, negatively influence the innovative performance of SMEs. However, in contrast to prior findings, this study shows that firms' cooperation ties might also negatively influence their innovative performance.

Keywords: innovation, innovation process, obstacles, SME

Procedia PDF Downloads 347
3476 Evaluating the Possibility of Expanding National Health Insurance Funding From Zakat, Sudan

Authors: Fawzia Mohammed Idris

Abstract:

Zakat is an Islamic mechanism for wealth redistribution that serves as social protection for needy people. This study aimed to assess the possibility of expanding the share of the national health insurance fund financed from the zakat funds allocated to poor people, by comparing the poverty reduction achieved through direct payments to the needy with that achieved by covering them under social health insurance. The study used Stata regression as the statistical analysis tool, and the findings showed no significant relationship between the poverty rate, as the main indicator, and either the number of poor people covered by national health insurance or the number of poor people benefiting from the distribution of zakat funds. The study faced many difficulties regarding the quality and consistency of the data. It suggests a joint mission between the national health insurance fund and the zakat chamber to assess the efficient use of the zakat funds allocated to poor people.

Keywords: health finance, poverty, social health insurance, zakat

Procedia PDF Downloads 140
3475 Quantum Kernel Based Regressor for Prediction of Non-Markovianity of Open Quantum Systems

Authors: Diego Tancara, Raul Coto, Ariel Norambuena, Hoseein T. Dinani, Felipe Fanchini

Abstract:

Quantum machine learning is a growing research field that aims to perform machine learning tasks assisted by a quantum computer. Kernel-based quantum machine learning models are paradigmatic examples in which the kernel involves quantum states and the Gram matrix is calculated from the overlaps between these states. With the kernel at hand, a regular machine learning model is used for the learning process. In this paper, we investigate quantum support vector machine and quantum kernel ridge models to predict the degree of non-Markovianity of a quantum system. We perform digital quantum simulation of amplitude damping and phase damping channels to create our quantum dataset. We elaborate on different kernel functions to map the data and on kernel circuits to compute the overlaps between quantum states. We observe good performance of the models.
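A minimal sketch of the kernel construction, assuming classically simulated state vectors rather than hardware overlap circuits: the Gram matrix entries are state fidelities |⟨ψi|ψj⟩|², which a classical kernel model (e.g. kernel ridge regression with a precomputed kernel) then consumes.

```python
def fidelity_kernel(states):
    """Gram matrix K[i][j] = |<psi_i|psi_j>|**2 for normalized complex
    state vectors given as lists of amplitudes."""
    def inner(u, v):
        return sum(a.conjugate() * b for a, b in zip(u, v))
    return [[abs(inner(u, v)) ** 2 for v in states] for u in states]
```

On quantum hardware, each entry would instead be estimated from measurement statistics of an overlap (e.g. swap-test or inversion-test) circuit.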

Keywords: quantum, machine learning, kernel, non-Markovianity

Procedia PDF Downloads 172
3474 Useful Lifetime Prediction of Rail Pads for High Speed Trains

Authors: Chang Su Woo, Hyun Sung Park

Abstract:

Useful lifetime evaluation of rail pads is very important in the design procedure to assure safety and reliability, so it is necessary to establish a suitable criterion for the replacement period of rail pads. In this study, we performed material property and accelerated heat aging tests on rail pads, considering degradation factors and environmental conditions including operation, and then derived a lifetime prediction equation from the changes in hardness, thickness and static spring constant in an Arrhenius plot to establish how to estimate the aging of rail pads. With this useful lifetime prediction equation, the lifetime of e-clip pads was 2.5 years when the change in hardness was 10% at 25°C, and that of f-clip pads was 1.7 years; when the change in thickness was 10%, the lifetime of both e-clip and f-clip pads was 2.6 years. The results obtained in this study for estimating the useful lifetime of rail pads for high speed trains can be used to determine the maintenance and replacement schedule for rail pads.
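The Arrhenius extrapolation underlying such a prediction can be sketched as follows; the aging temperatures and times in the usage below are made-up illustrations, not the study's measurements.

```python
import math

def arrhenius_lifetime(temps_c, hours_to_limit, service_temp_c):
    """Fit ln(t) = ln(A) + (Ea/R) * (1/T) by least squares to
    accelerated-aging results (time for a property, e.g. hardness,
    to change by the chosen limit at each oven temperature), then
    extrapolate the lifetime to the service temperature."""
    xs = [1.0 / (t + 273.15) for t in temps_c]          # 1/T in 1/K
    ys = [math.log(h) for h in hours_to_limit]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))           # Ea / R
    intercept = my - slope * mx                          # ln(A)
    return math.exp(intercept + slope / (service_temp_c + 273.15))
```

Because degradation is thermally activated, the fitted line on the Arrhenius plot (ln t versus 1/T) lets short high-temperature tests stand in for years of service-temperature aging.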

Keywords: rail pads, accelerated test, Arrhenius plot, useful lifetime prediction, mechanical engineering design

Procedia PDF Downloads 317
3473 Status of Artisanal Fishery in Libya

Authors: Esmail Shakman, Khaled Etyab, Ibraheim Taboni, Mohamed Et-wail, Abdallah Ben Abdallah

Abstract:

This study was carried out along the Libyan coast from 1st February to 31st March 2013. More than 120 landing sites were visited in order to investigate their status and fishing activities. The study found that more than 91% of the landing sites were permanent and around 8% were seasonal. The landing sites were mostly harbours (42.86%), followed by protected bays (31.75%) and open beaches (25.4%). Seven types of fishing boats were observed: the flouka type made up the largest percentage (70.06%), followed by mator (18.14%), daghesa (5.97%), lampara (3.28%), batah (1.98%), tarrad (0.41%) and gayag (0.16%). Moreover, the majority of them were concentrated in the western region of the country. The most commonly used fishing gear is the trammel net (about 80%), which is used by flouka, mator, tarrad and batah boats. The use of trammel nets depends on the fishing season, fish size and the target fish species. Other fishing gears are also used, but only occasionally.

Keywords: fishery, South Mediterranean, landing sites, marine biology

Procedia PDF Downloads 513
3472 Research on Architectural Steel Structure Design Based on BIM

Authors: Tianyu Gao

Abstract:

Digital architecture uses computer-aided design, programming, simulation and imaging to create virtual forms and physical structures. Today's customers want to know more about their buildings: they want an automatic thermostat that learns their behaviour and communicates with them, and doors and windows they can open with a mobile app. The architectural display form is therefore more closely related to the customer's experience. With the aim of building informationization, this paper studies steel structure design based on BIM. Taking the Zigan office building in Hangzhou as an example, the work is divided into four parts: the digital design model of the steel structure, node analysis of the steel structure, digital production, and construction of the steel structure. Through the application of BIM software, the architectural design can be coordinated and the building components can be informationized. Not only can design feedback be obtained at an early stage, but the stability of the construction can also be guaranteed. In this way, monitoring of the entire life cycle of the building and meeting customer needs can be realized.

Keywords: digital architectures, BIM, steel structure, architectural design

Procedia PDF Downloads 189
3471 Analyzing Time Lag in Seismic Waves and Its Effects on Isolated Structures

Authors: Faizan Ahmad, Jenna Wong

Abstract:

The time lag between the peak values of horizontal and vertical seismic waves is a well-known phenomenon. Horizontal and vertical seismic waves, secondary and primary waves respectively, travel through different layers of soil, and the travel time depends on the medium of wave transmission. In seismic analysis, many standardized codes do not require the actual vertical acceleration to be part of the analysis procedure; instead, a factored load addition for a particular site is used to capture strength demands under vertical excitation. This study reviews the effects of vertical accelerations on the behavior of a linear rubber-isolated structure under different time lag situations and frequency contents, through the application of historical and simulated ground motions in SAP2000. The response of the structure is reviewed under multiple sets of ground motions, and trends based on time lag and frequency variations are drawn. The accuracy of these results is discussed and evaluated to provide reasoning for the use of real vertical excitations in seismic analysis procedures, especially for isolated structures.
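As a minimal illustration of the time-lag measurement, the sketch below finds the lag between the peak vertical and peak horizontal accelerations in a pair of discretely sampled records (the sample values in the usage are synthetic, not the study's ground motions).

```python
def peak_time_lag(times, horiz, vert):
    """Time of peak |vertical| acceleration minus time of peak
    |horizontal| acceleration; a negative value means the vertical
    peak arrives first (as expected for P waves ahead of S waves)."""
    ih = max(range(len(horiz)), key=lambda i: abs(horiz[i]))
    iv = max(range(len(vert)), key=lambda i: abs(vert[i]))
    return times[iv] - times[ih]
```

Shifting one record relative to the other by such a lag is one simple way to construct the different time-lag situations applied to the isolated structure.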

Keywords: seismic analysis, vertical accelerations, time lag, isolated structures

Procedia PDF Downloads 329