Search results for: declarative specification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 272

62 Computer Modeling and Plant-Wide Dynamic Simulation for Industrial Flare Minimization

Authors: Sujing Wang, Song Wang, Jian Zhang, Qiang Xu

Abstract:

Flaring emissions during abnormal operating conditions such as plant start-ups, shut-downs, and upsets in the chemical process industries (CPI) are usually significant. Flare minimization can help CPI plants save raw material and energy and improve local environmental sustainability. In this paper, a systematic methodology based on plant-wide dynamic simulation is presented for CPI plant flare minimization under abnormal operating conditions. Since off-specification emission sources are inevitable during abnormal operating conditions, significantly reducing flaring emissions in a CPI plant requires that these streams be either recycled to the upstream process for online reuse, or stored temporarily for reprocessing once the plant returns to stable operation. Thus, the off-spec products can be reused instead of being flared. This is achieved by identifying viable design and operational strategies for normal and abnormal operations through plant-wide dynamic scheduling, simulation, and optimization. The proposed study includes three stages of simulation work: (i) developing and validating a steady-state model of a CPI plant; (ii) transferring the steady-state plant model to the dynamic modeling environment, then refining and validating the plant dynamic model; and (iii) developing flare minimization strategies for abnormal operating conditions of a CPI plant via the validated plant-wide dynamic model. This cost-effective methodology has two main merits: (i) it employs large-scale dynamic modeling and simulation for industrial flare minimization, involving various unit models covering hundreds of CPI plant facilities; (ii) it addresses critical abnormal operating conditions of CPI plants such as plant start-up and shut-down. Two virtual case studies on flare minimization for the start-up operation (over 50% emission savings) and shut-down operation (over 70% emission savings) of an ethylene plant demonstrate the efficacy of the proposed study.
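
The start-up savings above come from routing off-spec product away from the flare; a minimal sketch of that material-balance idea (the decay profile, rates, and recycle capacity are invented for illustration, not taken from the ethylene plant study):

```python
# During start-up, off-spec product is generated at a decaying rate (t/h).
# A recycle line can absorb up to `recycle_cap` t/h; only the excess is flared.

def flared_total(offspec_rates, recycle_cap):
    """Sum the hourly off-spec flow exceeding recycle capacity (the flared part)."""
    return sum(max(q - recycle_cap, 0.0) for q in offspec_rates)

# Hypothetical decaying off-spec profile over a 10-hour start-up.
profile = [20.0 * 0.7 ** t for t in range(10)]

baseline = sum(profile)                                # no recycle: all flared
with_recycle = flared_total(profile, recycle_cap=8.0)  # recycle absorbs the rest
savings = 1 - with_recycle / baseline                  # fractional emission saving
```

With these made-up numbers the recycle line already cuts flared mass by more than half, echoing the order of magnitude the case studies report.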

Keywords: flare minimization, large-scale modeling and simulation, plant shut-down, plant start-up

Procedia PDF Downloads 289
61 Specification and Unification of All Fundamental Forces Exist in Universe in the Theoretical Perspective – The Universal Mechanics

Authors: Surendra Mund

Abstract:

At the beginning, the physical entity force was defined mathematically by Sir Isaac Newton in his Principia Mathematica as F = dp/dt, in the form of his second law of motion. Newton also defined his universal law of gravitational force in the same outstanding book. At the end of the 20th century and the beginning of the 21st century, many attempts were made to specify and unify the four or five fundamental forces or interactions that exist in the universe, but every attempt failed, with gravity creating problems in the unification each time. In my previous papers and presentations, I defined and derived field and force equations for gravitation-like interactions for every kind of central system. This force is named Variational Force, and it is generated by variation in the scalar field density around the body. In this paper, I first specify which types of interactions are fundamental in the universal sense (in all types of central systems or bodies predicted by my N-time Inflationary Model of the Universe) and then unify them in the universal framework (defined and derived by me as Universal Mechanics in a separate paper). This will also be valid in the universal dynamical sense, which includes inflations and deflations of the universe, central system relativity, universal relativity, ϕ-ψ transformation and transformation of spin, the physical perception principle, the Generalized Fundamental Dynamical Law, and many other important generalized principles of Generalized Quantum Mechanics (GQM) and Central System Theory (CST). So, in this article, I first generalize some fundamental principles, then unify Variational Forces (the general form of gravitation-like interactions) and Flow Generated Forces (the general form of EM-like interactions), and then unify all fundamental forces by specifying the weak and strong interactions in terms of more basic interactions: variational, flow generated, and transformational.

Keywords: Central System Force, Disturbance Force, Flow Generated Forces, Generalized Nuclear Force, Generalized Weak Interactions, Generalized EM-Like Interactions, Imbalance Force, Spin Generated Forces, Transformation Generated Force, Unified Force, Universal Mechanics, Uniform And Non-Uniform Variational Interactions, Variational Interactions

Procedia PDF Downloads 22
60 The Model of Learning Centre on OTOP Production Process Based on Sufficiency Economic Philosophy for Sustainable Life Quality

Authors: Napasri Suwanajote

Abstract:

The purposes of this research were to analyse and evaluate the success factors in the OTOP production process in order to develop a learning centre on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The research was designed as a qualitative study gathering information from 30 OTOP producers in Bangkontee District, Samudsongkram Province. They were all interviewed on three main parts. Part 1 covered the production process, including 1) production, 2) product development, 3) community strength, 4) marketing possibility, and 5) product quality. Part 2 evaluated the success factors, including 1) analysis of the success factors, 2) evaluation of the strategy based on the Sufficiency Economic Philosophy, and 3) the model of a learning centre on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The results showed that production did not affect the environment and had the potential to continue standard-quality production, using domestic raw materials. On the aspect of product and community strength over the past year, it was found that there was no appropriate packaging showing product identity according to global market standards; the producers needed training on packaging, especially for food and drink products. On the aspect of product quality and product specification, the products were certified by the local OTOP standard, and a responsible organization should help the uncertified producers pass the standard. However, there was a problem of food contamination hazardous to consumers; the producers should cooperate with the government sector or educational institutes involved in food processing to reach the FDA standard. The results from the small-group discussion showed that the community expected higher education and a better standard of living. Problems reported by the community included informal debt and drugs in the community. Eight steps were identified for developing the model of a learning centre on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality.

Keywords: production process, OTOP, sufficiency economic philosophy, marketing management

Procedia PDF Downloads 210
59 Assessing the Impact of Climate Change on Pulses Production in Khyber Pakhtunkhwa, Pakistan

Authors: Khuram Nawaz Sadozai, Rizwan Ahmad, Munawar Raza Kazmi, Awais Habib

Abstract:

Climate change and crop production are intrinsically associated with each other. Therefore, this research study was designed to assess the impact of climate change on pulses production in the southern districts of Khyber Pakhtunkhwa (KP) Province of Pakistan. Two pulses (chickpea and mung bean) were selected for this study. Climatic variables such as temperature, humidity, and precipitation, along with pulses production and the area under pulses cultivation, were the major variables of this study. Secondary data on the climatic and crop variables for the period 1986-2020 were obtained from the Pakistan Meteorological Department and the Agriculture Statistics of KP, respectively. Panel data sets for the chickpea and mung bean crops were estimated separately, and the analysis validated that both were balanced panel data. The Hausman specification test was run separately for both panel data sets; its findings suggested that the fixed effects model was appropriate for the chickpea panel data, while the random effects model was appropriate for the mung bean panel data. The major findings confirm that maximum temperature is statistically significant for chickpea yield: if the maximum temperature increases by 1 °C, chickpea yield increases by 0.0463 units. However, the impact of precipitation was insignificant. Furthermore, humidity was statistically significant and positively associated with chickpea yield. For mung bean, the minimum temperature contributed significantly to yield. This study concludes that temperature and humidity can significantly enhance pulses yield. It is recommended that the capacity of pulses growers be built to adapt to climate change. Moreover, the government should ensure the availability of climate-change-resistant varieties of pulses to encourage pulses cultivation.
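
The fixed effects model selected by the Hausman test can be illustrated with a tiny "within" (entity-demeaning) estimator; the districts, temperatures, and yields below are invented, with the slope set to the 0.0463 reported above so the estimator recovers it:

```python
# Within (fixed-effects) estimator: de-mean x and y per entity,
# then compute the pooled OLS slope on the demeaned data.

def within_slope(panel):
    """panel: {entity: [(x, y), ...]}; returns the fixed-effects slope."""
    xs, ys = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        xs += [x - mx for x, _ in obs]
        ys += [y - my for _, y in obs]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Hypothetical two-district panel: yield = district effect + 0.0463 * max_temp.
panel = {
    "district_a": [(t, 1.0 + 0.0463 * t) for t in (30, 32, 34, 36)],
    "district_b": [(t, 2.5 + 0.0463 * t) for t in (29, 31, 33, 35)],
}
slope = within_slope(panel)
```

In the noise-free toy data the within transformation removes each district's fixed effect exactly, so the pooled slope equals the true coefficient.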

Keywords: climate change, pulses productivity, agriculture, Pakistan

Procedia PDF Downloads 16
58 Laboratory Investigation of the Pavement Condition in Lebanon: Implementation of Reclaimed Asphalt Pavement in the Base Course and Asphalt Layer

Authors: Marinelle El-Khoury, Lina Bouhaya, Nivine Abbas, Hassan Sleiman

Abstract:

The road network in the north of Lebanon is a prime example of the lack of pavement design and execution in Lebanon. These roads show major distresses and hence should be tested and evaluated. The aim of this research is to investigate and determine the deficiencies in road surface design in Lebanon, and to propose an environmentally friendly asphalt mix design. This paper consists of several parts: (i) evaluating pavement performance and structural behavior, (ii) identifying the distresses using visual examination followed by laboratory tests, (iii) deciding on the optimal solution, whether rehabilitation or reconstruction, and finally, (iv) identifying a sustainable method that uses recycled material in the proposed mix. The asphalt formula contains Reclaimed Asphalt Pavement (RAP) in both the base course layer and the asphalt layer. Visual inspection of the roads in Tripoli shows that these roads face a high level of distress severity; consequently, the pavement should be reconstructed rather than simply rehabilitated. Coring was done to determine the pavement layer thicknesses. The results were compared to the American Association of State Highway and Transportation Officials (AASHTO) design methodology and showed that the existing asphalt thickness is lower than the required asphalt thickness. Prior to the pavement reconstruction, the road materials were tested according to American Society for Testing and Materials (ASTM) specifications to identify whether the materials were suitable. Accordingly, the ASTM tests performed on the base course were sieve analysis, Atterberg limits, modified Proctor, Los Angeles abrasion, and California Bearing Ratio (CBR) tests. Results show a CBR value higher than 70%; hence, these aggregates could be used as a base course layer. The asphalt layer was also tested, and the results of the Marshall flow and stability tests meet the ASTM specifications. In the last section, an environmentally friendly mix is proposed. An optimal RAP percentage of 30%, which produced a well-graded base course and asphalt mix, was determined through a series of trials.
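
The 30% RAP trial mixes can be checked with simple gradation blending: the combined percent passing on each sieve is the weighted average of the RAP and virgin aggregate gradations (the sieve sizes and percentages below are illustrative, not the study's laboratory data):

```python
# Blend two gradations: percent passing per sieve, weighted by RAP fraction.

def blend(rap, virgin, rap_frac):
    """Combined percent passing for a rap_frac : (1 - rap_frac) blend."""
    return {s: rap_frac * rap[s] + (1 - rap_frac) * virgin[s] for s in rap}

# Hypothetical percent-passing values for RAP and virgin aggregate.
rap    = {"25mm": 100, "19mm": 95, "4.75mm": 60, "0.075mm": 8}
virgin = {"25mm": 100, "19mm": 90, "4.75mm": 50, "0.075mm": 4}

combined = blend(rap, virgin, rap_frac=0.30)
```

The blended curve would then be checked against the relevant gradation envelope before running CBR and Marshall tests on the trial mix.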

Keywords: asphalt mix, reclaimed asphalt pavement, California bearing ratio, sustainability

Procedia PDF Downloads 95
57 The Effect of Cross-Curriculum of L1 and L2 on Elementary School Students’ Linguistic Proficiency: To Sympathize with Others

Authors: Reiko Yamamoto

Abstract:

This paper reports on a project to integrate Japanese (as a first language) and English (as a second language) education. The study focuses on the mutual effects of the two languages on the linguistic proficiency of elementary school students. The research team consisted of elementary school teachers and researchers at a university. The participants in the experiment were students in the 3rd through 6th grades at an elementary school. The research process consisted of seven steps: 1) specifying linguistic proficiency; 2) developing the cross-curriculum of L1 and L2; 3) forming can-do statements; 4) creating a self-evaluation questionnaire; 5) administering the self-evaluation questionnaire at the beginning of the school year; 6) instructing L1 and L2 based on the curriculum; and 7) administering the self-evaluation questionnaire again at the beginning of the next school year. In Step 1, the members of the research team brainstormed ways to specify elementary school students' linguistic proficiency as it can be observed in various scenes. It was revealed that the teachers evaluate their students' linguistic proficiency not only on the basis of the students' utterances but also on their non-verbal communication abilities. This led to the idea that the competency for understanding others' minds through the use of physical movement or bodily senses in communication in L1 – to sympathize with others – can be transferred to the same competency in communication in L2. Based on the specification of the linguistic proficiency that L1 and L2 have in common, a cross-curriculum of L1 and L2 was developed in Step 2. In Step 3, can-do statements based on the curriculum were formed, building on the action-oriented approach of the Common European Framework of Reference for Languages (CEFR) used in Europe.
A self-evaluation questionnaire consisting of the main can-do statements was given to the students between 3rd grade and 6th grade at the beginning of the school year (Step 4 and Step 5), and all teachers gave L1 and L2 instruction based on the curriculum to the students for one year (Step 6). The same questionnaire was given to the students at the beginning of the next school year (Step 7). The results of statistical analysis proved the enhancement of the students’ linguistic proficiency. This verified the validity of developing the cross-curriculum of L1 and L2 and adapting it in elementary school. It was concluded that elementary school students do not distinguish between L1 and L2, and that they just try to understand others’ minds through physical movement or senses in any language.
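
Steps 5 and 7 yield a pre/post comparison of the questionnaire scores; a minimal sketch of that gain computation (the can-do scores below are invented, not the study's data):

```python
# Mean can-do self-ratings per grade cohort (1-5 scale),
# at the start of the school year vs. one year later.

def mean(values):
    return sum(values) / len(values)

pre  = [2.1, 2.4, 3.0, 2.8, 2.2]   # hypothetical cohort means, year start
post = [2.6, 2.9, 3.4, 3.1, 2.9]   # the same cohorts one year later

gains = [b - a for a, b in zip(pre, post)]
avg_gain = mean(gains)
```

A real analysis would add a significance test on top of the raw gains, as the study's statistical analysis does.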

Keywords: cross curriculum of L1 and L2, elementary school education, language proficiency, sympathy with others

Procedia PDF Downloads 414
56 Impact of Import Restriction on Rice Production in Nigeria

Authors: C. O. Igberi, M. U. Amadi

Abstract:

This research paper on the impact of import restriction on rice production in Nigeria aims at proffering valid solutions to the age-old problem of rice self-sufficiency through a better understanding of policy measures used in the past, in this case, the effectiveness of the rice import restriction of the early 1990s. It tries to answer two questions: whether import restriction boosts domestic rice production, and what the macroeconomic determining factors of Gross Domestic Rice Product (GDRP) are. The research probe is investigated through literature and analytical frameworks, such that time series data on the GDRP, Gross Fixed Capital Formation (GFCF), average foreign rice producers' prices (PPF), domestic producers' prices (PPN), and the labour force (LABF) are collated for analysis (with an import restriction dummy variable, POL1). The research objectives/hypotheses are analysed using cointegration, Vector Error Correction Model (VECM), Impulse Response Function (IRF), and Granger Causality Test (GCT) methodologies. Results show that in the short-run error correction specification for GDRP, a 1% deviation away from the long-run equilibrium in a current quarter is corrected by only 0.14% in the subsequent quarter. Also, the rice import restriction policy had no significant effect on the GDRP in this period, although the policy period did have effects on the PPN and LABF. The chosen variables are valid macroeconomic factors that explain the GDRP of Nigeria, as adduced from the IRF and GCT, and in the long run. Policy recommendations suggest that import restriction is not disqualified as a veritable tool for improving domestic rice production; rather, better enforcement procedures and strict adherence to the policy dictates are needed. Furthermore, accompanying policies which drive public and private capital investment and accumulation must be introduced. Also, the employment rate and labour substitution in the agricultural sector should not be drastically changed; rather, their welfare and efficiency should be improved.
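
The short-run correction reported for GDRP can be read as an error-correction adjustment speed; on one reading of the abstract (an assumption, as the wording is ambiguous), a deviation from long-run equilibrium shrinks by 0.14% of itself each quarter:

```python
# Error-correction sketch: each quarter, the deviation from long-run
# equilibrium is reduced by `speed` times the current deviation.

def remaining_deviation(deviation, speed, quarters):
    for _ in range(quarters):
        deviation -= speed * deviation
    return deviation

# A 1% deviation, adjustment speed 0.0014 (i.e. 0.14% per quarter).
after_one_quarter = remaining_deviation(1.0, 0.0014, 1)
```

At this speed, convergence back to equilibrium is extremely slow, which is the abstract's point.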

Keywords: import restriction, gross domestic rice production, cointegration, VECM, Granger causality, impulse response function

Procedia PDF Downloads 176
55 Using Serious Games to Integrate the Potential of Mass Customization into the Fuzzy Front-End of New Product Development

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Mass customization is the idea of offering custom products or services to satisfy the needs of each individual customer while maintaining the efficiency of mass production. Technologies like 3D printing and artificial intelligence have many start-ups hoping to capitalize on this dream of creating personalized products at an affordable price, and well-established companies scrambling to innovate and maintain their market share. However, the majority of them are failing as they struggle to answer one key question: where does customization make sense? Customization and personalization only make sense where the value of the perceived benefit outweighs the cost to implement it; in other words, will people pay for it? The Kano Model makes it clear that the answer depends on the product. In products where customization is an inherent need, like prosthetics, mass customization technologies can be highly beneficial. However, for products that already sell as a standard, like headphones, customization is likely only an added bonus, and so the product development team must figure out whether the customers' perception of the added value of this feature will outweigh its premium price tag. This can be done through the use of a 'serious game,' whereby potential customers are given a limited budget to collaboratively buy and bid on potential features of the product before it is developed. If the group chooses to buy customization over other features, then the product development team should implement it into their design. If not, the team should prioritize the features on which the customers have spent their budget. The level of customization purchased can also be translated to an appropriate production method; for example, the most expensive type of customization would likely be free-form design and could be achieved through digital fabrication, while a lower level could be achieved through short-batch production.
Twenty-five teams of final year students from design, engineering, construction and technology tested this methodology when bringing a product from concept through to production specification, and found that it allowed them to confidently decide what level of customization, if any, would be worth offering for their product, and what would be the best method of producing it. They also found that the discussion and negotiations between players during the game led to invaluable insights, and often decided to play a second game where they offered customers the option to buy the various customization ideas that had been discussed during the first game.
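
The buy-and-bid mechanic of the serious game reduces to a spend tally per feature; a toy version (feature names and bid amounts are hypothetical):

```python
from collections import Counter

# Each bid is (feature, amount spent from the shared budget).
bids = [
    ("customization", 30), ("battery_life", 50), ("customization", 25),
    ("noise_cancelling", 40), ("battery_life", 20),
]

spend = Counter()
for feature, amount in bids:
    spend[feature] += amount

# Features ranked by total customer spend: the development priority order.
ranked = [feature for feature, _ in spend.most_common()]
```

In this made-up round, customization is out-bid by battery life, so the team would prioritize battery life and treat customization as a lower-priority feature.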

Keywords: Kano model, mass customization, new product development, serious game

Procedia PDF Downloads 107
54 Language Choice and Language Maintenance of Northeastern Thai Staff in Suan Sunandha Rajabhat University

Authors: Napasri Suwanajote

Abstract:


Keywords: production process, OTOP, sufficiency economic philosophy, language choice

Procedia PDF Downloads 204
53 Cognitive Emotion Regulation Strategies in 9–14-Year-Old Hungarian Children with Neurotypical Development in the Light of the Hungarian Version of Cognitive Emotion Regulation Questionnaire for Children

Authors: Dorottya Horváth, Andras Lang, Diana Varro-Horvath

Abstract:

This research activity and study is part of a major research effort to gain an integrative, neuropsychological, and personality-psychological understanding of Attention Deficit Hyperactivity Disorder (ADHD) and thus improve the specification of diagnostic and therapeutic care. In the past, the neuropsychology section has investigated working memory, executive function, attention, and behavioural manifestations in children. Currently, we are looking for personality-psychological protective factors for ADHD and its symptomatic exacerbation. We hypothesise that secure attachment, adaptive emotion regulation, and high resilience are protective factors. The aim of this study is to measure and report the results of a Hungarian sample on the Cognitive Emotion Regulation Questionnaire for Children (CERQ-k), because before studying groups with different developmental differences, it is essential to know the average scores of groups with neurotypical development. Until now, there was no Hungarian version of the above test, so we used our own translation. This questionnaire was developed to assess children's thoughts after experiencing negative life events. It consists of 4 items per subscale, for a total of 36 items. The response categories for each item range from 1 (almost never) to 5 (almost always). The subscales are self-blame, blaming others, acceptance, planning, positive refocusing, rumination or thought-focusing, positive reappraisal, putting into perspective, and catastrophizing. The data for this study were collected from 120 children aged 9-14 years and analysed using descriptive statistics, reporting the mean and standard deviation values for each age group as well as Cronbach's alpha to test the reliability of the questionnaire. The results showed that the questionnaire is a reliable and valid measurement instrument in a Hungarian sample as well. These developments and results allow the use of a Hungarian version of the Cognitive Emotion Regulation Questionnaire for Children and pave the way for the study of different developmental groups, such as children with learning disabilities and/or ADHD.
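
Cronbach's alpha, the reliability statistic the study reports, can be computed from scratch on a toy item-response matrix (the five respondents and four items below are invented; the real CERQ-k has 36 items on a 1-5 scale):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals),
# using the sample (n-1) variance throughout.

def variance(values):
    m = sum(values) / len(values)
    return sum((x - m) ** 2 for x in values) / (len(values) - 1)

def cronbach_alpha(rows):
    """rows: one list of item scores per respondent."""
    k = len(rows[0])
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 5 respondents x 4 items, scored 1-5.
rows = [[4, 5, 4, 4], [2, 2, 3, 2], [5, 4, 5, 5], [3, 3, 2, 3], [1, 2, 1, 2]]
alpha = cronbach_alpha(rows)
```

Values above roughly 0.7 are conventionally read as acceptable reliability; here the made-up responses are highly consistent, so alpha comes out very high.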

Keywords: neurotypical development, emotion regulation, negative life events, CERQ-k, Hungarian average scores

Procedia PDF Downloads 41
52 The Relationship between Central Bank Independence and Inflation: Evidence from Africa

Authors: R. Bhattu Babajee, Marie Sandrine Estelle Benoit

Abstract:

The past decades have witnessed a considerable institutional shift towards central bank independence across the economies of the world. The motivation behind such a change is the acceptance that increased central bank autonomy has the power to alleviate inflation bias. Hence, whether Central Bank Independence (CBI) acts as a significant factor behind price stability in African economies, or whether this macroeconomic aim results from other economic, political, or social factors, is a pertinent issue. The main research objective of this paper is to assess the relationship between central bank autonomy and inflation in African economies, where inflation has proved to be a serious problem. We measure the degree of CBI in Africa by computing the turnover rates of central bank governors, thereby studying whether decisions made by African central banks are affected by external forces. The purpose of this study is to investigate empirically the association between CBI and inflation for 10 African economies over a period of 17 years, from 1995 to 2012. The sample includes Botswana, Egypt, Ghana, Kenya, Madagascar, Mauritius, Mozambique, Nigeria, South Africa, and Uganda. In contrast to much of the empirical literature, we do not use the usual static panel model, for it is associated with potential misspecification arising from the absence of dynamics. Instead, a dynamic panel data model which integrates several control variables is used. Firstly, the analysis includes dynamic terms to capture the tenacity of inflation: given the likely inflation inertia in African countries, lagged inflation must be included in the empirical model. Secondly, due to the known reverse causality between CBI and inflation, the system generalized method of moments (GMM) is employed; with GMM estimators, the presence of unknown forms of heteroskedasticity as well as autocorrelation in the error term is admissible. Thirdly, control variables are used to enhance the efficiency of the model. The main finding of this paper is that central bank independence is negatively associated with inflation, even after including control variables.
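
The CBI proxy used above, the governor turnover rate, is simply the number of governor changes divided by the years observed; a sketch (the change years are invented for one hypothetical country):

```python
def turnover_rate(change_years, start, end):
    """Central bank governor changes per year over the inclusive window [start, end]."""
    years = end - start + 1
    return sum(1 for y in change_years if start <= y <= end) / years

# Hypothetical governor changes over the sample window.
tor = turnover_rate([1997, 2001, 2006, 2010], start=1995, end=2012)
```

Higher turnover is read as lower independence, since frequent replacement of governors suggests political interference in the central bank.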

Keywords: central bank independence, inflation, macroeconomic variables, price stability

Procedia PDF Downloads 345
51 New Drug Discoveries and Packaging Challenges

Authors: Anupam Chanda

Abstract:

Packaging presently plays a significant role in drug discoveries. The process of selecting materials and the type of packaging also offers an opportunity for the packaging scientist to look for biological delivery choices. Most injectable protein products are supplied in some sort of glass vial, prefilled syringe, or cartridge. For products with high pH, there is a risk of delamination from the inner surface of the glass vial. With protein-based drugs, the biggest issue is the effect of packaging derivatives on the protein's three-dimensional and surface structure: any effects that relate to denaturation or aggregation of the protein due to oxidation or interactions with contaminants or impurities in the preparation. The potential for these effects needs to be carefully considered in choosing the container and the container closure system to avoid putting patients in jeopardy.

Causes of delamination:
- Formulations with a high pH, including phosphate and citrate buffers, increase the risk of glass delamination.
- High alkali content in the glass can accelerate erosion.
- High temperature during the vial-forming process increases the risk of glass delamination.
- Terminal sterilization (irradiation at 20-40 kGy for 150 min) is also a risk factor for specific products (veterinary parenteral administration) and can cause delamination.
- High product-storage temperatures and long exposure times can increase the rate and severity of glass delamination.

How to prevent delamination:
- Treating the surface of the glass vials with materials such as ammonium sulfate, or siliconization, can reduce the rate of glass erosion.
- Consider alternative sterilization methods only in rare cases.
- Use the correct specification for the glass to ensure its suitability for the pH of the product.
- Use cyclic olefin copolymer (COC) or cyclic olefin polymer (COP).

Adsorption of protein, and solutions:
- Option 1: coat with linear methoxylated polyglycerol or hyperbranched methoxylated polyglycerol.
- Option 2: the hyperbranched non-methoxylated coating performed best.
- Option 3: coat with hyperbranched polyglycerol.
- Option 4: right selection of the sterilization method for the glass vial/syringe.

Keywords: delamination of glass, protein adsorption on the glass surface, extractable and leachable solutions, injectable designs for new drugs

Procedia PDF Downloads 75
50 A Method to Ease the Military Certification Process by Taking Advantage of Civil Standards in the Scope of Human Factors

Authors: Burcu Uçan

Abstract:

The certification approach differs between civil and military projects in aviation. The sets of criteria and standards created by airworthiness authorities for determining the certification basis are distinct. While civil standards are clearer and easier to interpret, because they not only include detailed specifications but are also supported by guidance material such as Advisory Circulars, military criteria do not provide this level of guidance. Therefore, specifications that are more negotiable, and sometimes more difficult to reconcile, arise for the certification basis of a military aircraft. This study investigates a method for developing a military specification set by taking advantage of civil standards, with regard to the European Military Airworthiness Certification Criteria (EMACC), which establish the airworthiness criteria for aircraft systems. Airworthiness Certification Criteria (MIL-HDBK-516C) is a handbook published for guidance that contains qualitative evaluation criteria for military aircraft, while Certification Specifications (CS-29) are published for civil aircraft by the European Union Aviation Safety Agency (EASA). The method compares and contrasts the specifications that MIL-HDBK-516C and CS-29 contain within the scope of Human Factors. Human Factors supports human performance and aims to improve system performance by drawing on knowledge from a range of scientific disciplines. Human Factors focuses on how people perform their tasks and on reducing the risk of accidents occurring due to human physical and cognitive limitations. Hence, regardless of whether the project is civil or military, the specifications must be guided at a certain level by taking human limits into account. This study presents an advisory method for this purpose. The method develops a solution for the military certification process by identifying the CS requirement corresponding to each criterion in MIL-HDBK-516C by means of the EMACC.
Thus, it eases understanding the expectations of the criteria and establishing derived requirements. As a result of this method, it may not always be necessary to derive new requirements. Instead, it is possible to add remarks that make the expectations of the criteria, and the required verification methods, more comprehensible for all stakeholders. This study contributes to creating a certification basis for military aircraft, on which it is otherwise difficult and time-consuming for stakeholders to agree due to gray areas in the certification process.
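
The cross-referencing step of the method can be pictured as a simple lookup table that links each military criterion, via an EMACC entry, to its civil counterpart and an advisory remark. All clause identifiers and remarks below are illustrative placeholders, not taken from the actual handbooks:

```python
# Hypothetical sketch of the cross-reference step: map a MIL-HDBK-516C
# criterion to a corresponding CS-29 specification via an EMACC entry.
# Every identifier and remark here is an illustrative placeholder.

# criterion id -> (EMACC link, corresponding CS paragraph, remark)
CROSS_REFERENCE = {
    "516C-HF-001": ("EMACC-HF-A", "CS 29.771", "Pilot compartment controls within reach limits"),
    "516C-HF-002": ("EMACC-HF-B", "CS 29.773", "Pilot compartment view unobstructed"),
}

def resolve(criterion_id: str) -> dict:
    """Return the civil paragraph and advisory remark for a military criterion."""
    emacc, cs_paragraph, remark = CROSS_REFERENCE[criterion_id]
    return {"criterion": criterion_id, "via": emacc, "cs": cs_paragraph, "remark": remark}
```

A table of this shape lets reviewers attach remarks to a criterion instead of deriving new requirements, as the abstract suggests.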

Keywords: human factors, certification, aerospace, requirement

Procedia PDF Downloads 43
49 Analytical Performance of Cobas C 8000 Analyzer Based on Sigma Metrics

Authors: Sairi Satari

Abstract:

Introduction: Six Sigma is a metric that quantifies the performance of processes as a rate of defects per million opportunities. Sigma methodology can be applied in the chemical pathology laboratory to evaluate process performance, providing evidence for process improvement in the quality assurance program. In the laboratory, these methods have been used to improve the timeliness of troubleshooting, reduce the cost and frequency of quality control, and minimize pre- and post-analytical errors. Aim: The aim of this study is to evaluate the sigma values of the Cobas 8000 analyzer against the minimum requirements of the specification. Methodology: Twenty-one analytes were chosen for this study: alanine aminotransferase (ALT), albumin, alkaline phosphatase (ALP), amylase, aspartate transaminase (AST), total bilirubin, calcium, chloride, cholesterol, HDL-cholesterol, creatinine, creatinine kinase, glucose, lactate dehydrogenase (LDH), magnesium, potassium, protein, sodium, triglyceride, uric acid and urea. Total allowable error was obtained from the Clinical Laboratory Improvement Amendments (CLIA). Bias was calculated from the end-of-cycle report of the Royal College of Pathologists of Australasia (RCPA) cycle from July to December 2016, and the coefficient of variation (CV) from six months of internal quality control (IQC). Sigma was calculated using the formula: Sigma = (Total Error - Bias) / CV. Analytical performance was evaluated based on the sigma value: sigma > 6 is world class, sigma > 5 is excellent, sigma > 4 is good, sigma between 3 and 4 is satisfactory, and sigma < 3 is poor performance. Results: Based on the calculation, we found that 76% are world class (ALT, albumin, ALP, amylase, AST, total bilirubin, cholesterol, HDL-cholesterol, creatinine, creatinine kinase, glucose, LDH, magnesium, potassium, triglyceride and uric acid), 14% are excellent (calcium, protein and urea), and 10% (chloride and sodium) require more frequent IQC per day.
Conclusion: Based on this study, we found that IQC needs to be performed more frequently only for chloride and sodium to ensure accurate and reliable analysis for patient management.
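
The sigma-metric computation described above is straightforward to reproduce. A minimal sketch, using the formula from the abstract with illustrative input values (not the study's data):

```python
# Sigma metric: sigma = (TEa - |bias|) / CV, with TEa (total allowable
# error) from CLIA, bias from EQA (RCPA), and CV from six months of IQC.
# All inputs are percentages; the example values are illustrative only.

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Return the sigma value for one analyte."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def classify(sigma: float) -> str:
    """Map a sigma value to the performance bands used in the study."""
    if sigma > 6:
        return "world class"
    if sigma > 5:
        return "excellent"
    if sigma > 4:
        return "good"
    if sigma > 3:
        return "satisfactory"
    return "poor"

# Example: TEa 10%, bias 1.5%, CV 1.2%
s = sigma_metric(10.0, 1.5, 1.2)  # ≈ 7.08, i.e. world class
```

Analytes landing below the satisfactory band are the ones flagged for more frequent IQC.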

Keywords: sigma metrics, analytical performance, total error, bias

Procedia PDF Downloads 147
48 Steps toward the Support Model of Decision-Making in Hungary: The Impact of the Article 12 of the UN Convention on the Rights of Persons with Disabilities on the Hungarian National Legislation

Authors: Szilvia Halmos

Abstract:

Hungary was one of the first countries to sign and ratify the UN Convention on the Rights of Persons with Disabilities (hereinafter: CRPD). Consequently, Hungary assumed an obligation under international law to review its national law in the light of Article 12 of the CRPD, which requires the States Parties to guarantee the equality of persons with disabilities in terms of legal capacity and to replace regimes of substitute decision-making with instruments of supported decision-making. This article is often characterized as one of the key norms of the CRPD, since the legal autonomy of persons with disabilities is an essential precondition of their participation in social life on an equal basis with others, as envisaged by the social paradigm of disability. This paper examines the impact of the CRPD on the relevant Hungarian national legal norms, with special focus on the relevant rules of the recently codified Civil Code. The employed research methodologies include (1) the specification of the implementation requirements imposed by Article 12 of the CRPD, (2) the determination of indicators of appropriate implementation, and (3) the critical analysis of the compliance of the relevant Hungarian legal regulation with these indicators, with respect to (4) the relevant case law of the Hungarian Constitutional Court and ordinary courts, the European Court of Human Rights and the Committee on the Rights of Persons with Disabilities, and (5) the available empirical figures on the functioning of substitute and supported decision-making regimes. It will be established that the new Civil Code has taken large steps toward the equality of persons with disabilities in terms of legal capacity and toward the support model of decision-making, through the introduction of some specific instruments of supported decision-making and the restriction of the application of guardianship.
Nevertheless, the regulation currently in effect fails to reflect some crucial principles of Article 12 of the CRPD, such as the non-discrimination of persons with psycho-social disabilities and support for articulating the will and preferences of the individual, rather than his/her best interest, in the course of decision-making. The changes in the practice of the substitute and support models brought about by the new legal norms can be assessed as significant but, so far, unsatisfactory. The number of registered supporters is rather low, and the preconditions for the effective functioning of support (e.g. the proper training of supporters) are not ensured.

Keywords: Article 12 of the UN CRPD, Hungarian law on legal capacity, persons with intellectual and psycho-social disabilities, supported decision-making

Procedia PDF Downloads 265
47 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from Image Processing View Point

Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee

Abstract:

Human skin, being at a temperature above absolute zero, emits infrared radiation related to body temperature. Differences in the infrared radiation from the skin surface reflect abnormalities present in the human body. Based on these differences, detecting and forecasting temperature variation across the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it as a thermogram. The intensity of the thermogram measures inflammation at the skin surface related to pain in the human body. Analysis of thermograms provides automated detection of anomalies associated with suspicious pain regions by following several image processing steps. The paper presents a rigorous survey of the processing and analysis of thermograms, based on previous work published in the area of infrared thermal imaging for detecting inflammatory pain diseases such as arthritis, spondylosis, shoulder impingement, etc. The study also examines the performance of thermogram processing, together with thermogram acquisition protocols, thermography camera specifications and the types of pain detected by thermography, in a summarized tabular format. The tabular format provides a clear structural view of past work. The major contribution of the paper is a new thermogram acquisition standard for inflammatory pain detection in the human body, intended to enhance the performance rate. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis.
The survey of previous research highlights that intensity-distribution-based comparison of comparable, symmetric regions of interest, together with their statistical analysis, yields adequate results in identifying and detecting physiological disorders related to inflammatory diseases.
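
The symmetric-ROI comparison that the surveyed works rely on can be sketched in a few lines: compare the mean temperature of a suspect region with its contralateral mirror region and flag asymmetry above a threshold. The 0.5 °C threshold and the sample patches below are illustrative assumptions, not values from the surveyed papers:

```python
# Sketch of a symmetric region-of-interest (ROI) comparison on thermogram
# data. ROIs are 2-D lists of temperatures in °C; the asymmetry threshold
# is an assumed illustrative value, not a clinical standard.

def roi_mean(roi):
    """Mean temperature of a region of interest."""
    values = [t for row in roi for t in row]
    return sum(values) / len(values)

def asymmetry_flag(roi_left, roi_right, threshold_c=0.5):
    """Return (delta_t, flagged) for a pair of symmetric ROIs."""
    delta = abs(roi_mean(roi_left) - roi_mean(roi_right))
    return delta, delta > threshold_c

# Example: a 2x2 patch on each side of the body
left = [[33.0, 33.2], [33.1, 33.3]]
right = [[34.0, 34.2], [34.1, 34.3]]
delta_t, flagged = asymmetry_flag(left, right)  # delta_t ≈ 1.0 °C, flagged
```

Real pipelines add segmentation and distribution-level statistics on top of this, but the per-region comparison is the core idea.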

Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis

Procedia PDF Downloads 320
46 Automated, Objective Assessment of Pilot Performance in Simulated Environment

Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt

Abstract:

Nowadays, flight simulators offer tremendous possibilities for safe and cost-effective pilot training through the utilization of powerful computational tools. Yet, with technology outpacing methodology, the vast majority of training-related work is done by human instructors. This makes assessment inefficient and vulnerable to instructors' subjectivity. This research presents an objective assessment tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification, Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission-Task-Elements (MTEs). The core element of the gOAT is an enhanced algorithm that provides the instructor with a new set of information: objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), as well as measurements of changes in the pilot's psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor's station in a dedicated graphical interface. The presented tool is based on open-source solutions and is flexible to edit. Additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used not only allow basic stress-level measurement but also reduce the instructor's workload significantly. The tool can be used for training purposes, as well as for periodic checks of aircrew.
Flexibility and ease of modification allow wide-ranging further development and customization of the tool. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
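
The ADS-33-style scoring of a Mission-Task-Element can be illustrated with a small sketch: recorded samples of a flight parameter are checked against "desired" and "adequate" tolerance bands. The band limits and sample data below are illustrative assumptions, not values from ADS-33 or from the gOAT database:

```python
# Sketch of an ADS-33-style MTE check: score recorded flight-parameter
# samples against desired and adequate tolerance bands. Bands and data
# are illustrative assumptions.

def mte_score(samples, desired, adequate):
    """samples: recorded values; desired/adequate: (low, high) bands.
    Returns the fraction of samples inside each band."""
    in_desired = sum(desired[0] <= s <= desired[1] for s in samples)
    in_adequate = sum(adequate[0] <= s <= adequate[1] for s in samples)
    n = len(samples)
    return in_desired / n, in_adequate / n

# Example: hover-altitude samples (m) against ±1 m desired, ±2 m adequate
samples = [10.2, 10.8, 9.5, 12.1, 10.0]
d, a = mte_score(samples, desired=(9.0, 11.0), adequate=(8.0, 12.0))
```

An automated tool computes such fractions per MTE from the simulator log and fuses them with the physiological measurements before presenting the result to the instructor.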

Keywords: automated assessment, flight simulator, human factors, pilot training

Procedia PDF Downloads 124
45 The Impact of Climate Change on Typical Material Degradation Criteria over Timurid Historical Heritage

Authors: Hamed Hedayatnia, Nathan Van Den Bossche

Abstract:

Understanding the ways in which climate change accelerates or slows down the process of material deterioration is the first step towards assessing adaptive approaches for the conservation of historical heritage. Analysis of the effects of climate change on degradation risk assessment parameters, such as freeze-thaw cycles and wind erosion, is also key when considering mitigating actions. Given the vulnerability of cultural heritage to climate change, the impact of this phenomenon on material degradation criteria, with a focus on brick masonry walls in Timurid heritage located in Iran, was studied. The Timurids were the final great dynasty to emerge from the Central Asian steppe. Through their patronage, the eastern Islamic world, especially Mashhad (in north-eastern Iran) and Herat, became a prominent cultural center. Goharshad Mosque is a mosque in Mashhad, in the Razavi Khorasan Province of Iran. It was built by order of Empress Goharshad, the wife of Shah Rukh of the Timurid dynasty, in 1418 CE. Choosing an appropriate regional climate model was the first step. The outputs of two different climate models, ALARO-0 and REMO, were analyzed to find out which model is better adapted to the area. To validate the quality of the models, a comparison between model data and observations was made in 4 different climate zones in Iran over a period of 30 years. The impacts of the projected climate change were evaluated up to 2100. To determine the material specification of Timurid bricks, standard brick samples from a Timurid mosque were studied. Tests determining the water absorption coefficient, diffusion properties, real density, and total porosity were performed to characterize the specifications of the brick masonry walls, which are needed for running HAM simulations.
Results from the analysis showed that the threatening factors differ between climate zones, but the most influential factors across Iran are extreme temperature increase and erosion. In the north-western region of Iran, one of the key factors is wind erosion. In the north, rainfall erosion and mold growth risk are the key factors. In the north-eastern part, in which our case study is located, the important parameter is wind erosion.
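
One of the degradation indicators named above, the freeze-thaw cycle count, can be derived directly from a daily temperature series. A minimal sketch, using the simplifying assumption that a cycle is any day whose minimum drops below 0 °C while the maximum rises above it (the data are illustrative):

```python
# Sketch of freeze-thaw cycle counting from daily min/max temperatures.
# Counting a cycle whenever t_min < 0 °C < t_max on the same day is a
# simplifying assumption; HAM studies may use material-specific criteria.

def freeze_thaw_cycles(daily_min_max):
    """daily_min_max: list of (t_min, t_max) tuples in °C; returns cycle count."""
    return sum(1 for t_min, t_max in daily_min_max if t_min < 0.0 < t_max)

# Example week: three days cross the freezing point
week = [(-3, 5), (-1, 2), (2, 8), (-4, -1), (0, 6), (-2, 3), (1, 9)]
n_cycles = freeze_thaw_cycles(week)  # 3
```

Applied to the projected model output, such a count shows whether the expected number of damaging cycles per year rises or falls under each climate scenario.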

Keywords: brick, climate change, degradation criteria, heritage, Timurid period

Procedia PDF Downloads 98
44 Detection Kit of Type 1 Diabetes Mellitus with Autoimmune Marker GAD65 (Glutamic Acid Decarboxylase)

Authors: Aulanni’am Aulanni’am

Abstract:

The incidence of Diabetes Mellitus (DM) is progressively increasing and has become a serious problem in Indonesia, one the government has prioritized for action. The longer a person suffers from diabetes, the more likely complications are to develop, particularly in patients whose condition is not well managed. Therefore, early diagnosis in the pre-clinical phase of the disease is needed. In this phase, destruction of pancreatic beta cells is already occurring, beta cell function is declining, and autoimmune reactions associated with beta cell destruction can be detected. Type 1 DM is a multifactorial disease triggered by genetic and environmental factors, which leads to the destruction of pancreatic beta cells. An early marker of beta cell autoreactivity is the synthesis of autoantibodies against a 65-kDa protein, a molecule that can be detected early in the disease pathomechanism. The importance of early diagnosis in the pre-disease phase is to track progression toward the onset of pancreatic beta cell destruction and to take precautions. However, because this examination is very expensive ($150/test), anti-GAD65 antibody testing cannot be carried out routinely in most, or even in all, laboratories in Indonesia. Therefore, producing a rapid test based on recombinant human GAD65 protein with the reverse-flow immunochromatography technique in Indonesia is believed to reduce costs and improve the quality of care for patients with diabetes in Indonesia. The rapid test product is a very simple innovation, suitable for screening and routine testing of GAD65 autoantibodies. In the blood serum of patients with diabetes caused by autoimmunity, the GAD65 autoantibody is the major serologic marker for detecting the autoimmune reaction, because of the stability of its concentration levels. GAD65 autoantibodies can be found up to 10 years before the clinical symptoms of diabetes.
Early diagnosis focuses on detecting the presence of GAD65 autoantibodies, given the test's high specificity and sensitivity. Autoantibodies against GAD65 circulating in the blood are a major indicator of the destruction of the islet cells of the pancreas. Research in collaboration with Biofarma has produced a GAD65-autoantibody-based rapid test; the product has been soft-launched and tested, showing a sensitivity of 100% and a specificity between 90 and 96% compared with the gold standard (an imported product based on the ELISA method).
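
The sensitivity and specificity figures reported above come from a standard confusion-matrix calculation against the ELISA gold standard. A minimal sketch, with illustrative counts that are not the trial's data:

```python
# Sensitivity and specificity from confusion-matrix counts, as used when
# validating a rapid test against a gold standard. The example counts
# below are illustrative, not the study's data.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Example: 40 true positives with no false negatives -> sensitivity 1.0
#          46 true negatives with 4 false positives  -> specificity 0.92
sens = sensitivity(40, 0)
spec = specificity(46, 4)
```

A sensitivity of 1.0 corresponds to the reported 100%, and specificities of 0.90 to 0.96 correspond to the reported 90 to 96% range.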

Keywords: diabetes mellitus, GAD65 autoantibodies, rapid test, sensitivity, specificity

Procedia PDF Downloads 236
43 Measurements for Risk Analysis and Detecting Hazards by Active Wearables

Authors: Werner Grommes

Abstract:

Intelligent wearables (illuminated vests, hand and foot bands, smart watches with a laser diode, Bluetooth smart glasses) flood the market today. They integrate complex electronics and are worn very close to the body. Optical measurements and limitation of the maximum luminance are therefore needed. Smart watches are equipped with a laser diode or measure different body currents. Special glasses generate readable text information that is received via radio transmission. Small high-performance batteries (lithium-ion/polymer) supply the electronics. All these products have been tested and evaluated for risk. They must, for example, meet the requirements for electromagnetic compatibility as well as the requirements for electromagnetic fields affecting humans and implant wearers. Extensive analyses and measurements were carried out for this purpose. Many users are not aware of these risks. The results of this study should serve as a suggestion to do better in the future, or simply to point out these risks. Commercial LED warning vests, LED hand and foot bands, illuminated surfaces with inverters (high voltage), flashlights, smart watches, and Bluetooth smart glasses were checked for risks. The luminance, the electromagnetic emissions in the low-frequency as well as the high-frequency range, audible noises, and irritating flashing frequencies were measured and analyzed. Rechargeable lithium-ion or lithium-polymer batteries can burn or explode under special conditions such as overheating, overcharging, deep discharge, or use outside the temperature specification. The conclusion of this study is that many smart wearables are worn very close to the body, and an extensive risk analysis becomes necessary. Wearers of active implants such as a pacemaker or an implantable cardiac defibrillator must be considered.
If the wearable electronics include switching regulators or inverter circuits, active medical implants in the near field can be disturbed, so a risk analysis is necessary.

Keywords: safety and hazards, electrical safety, EMC, EMF, active medical implants, optical radiation, illuminated warning vest, electric luminescent, hand and head lamps, LED, e-light, safety batteries, light density, optical glare effects

Procedia PDF Downloads 84
42 Hypersonic Propulsion Requirements for Sustained Hypersonic Flight for Air Transportation

Authors: James Rate, Apostolos Pesiridis

Abstract:

In this paper, the propulsion requirements needed to achieve sustained hypersonic flight for commercial air transportation are evaluated. In addition, a design methodology is developed and used to determine the propulsive capabilities of both ramjet and scramjet engines. Twelve configurations are proposed for hypersonic flight, using varying combinations of turbojet, turbofan, ramjet and scramjet engines. The optimal configuration was determined based on how well each configuration met the projected requirements for hypersonic commercial transport. The configurations were separated into four sub-configurations, each comprising three unique derivations. The first sub-configuration comprised four afterburning turbojets and either one or two ramjets idealised for Mach 5 cruise; the number of ramjets required depended on the thrust needed to accelerate the vehicle from the speed at which the turbojets cut out to Mach 5 cruise. The second comprised four afterburning turbojets and either one or two scramjets, similar to the first configuration. The third used four turbojets, one scramjet and one ramjet to aid acceleration from Mach 3 to Mach 5. The fourth configuration was the same as the third but implemented turbofan engines, instead of turbojets, for the preliminary acceleration of the vehicle. From calculations determining the fuel consumption at incremental Mach numbers, this paper found that the ideal solution would require four turbojet engines and two scramjet engines. The ideal mission profile was determined to be an 8000 km sortie, based on averaging popular long-haul routes with strong business ties, including Los Angeles to Tokyo, London to New York and Dubai to Beijing. This paper deemed that these routes would benefit from hypersonic transport links based on the previously mentioned factors.
This paper found that this configuration would be sufficient for the 8000 km flight to be completed in approximately two and a half hours, consuming less fuel than Concorde in doing so. However, this propulsion configuration still results in a greater fuel cost than a conventional passenger aircraft. In this regard, the investigation contributes towards the specification of the engine requirements throughout a mission profile for a hypersonic passenger vehicle. A number of assumptions had to be made for this theoretical approach, but the authors believe it lays the groundwork for appropriately framing the propulsion requirements for sustained hypersonic flight, and provides a methodology and a focus for the development of the propulsion systems that commercial hypersonic air transportation would require.
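
The cruise-time arithmetic behind the "two and a half hours" figure can be sketched in a few lines. The speed-of-sound value and the allowance for climb and acceleration below are assumptions for illustration, not values taken from the paper:

```python
# Back-of-envelope flight-time sketch for a Mach 5 cruise segment over
# an 8000 km sortie. A_CRUISE and the climb/acceleration allowance are
# assumed illustrative values.

A_CRUISE = 295.0  # m/s, approximate speed of sound at high cruise altitude

def cruise_time_hours(distance_km: float, mach: float, climb_accel_hours: float) -> float:
    """Cruise time at the given Mach number plus an allowance for
    climb, acceleration and deceleration phases."""
    v = mach * A_CRUISE                              # cruise speed, m/s
    cruise_h = (distance_km * 1000.0) / v / 3600.0   # cruise leg, hours
    return cruise_h + climb_accel_hours

# 8000 km at Mach 5 with ~1 h assumed for the non-cruise phases
t = cruise_time_hours(8000, 5, 1.0)  # ≈ 2.5 h total
```

At Mach 5 the cruise leg alone takes about 1.5 hours; the remaining hour is the assumed overhead for the lower-speed phases, which is where the turbojet/turbofan stages of the proposed configurations operate.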

Keywords: hypersonic, ramjet, propulsion, scramjet, turbojet, turbofan

Procedia PDF Downloads 290
41 Strategic Public Procurement: A Lever for Social Entrepreneurship and Innovation

Authors: B. Orser, A. Riding, Y. Li

Abstract:

To inform government about how gender gaps in SME (small and medium-sized enterprise) contracting might be redressed, the research question was: what are the key obstacles to, and response strategies for, increasing the engagement of women business owners among SME suppliers to the Government of Canada? Thirty-five interviews were conducted with senior policymakers, supplier diversity organization executives, and expert witnesses to the Canadian House of Commons Standing Committee on Government Operations and Estimates. The qualitative data were analysed using NVivo 11 software. High-order response categories included: (a) SME risk mitigation strategies, (b) SME procurement program design, and (c) performance measures. The primary obstacles cited were government red tape and long, complicated requests for proposals (RFPs). The majority of 'common' complaints occur when SMEs have questions about the federal procurement process. Witness responses included the use of outcome-based rather than prescriptive procurement practices, more agile procurement, simplified RFPs, and making payment within 30 days a procurement priority. Risk mitigation strategies included the provision of procurement officers to assess risks and opportunities for businesses and the development of more agile procurement procedures and processes. Recommendations to enhance program design included: improved definitional consistency of qualifiers and selection criteria; better co-ordination across agencies; clarification of how SME suppliers benefit from federal contracting; goal setting; specification of the categories that are most suitable for women-owned businesses; and increasing primary contractors' awareness of the importance of subcontract relationships. Recommendations also included third-party certification of eligible firms and the need to enhance SMEs' financial literacy to reduce financial errors.
Finally, there remains a need for clear and consistent pre-program statistics to establish baselines (by sector and issuing department), performance measures, and targets based on the percentage of contracts granted, contract value, the percentage of target employees (women, Indigenous people), and community benefits, including the hiring of local employees. The study advances strategies to enhance federal procurement programs to facilitate socio-economic policy objectives.

Keywords: procurement, small business, policy, women

Procedia PDF Downloads 89
40 Evaluation of an Integrated Supersonic System for Inertial Extraction of CO₂ in Post-Combustion Streams of Fossil Fuel Operating Power Plants

Authors: Zarina Chokparova, Ighor Uzhinsky

Abstract:

Carbon dioxide emissions resulting from the burning of fossil fuels at large scale, as in the oil industry or power plants, lead to severe consequences, including global temperature rise, air pollution, and other adverse impacts on the environment. Besides some precarious and costly industrial-scale approaches to mitigating the harm of CO₂ emissions (such as liquefaction of CO₂ and its deep-water treatment, or the application of adsorbents and membranes, all of which require careful consideration of their drawbacks and mitigation thereof), one physically and commercially viable technology for its capture and disposal is a supersonic system for inertial extraction of CO₂ from post-combustion streams. Because the flue gas emitted from the combustion system has a carbon dioxide concentration of only 10-15 volume percent, the waste stream is rather dilute and at low pressure. The supersonic system expands the flue gas mixture through a converging-diverging nozzle; the flow accelerates to supersonic speeds, resulting in a rapid drop in temperature and pressure. This conversion of potential energy into kinetic energy causes desublimation of the CO₂. The solidified carbon dioxide can then be sent to a separate vessel for disposal. The major advantages of this solution are its economic efficiency, physical stability, and the compactness of the system, as well as the fact that no additional chemical media are required. However, several challenges remain in optimizing the system: increasing the size of the separated CO₂ particles (whose effective diameters are on the micrometer scale), reducing the amount of concomitant gas separated together with the carbon dioxide, and ensuring the purity of the CO₂ downstream flow. Moreover, determining the thermodynamic conditions of the vapor-solid mixture, including specification of a valid and accurate equation of state, remains an essential goal.
Because of the high speeds and temperatures reached during the process, the influence of the emitted heat should be considered, and an applicable solution model for the compressible flow needs to be determined. In this report, a brief overview of the current technology status is presented, and a program for further evaluation of this approach is proposed.
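
The temperature drop that drives desublimation follows from the isentropic flow relation T/T0 = (1 + (γ−1)/2·M²)⁻¹. A minimal sketch, with a heat-capacity ratio and stagnation temperature chosen as illustrative values for a flue-gas mixture, not taken from the paper:

```python
# Isentropic static-temperature drop through a converging-diverging
# nozzle: T = T0 / (1 + (gamma - 1)/2 * M^2). The gamma value and the
# stagnation temperature below are illustrative assumptions.

def static_temperature(t0_kelvin: float, mach: float, gamma: float = 1.3) -> float:
    """Static temperature after isentropic expansion to a given Mach number."""
    return t0_kelvin / (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)

# Example: 320 K stagnation temperature expanded to Mach 2
t_static = static_temperature(320.0, 2.0)  # 200.0 K
```

This shows how modest supersonic Mach numbers already push the static temperature far below the inlet value, which is the mechanism the abstract describes; whether CO₂ actually desublimates then depends on its partial pressure in the dilute stream.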

Keywords: CO₂ sequestration, converging diverging nozzle, fossil fuel power plant emissions, inertial CO₂ extraction, supersonic post-combustion carbon dioxide capture

Procedia PDF Downloads 121
39 Comparative Studies and Optimization of Biodiesel Production from Oils of Selected Seeds of Nigerian Origin

Authors: Ndana Mohammed, Abdullahi Musa Sabo

Abstract:

The oils used in this work were extracted from seeds of Ricinus communis, Hevea brasiliensis, Gossypium hirsutum, Azadirachta indica, Glycine max and Jatropha curcas by solvent extraction with n-hexane, giving yields of 48.00±0.00%, 44.30±0.52%, 45.50±0.64%, 47.60±0.51%, 41.50±0.32% and 46.50±0.71%, respectively. However, these feedstocks are challenging for the transesterification reaction because they were found to contain high amounts of free fatty acids (FFA) (6.37±0.18, 17.20±0.00, 6.14±0.05, 8.60±0.14, 5.35±0.07 and 4.24±0.02 mg KOH/g, in the same order). As a result, a two-stage transesterification process was used to produce biodiesel: acid esterification was used to reduce the high FFA content to 1% or less, and the second stage involved alkaline transesterification together with optimization of the process conditions to obtain a high yield of quality biodiesel. The salient features of this study include characterization of the oils using AOAC and AOCS standard methods to reveal properties that may determine the viability of the sample seeds as potential feedstocks for biodiesel production, such as acid value, saponification value, peroxide value, iodine value, specific gravity, kinematic viscosity, and free fatty acid profile. The optimization of process parameters in biodiesel production was investigated. Different concentrations of alkaline catalyst (KOH) (0.25, 0.5, 0.75, 1.0 and 1.50% w/v), methanol/oil molar ratios (3:1, 6:1, 9:1, 12:1 and 15:1), reaction temperatures (50 °C, 55 °C, 60 °C, 65 °C and 70 °C), and stirring rates (150 rpm, 225 rpm, 300 rpm and 375 rpm) were used to determine the optimal conditions for maximum biodiesel yield. While one parameter was being optimized, the other parameters were kept fixed.
The results show optimal biodiesel yield at a catalyst concentration of 1% and a methanol/oil molar ratio of 6:1, except for the oil from Ricinus communis, for which the optimum ratio was 9:1; a reaction temperature of 65 °C was optimal for all samples, as was a stirring rate of 300 rpm, except for the oil from Ricinus communis, for which 375 rpm was optimal. The properties of the biodiesel fuel were evaluated, and the results conformed favorably to the ASTM and EN standard specifications for fossil diesel and biodiesel; the biodiesel produced can therefore be used as a substitute for fossil diesel. The work also reports the results of a study evaluating the effect of biodiesel storage on its physicochemical properties, to ascertain the level of deterioration over time. The values obtained for all samples fall outside the standard specification for biodiesel before the end of the twelve-month test period and are clearly degraded. This suggests that biodiesels from the oils of Ricinus communis, Hevea brasiliensis, Gossypium hirsutum, Azadirachta indica, Glycine max and Jatropha curcas cannot be stored beyond twelve months.
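
The decision behind the two-stage process can be sketched numerically: oils whose free fatty acid content exceeds about 1% need the acid esterification pre-step before alkaline transesterification. The conversion from acid value to %FFA below uses the common approximation FFA (as oleic acid) ≈ AV / 1.99, which is standard chemistry but is not stated in the abstract:

```python
# Sketch of the pre-treatment decision: estimate %FFA from the acid
# value (AV, mg KOH/g) and flag oils needing acid esterification first.
# FFA ≈ AV / 1.99 (as oleic acid) is a common approximation, assumed here.

def ffa_percent(acid_value_mg_koh_per_g: float) -> float:
    """Approximate free fatty acid content (%, as oleic acid)."""
    return acid_value_mg_koh_per_g / 1.99

def needs_acid_pretreatment(acid_value: float, limit_pct: float = 1.0) -> bool:
    """True if the oil's FFA exceeds the alkaline-stage tolerance."""
    return ffa_percent(acid_value) > limit_pct

# Acid values reported in the abstract (mg KOH/g)
acid_values = {"Ricinus communis": 6.37, "Hevea brasiliensis": 17.20,
               "Gossypium hirsutum": 6.14, "Azadirachta indica": 8.60,
               "Glycine max": 5.35, "Jatropha curcas": 4.24}
```

Under this approximation even the lowest acid value (4.24 mg KOH/g, about 2.1% FFA) exceeds the 1% limit, which is consistent with the study applying acid esterification to all six oils.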

Keywords: biodiesel, characterization, esterification, optimization, transesterification

Procedia PDF Downloads 392
38 The Psychological Specification of Motivation of Managerial Activity

Authors: Laura Petrosyan

Abstract:

The high and persistent working results are possible when people are interested in the results of the work. Motivation of working may be present as a psychological complicated phenomena, which determines person's behavior in working process. Researchers point out that working motivation is displayed in three correlated conditions. These are interest in outcomes of work, satisfaction with the work, and the third, is the level of devotion of employee. Solution of the problem of effective staff management depends on the development of workers' skills. Despite, above mentioned problem could be solved by the process of finding methods to induce the employees to the effective work. Motivation of the managerial activity aroused not only during the working process, but also before it starts. During education the future manager obtains many professional skills. However, the experience shows, that only professional skills are not enough for the effective work. Presently, one of the global educational problems is the development of motivation in professions. In psychological literature the fact is mentioned, that the motivation can be inside and outside. Outside motivation is active only at short time. Instead, inside motivation can be active during all process of the professional development. Hence, the motivation of managerial activity might be developed during the education. The future manager choose the profession being under some impression of personal qualities. Detection of future manager’s motivation will influence on the development of syllabuses. Moreover, use of the psychological methods could be evolved for preparing motivated managers. Conducted research has been done in the Public Administration Academy of the RA. The aim of research was to discover students' motivation of profession. 102 master students took part in the research from Public Administration Academy. 
The research used the following methods: a method for identifying a person's motivation to succeed (T. Elers) and a method for studying students' motivation (T. E. Ilyin). The first method is designed to explore a person's motivational orientation toward success, as represented by Hackhausen, and gives the opportunity to reveal the level of motivation to succeed. The second method comprises three scales: i) knowledge achievement, ii) knowledge of the profession, and iii) getting a diploma. These tests yielded quantitative data. Analysis of the survey results shows that among the master's students the average rate of knowledge achievement is at a high level, whereas the average rates of knowledge of the profession and of getting a diploma are not, and are almost equal to each other. In the educational process, the skills students acquire are not yet synthesized with the profession they will practice. The results show that the students have not yet formed a realistic view of the profession.

Keywords: managerial activity, motivation, complex psychological phenomenon, working process, education of future managers

Procedia PDF Downloads 419
37 Assessing Professionalism, Communication, and Collaboration among Emergency Physicians by Implementing a 360-Degree Evaluation

Authors: Ahmed Al Ansari, Khalid Al Khalifa

Abstract:

Objective: Multisource feedback (MSF), also called the 360-degree evaluation, is an evaluation process in which questionnaires are distributed among medical peers and colleagues to assess physician performance from sources other than the attending or supervising physicians. The aim of this study was to design, implement, and evaluate a 360-degree process for assessing emergency physician trainees in the Kingdom of Bahrain. Method: The study was undertaken in Bahrain Defense Force Hospital, a military teaching hospital in the Kingdom of Bahrain. Thirty emergency physicians (the total population of emergency physicians in our hospital) were assessed in this study. We developed an instrument modified from the Physician Achievement Review (PAR) instrument, which has been used to assess physicians in Alberta, and focused it on assessing professionalism, communication skills, and collaboration only. To achieve face and content validity, a table of specifications was constructed, a working group was involved in constructing the instrument, and expert opinion was considered as well. The instrument consisted of 39 items: 15 items to assess professionalism, 13 items to assess communication skills, and 11 items to assess collaboration. Each emergency physician was evaluated by 3 groups of raters: 4 medical colleagues who are emergency physicians, 4 medical colleagues who are referral physicians from different departments, and 4 coworkers from the emergency department. An independent administrative team was formed to carry out the responsibility of distributing the instruments and collecting them in closed envelopes. Each envelope contained the instrument and a guide to the implementation of the MSF and the purpose of the study. Results: A total of 30 emergency physicians, 16 males and 14 females, representing the total number of emergency physicians in our hospital, were assessed.
In total, 269 forms were collected: 105 surveys from coworkers working in the emergency department, 93 surveys from medical colleague emergency physicians, and 116 surveys from referral physicians from different departments. The overall mean response rate was 71.2%. The whole instrument was found to be suitable for factor analysis (KMO = 0.967; Bartlett's test significant, p < 0.001). Factor analysis showed that the questionnaire data decomposed into three factors, which accounted for 72.6% of the total variance: professionalism, collaboration, and communication. Reliability analysis indicated that the full-scale instrument had high internal consistency (Cronbach's α = 0.98). The generalizability coefficient (Ep2) was 0.71 for the surveys. Conclusions: Based on the present results, the current instrument and procedures have high reliability, validity, and feasibility for assessing emergency physician trainees in the emergency room.
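The internal-consistency figure reported above (Cronbach's α) is computed from a respondents × items score matrix. A minimal sketch of that computation, using made-up scores rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items (every rater scores all items identically)
# give the maximum alpha of 1.0.
scores = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]], dtype=float)
print(round(cronbach_alpha(scores), 2))  # 1.0
```

In practice the matrix would hold the 269 collected forms scored on the 39 items; values near the study's 0.98 indicate the items measure a common construct very consistently.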

Keywords: MSF system, emergency, validity, generalizability

Procedia PDF Downloads 332
36 Specification of Requirements to Ensure Proper Implementation of Security Policies in Cloud-Based Multi-Tenant Systems

Authors: Rebecca Zahra, Joseph G. Vella, Ernest Cachia

Abstract:

The notion of cloud computing is rapidly gaining ground in the IT industry and is appealing mostly because it makes computing more adaptable and expedient whilst diminishing the total cost of ownership. This paper focuses on the software as a service (SaaS) architecture of cloud computing, which is used for the outsourcing of databases with their associated business processes. One approach to offering SaaS is to base the system's architecture on multi-tenancy. Multi-tenancy allows multiple tenants (users) to make use of the same single application instance; their requests and configurations may then differ according to specific requirements, met through tenant customisation of the software. Despite the known advantages, companies still feel uneasy about opting for multi-tenancy, with data security being a principal concern. The fact that multiple tenants, possibly competitors, would have their data located on the same server process and share the same database tables heightens the fear of unauthorised access. Security is a vital aspect which needs to be considered by application developers, database administrators, data owners, and end users. This is further complicated in cloud-based multi-tenant systems, where boundaries must be established between tenants and additional access control models must be in place to prevent unauthorised cross-tenant access to data. Moreover, when altering the database state, transactions need to adhere strictly to the tenant's known business processes. This paper argues that security in cloud databases should not be considered as an isolated issue; rather, it should be included in the initial phases of the database design and monitored continuously throughout the whole development process. This paper aims to identify a number of the most common security risks and threats specifically in the area of multi-tenant cloud systems. Issues and bottlenecks relating to security risks in cloud databases are surveyed.
Some techniques which might be utilised to overcome them are then listed and evaluated. After a description and evaluation of the main security threats, this paper produces a list of software requirements to ensure that proper security policies are implemented by a software development team when designing and implementing a multi-tenant based SaaS. This would then assist the cloud service providers to define, implement, and manage security policies as per tenant customisation requirements whilst assuring security for the customers’ data.
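As a hedged illustration of the kind of requirement such a list might contain, the following sketch shows tenant-scoped data access in a shared-schema store: the tenant filter lives inside the data-access layer, so no application code path can reach another tenant's rows. The table, column names, and data here are all made up for illustration:

```python
# Illustrative shared-schema store: rows from different tenants live in
# the same "table", distinguished only by a tenant_id column.
ROWS = [
    {"tenant_id": "t1", "order_id": 1, "total": 120.0},
    {"tenant_id": "t2", "order_id": 2, "total": 75.5},
]

def fetch_orders(caller_tenant: str) -> list[dict]:
    # The tenant predicate is enforced here, in the data-access layer,
    # rather than trusted to each caller: cross-tenant reads are
    # impossible by construction.
    return [r for r in ROWS if r["tenant_id"] == caller_tenant]

print(fetch_orders("t1"))  # only tenant t1's rows are visible
```

Real multi-tenant systems typically enforce the same idea with database mechanisms (e.g. row-level security) rather than application code, but the requirement is the same: tenant scoping must be impossible to bypass.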

Keywords: cloud computing, data management, multi-tenancy, requirements, security

Procedia PDF Downloads 127
35 Multicollinearity and MRA in Sustainability: Application of the Raise Regression

Authors: Claudia García-García, Catalina B. García-García, Román Salmerón-Gómez

Abstract:

Much economic-environmental research includes the analysis of possible interactions by using Moderated Regression Analysis (MRA), a specific application of multiple linear regression analysis. This methodology allows analyzing how the effect of one independent variable is moderated by a second independent variable by adding a cross-product term between them as an additional explanatory variable. Due to the very specification of the methodology, the moderating factor is often highly correlated with the constitutive terms, so severe multicollinearity problems arise. The appearance of strong multicollinearity in a model has important consequences: the variances of the estimators may be inflated; regressors that are probably significant may appear non-significant despite a very high coefficient of determination; coefficients may show incorrect signs; and the results become highly sensitive to small changes in the dataset. Finally, the strong relationship among the explanatory variables makes it difficult to isolate the individual effect of each one on the model under study. Transferred to moderated analysis, these consequences may imply that it is not worth including an interaction term that may be distorting the model. Thus, it is important to manage the problem with a methodology that allows reliable results to be obtained. A review of the works that applied MRA in the ten top journals of the field shows that multicollinearity is mostly disregarded: less than 15% of the reviewed works take potential multicollinearity problems into account. To overcome the issue, this work studies the possible application of recent methodologies to MRA. In particular, raise regression is analyzed.
This methodology mitigates collinearity from a geometrical point of view: the collinearity problem arises because the variables under study are very close geometrically, so by separating the variables, the problem can be mitigated. Raise regression maintains the available information and modifies the problematic variables instead of, for example, deleting variables. Furthermore, the global characteristics of the initial model are also maintained (sum of squared residuals, estimated variance, coefficient of determination, global significance test, and prediction). The proposal is applied to data from European Union countries for the most recent year available on greenhouse gas emissions, per capita GDP, and a dummy variable that represents the topography of the country. The use of a dummy variable as the moderator is a special variant of MRA, sometimes called “subgroup regression analysis.” The main conclusion of this work is that applying new techniques to the field can substantially improve the results of the analysis. In particular, the use of raise regression mitigates severe multicollinearity problems, so the researcher is able to rely on the interaction term when interpreting the results of a particular study.
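The geometric idea of "separating" the variables can be sketched numerically: the problematic regressor is raised by adding back a multiple of its residual against the other regressor, which pushes it away from that regressor while preserving the information it carries. This is a minimal illustration with simulated data and an arbitrary raising factor, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.1 * rng.normal(size=n)   # nearly collinear with x1

# Residual of x2 regressed on x1 (with intercept): the component of x2
# that is orthogonal to x1.
x1c, x2c = x1 - x1.mean(), x2 - x2.mean()
beta = (x1c @ x2c) / (x1c @ x1c)
e = x2c - beta * x1c

lam = 5.0                     # raising factor, chosen by the analyst
x2_raised = x2 + lam * e      # the raised regressor

r_before = abs(np.corrcoef(x1, x2)[0, 1])
r_after = abs(np.corrcoef(x1, x2_raised)[0, 1])
print(round(r_before, 3), round(r_after, 3))  # correlation drops sharply
```

Because only an orthogonal component is amplified, the fitted values, residual sum of squares, and R² of the full regression are unchanged, which is the property the abstract emphasizes.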

Keywords: multicollinearity, MRA, interaction, raise

Procedia PDF Downloads 73
34 Streamlining the Fuzzy Front-End and Improving the Usability of the Tools Involved

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Researchers have spent decades developing tools and techniques to aid teams in the new product development (NPD) process. Despite this, there is a huge gap between the tools' academic prevalence and their industry adoption. For the fuzzy front-end in particular, there is a wide range of tools to choose from, including the Kano Model, the House of Quality, and many others. In fact, there are so many tools that it can often be difficult for teams to know which ones to use and how they interact with one another. Moreover, while the benefits of using these tools are obvious to industrialists, they are rarely used, as they carry a learning curve that is too steep and they become too complex to manage over time. In essence, it is commonly believed that they are simply not worth the effort required to learn and use them. This research explores a streamlined process for the fuzzy front-end, assembling the most effective tools and making them accessible to everyone. The process was developed iteratively over the course of 3 years, following over 80 final-year NPD teams from engineering, design, technology, and construction as they carried a product from concept through to production specification. Questionnaires, focus groups, and observations were used to understand the usability issues with the tools involved, and a human-centred design approach was adopted to produce a solution to these issues. The solution takes the form of a physical toolkit, similar to a board game, which allows a team to play through an example of a new product development in order to understand the process and the tools before using them for their own product development efforts. A complementary website enhances the physical toolkit, providing more examples of the tools being used as well as deeper discussions of each topic, allowing teams to adapt the process to their skills, preferences, and product type.
Teams found the solution very useful and intuitive, and experienced significantly less confusion and made fewer mistakes with the process than teams who did not use it. Those with a design background found it especially useful for the engineering principles, like Quality Function Deployment, while those with an engineering or technology background found it especially useful for design and customer-requirements acquisition principles, like Voice of the Customer. Products developed using the toolkit are added to the website as further examples of how it can be used, creating a loop which helps future teams understand how the toolkit can be adapted to their project, whether it be a small consumer product or a large B2B service. The toolkit unlocks the potential of these beneficial tools for those in industry, both for large, experienced teams and for inexperienced start-ups. It allows users to assess the market potential of their product concept faster and more effectively, arriving at the product design stage with technical requirements prioritized according to their customers' needs and wants.

Keywords: new product development, fuzzy front-end, usability, Kano model, quality function deployment, voice of customer

Procedia PDF Downloads 88
33 Welfare Dynamics and Food Prices' Changes: Evidence from Landholding Groups in Rural Pakistan

Authors: Lubna Naz, Munir Ahmad, G. M. Arif

Abstract:

This study analyzes the static and dynamic welfare impacts of food price changes for various landholding groups in Pakistan, using three classifications of land ownership: landless, small landowners, and large landowners. The study uses the Pakistan Rural Household Survey (PRHS), a panel survey of rural households conducted by the Pakistan Institute of Development Economics, Islamabad, covering the two largest provinces of Pakistan (Sindh and Punjab), and draws on all three waves of the PRHS (2001, 2004, and 2010). This research makes three important contributions to the literature. First, it uses the Quadratic Almost Ideal Demand System (QUAIDS) to estimate demand functions for eight food groups: cereals, meat, milk and milk products, vegetables, cooking oil, pulses, and other food. The study estimates the food demand functions with Nonlinear Seemingly Unrelated Regression (NLSUR) and employs a Lagrange Multiplier test and a test on the coefficient of the squared expenditure term to determine whether that term should be included. The test results support the inclusion of the squared expenditure term in the food demand model for each of the landholding groups (landless, small landowners, and large landowners). The study tests for endogeneity and uses a control function for its correction; the problem of observed zero expenditure is dealt with through a two-step procedure. Second, it defines low-price and high-price periods based on the literature review, and uses elasticity coefficients from QUAIDS to analyze the static and dynamic welfare effects of food price changes across periods (first- and second-order Taylor approximations of the expenditure function are used). The study estimates the compensating variation (CV), the money-metric loss from food price changes, for the landless, small landowners, and large landowners. Third, this study compares the findings on the welfare implications of food price changes based on QUAIDS with earlier research in Pakistan, which used other specifications of the demand system.
The findings indicate that the dynamic welfare impacts of food price changes are lower than the static welfare impacts for all landholding groups, and that both the static and dynamic impacts are highest for the landless. The study suggests that the government should extend social security nets to the landless poor, and specifically to the vulnerable landless (those without livestock), to redress the short-term impact of food price increases. In addition, the government should stabilize food prices, particularly cereal prices, in the long run.
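The Taylor approximations of the expenditure function reduce to simple formulas in budget shares and compensated price elasticities: the first-order CV (as a share of expenditure) is the share-weighted sum of log price changes, and the second-order version adds a quadratic term capturing substitution. A sketch with entirely made-up shares, elasticities, and price changes for three food groups (not the study's estimates):

```python
import numpy as np

# Illustrative inputs: budget shares w, compensated price elasticities E
# (row i = elasticity of good i w.r.t. each price), and log price changes.
w = np.array([0.40, 0.35, 0.25])              # budget shares (sum to 1)
E = np.array([[-0.6,  0.3,  0.1],
              [ 0.2, -0.5,  0.1],
              [ 0.1,  0.2, -0.4]])            # compensated elasticities
dlnp = np.array([0.10, 0.05, 0.00])           # log price changes

# First-order CV as a proportion of total expenditure
first_order = w @ dlnp
# Second-order CV adds 0.5 * sum_ij w_i * eps_ij * dlnp_i * dlnp_j
second_order = first_order + 0.5 * dlnp @ (w[:, None] * E) @ dlnp
print(round(first_order, 4), round(second_order, 4))
```

The second-order figure is smaller because households substitute away from the goods whose prices rose, which mirrors the abstract's point that allowing for behavioral (dynamic) responses lowers the measured welfare loss.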

Keywords: QUAIDS, Lagrange multiplier, NLSUR, Taylor approximation

Procedia PDF Downloads 344