Search results for: two-stage choice process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16268

13448 Prediction of Compressive Strength of Concrete from Early Age Test Result Using Design of Experiments (Rsm)

Authors: Salem Alsanusi, Loubna Bentaher

Abstract:

Response Surface Methods (RSM) provide statistically validated predictive models that can then be manipulated to find optimal process configurations. Variation transmitted to responses from poorly controlled process factors can be accounted for by the mathematical technique of propagation of error (POE), which facilitates ‘finding the flats’ on the surfaces generated by RSM. The dual response approach to RSM captures the standard deviation of the output as well as the average, and accounts for unknown sources of variation. Dual response plus propagation of error (POE) provides a more useful model of overall response variation. In our case, we implemented this technique to predict the compressive strength of concrete at 28 days of age, since waiting 28 days is quite time consuming while the quality control process must still be ensured. This paper investigates the potential of using design of experiments (DOE-RSM) to predict the compressive strength of concrete at the 28th day. Data used for this study were obtained from experimental schemes at the University of Benghazi, Civil Engineering Department. A total of 114 data sets were used. The ACI mix design method was utilized for the mix design. No admixtures were used; only the main concrete mix constituents, namely cement, coarse aggregate, fine aggregate and water, were utilized in all mixes. Different mix proportions of the ingredients and different water-cement ratios were used. The proposed mathematical models are capable of predicting the required 28-day compressive strength of concrete from early-age test results.
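As an illustration of the modeling step described above, a second-order (RSM-style) polynomial can be fitted to early-age data by ordinary least squares. The factor ranges, coefficients, and data below are invented for demonstration only; they are not the paper's 114 experimental sets:

```python
import numpy as np

# Hypothetical data: water-cement ratio (wc) and 7-day strength (f7) used to
# predict 28-day compressive strength (f28) via a quadratic response surface.
rng = np.random.default_rng(0)
wc = rng.uniform(0.4, 0.7, 114)        # water-cement ratio (assumed range)
f7 = rng.uniform(15.0, 35.0, 114)      # early-age strength, MPa (assumed)
# Synthetic "true" 28-day response with curvature in wc, plus noise
f28 = 60.0 - 40.0 * wc + 1.2 * f7 - 15.0 * wc**2 + rng.normal(0, 0.5, 114)

# Design matrix for a second-order model: f28 ~ b0 + b1*wc + b2*f7 + b3*wc^2
X = np.column_stack([np.ones_like(wc), wc, f7, wc**2])
beta, *_ = np.linalg.lstsq(X, f28, rcond=None)

predicted = X @ beta
print(np.round(beta, 2))
```

Fitting against real early-age test data would follow the same pattern, with POE applied to the fitted surface afterwards.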

Keywords: mix proportioning, response surface methodology, compressive strength, optimal design

Procedia PDF Downloads 253
13447 Scaling-Down an Agricultural Waste Biogas Plant Fermenter

Authors: Matheus Pessoa, Matthias Kraume

Abstract:

Scale-down rules in process engineering help translate industrial-scale process parameters to the lab scale. Several scale-down rules available in the literature, such as impeller power number, agitation device power input, substrate tip speed, Reynolds number and cavern development, were investigated in order to stipulate the rotational speed at which to operate an 11 L working volume lab-scale bioreactor within industrial process parameters. Herein, xanthan gum was used as a fluid with a viscosity representative of a hypothetical biogas plant fermentation broth using sewage sludge and sugar beet pulp as substrate, with H/D = 1 and central agitation. The results showed that the cavern development strategy was the best method for establishing a rotational speed for bioreactor operation, while the other rules yielded values that were unrealistic for the purposes of this study.
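Two of the scale-down rules mentioned above reduce to simple closed-form speed ratios when the same fluid is used at both scales. The geometry and full-scale speed below are illustrative assumptions, not the plant data from the study:

```python
# Hypothetical scale-down: translate a full-scale fermenter impeller speed to
# an 11 L lab reactor using the tip-speed and Reynolds-number rules.

def speed_constant_tip_speed(n_full, d_full, d_lab):
    """Keep impeller tip speed pi*N*D constant across scales."""
    return n_full * d_full / d_lab

def speed_constant_reynolds(n_full, d_full, d_lab):
    """Keep impeller Reynolds number N*D^2*rho/mu constant (same fluid)."""
    return n_full * (d_full / d_lab) ** 2

n_full = 0.5                # full-scale speed, 1/s (assumed)
d_full, d_lab = 2.0, 0.15   # impeller diameters, m (assumed)

print(speed_constant_tip_speed(n_full, d_full, d_lab))   # ≈ 6.7 1/s
print(speed_constant_reynolds(n_full, d_full, d_lab))    # ≈ 88.9 1/s
```

The large gap between the two results illustrates why different rules can produce speeds "out of reality" for a viscous broth, motivating the cavern-development criterion instead.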

Keywords: anaerobic digestion, cavern development, scale down rules, xanthan gum

Procedia PDF Downloads 477
13446 The Impact of Brand-Related User-Generated Content on Brand Positioning: A Study on Private Higher Education Institutes in Vietnam

Authors: Charitha Harshani Perera, Rajkishore Nayak, Long Thang Van Nguyen

Abstract:

With the advent of social media, the way Vietnamese customers perceive information about brands has changed. In the context of higher education, the adoption of social media has received attention given the increasing rate of social media usage among undergraduates. Brand-related user-generated content (UGC) on social media emphasizes the social ties between users and users’ participation, which promotes communication to build and maintain relationships with brands. Although brand positioning offers a significant competitive advantage, the association of brand-related user-generated content on social media with brand positioning in the context of higher education is still an under-researched area. Accordingly, using social identity theory and social exchange theory, this research aims to deepen our understanding of the influence of brand-related user-generated content on brand positioning and purchase intention. Employing a quantitative survey design, 384 Vietnamese undergraduates were selected based on purposive sampling. The findings suggest that brand-related user-generated content influences brand positioning and brand choice intention. However, there is a significant mediating effect of the reliability and understandability of the content.

Keywords: brand positioning, brand-related user-generated content, emerging countries, higher education

Procedia PDF Downloads 156
13445 A Multi-Criteria Decision Method for the Recruitment of Academic Personnel Based on the Analytical Hierarchy Process and the Delphi Method in a Neutrosophic Environment

Authors: Antonios Paraskevas, Michael Madas

Abstract:

For a university to maintain its international competitiveness in education, it is essential to recruit high-quality academic staff, as they constitute its most valuable asset. This selection plays a significant role in achieving strategic objectives, particularly by emphasizing a firm commitment to an exceptional student experience and to innovative, high-quality teaching and learning practices. In this vein, the appropriate selection of academic staff is a very important factor in the competitiveness, efficiency and reputation of an academic institute. Within this framework, our work presents a comprehensive methodological concept that emphasizes the multi-criteria nature of the problem and shows how decision makers could utilize our approach in order to arrive at an appropriate judgment. The conceptual framework introduced in this paper is built upon a hybrid neutrosophic method based on the Neutrosophic Analytical Hierarchy Process (N-AHP), which uses the theory of neutrosophic sets and is considered suitable given the significant degree of ambiguity and indeterminacy observed in the decision-making process. To this end, our framework extends the N-AHP by incorporating the Neutrosophic Delphi Method (N-DM). By applying the N-DM, we can take into consideration the importance of each decision-maker and their preferences per evaluation criterion. To the best of our knowledge, the proposed model is the first to apply the Neutrosophic Delphi Method to the selection of academic staff. As a case study, we applied our method to a real problem of academic personnel selection, with the main goal of enhancing the algorithm proposed in previous scholars’ work and thus addressing the inherent ineffectiveness that becomes apparent in traditional multi-criteria decision-making methods when dealing with such situations. As a further result, we show that our method demonstrates greater applicability and reliability when compared to other decision models.
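For readers unfamiliar with the crisp AHP step that N-AHP generalizes, the following sketch derives criterion weights from a pairwise comparison matrix using the geometric-mean approximation and checks the consistency ratio. The matrix is a hypothetical example, not data from the paper:

```python
import numpy as np

# Hypothetical 3-criterion pairwise comparison matrix (Saaty scale):
# entry A[i, j] is how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the principal eigenvector, normalized
w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w /= w.sum()

# Consistency ratio (random index RI = 0.58 for n = 3); CR < 0.1 is acceptable
lam = float(np.mean((A @ w) / w))
ci = (lam - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))
```

The neutrosophic extension replaces the crisp judgments above with triples of truth, indeterminacy, and falsity membership degrees before aggregation.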

Keywords: analytical hierarchy process, delphi method, multi-criteria decision making method, neutrosophic set theory, personnel recruitment

Procedia PDF Downloads 174
13444 Ideological Manipulations and Cultural-Norm Constraints

Authors: Masoud Hassanzade Novin, Bahloul Salmani

Abstract:

Translation cannot be considered a simple linguistic act. With the rise of the descriptive approach in the late 1970s and 1980s, the study of the translation process came to address social aspects as well as linguistic ones. To treat translation as cross-cultural communication in which various cultures interact under ideological and cultural constraints, a contrastive analysis was conducted in this paper to reveal the distortions imposed on the translated texts. The corpus of the study comprised the novel 1984, written by George Orwell, and its Persian translations, which were analyzed through qualitative research based on critical discourse analysis (CDA), Toury's norms and Lefevere's concept of ideology. The results of the study revealed that ideology and cultural constraints act as important stimuli that can control the process of translation.

Keywords: critical discourse analysis, ideology, norms, translated texts

Procedia PDF Downloads 325
13443 High Pressure Delignification Process for Nanocrystalline Cellulose Production from Agro-Waste Biomass

Authors: Sakinul Islam, Nhol Kao, Sati Bhattacharya, Rahul Gupta

Abstract:

Nanocrystalline cellulose (NCC) has been widely used for miscellaneous applications due to its superior properties over other nanomaterials. However, the major problems associated with the production of NCC are long reaction times, low production rates and inefficient processes; mass production of NCC within a short period of time is still a great challenge. The main objective of this study is to produce NCC from rice husk agro-waste biomass through a high pressure delignification process (HPDP), followed by bleaching and hydrolysis processes. The HPDP has not been explored for NCC production from rice husk biomass (RHB) until now. In order to produce NCC, powdered rice husk (PRH) was placed in a stainless steel reactor at 80 ˚C under 5 bar. An aqueous solution of NaOH (4M) was used for the dissolution of lignin and other amorphous impurities from the PRH. After set experimental times (1 h, 3.5 h and 6 h), bleaching and hydrolysis were carried out on the delignified samples. NaOCl (20%) and H2SO4 (4M) solutions were used for the bleaching and hydrolysis processes, respectively. The NCC suspension from hydrolysis was sonicated and neutralized with buffer solution for the various characterisations. Finally, the NCC suspension was dried and analyzed by FTIR, XRD, SEM, AFM and TEM. The chemical compositions of the NCC and PRH were estimated by TAPPI (Technical Association of the Pulp and Paper Industry) standard methods to assess product purity. It was found that 6 h of HPDP was more efficient in producing good quality NCC than 1 h or 3.5 h, at which the separation of non-cellulosic components from the RHB was low. The analyses indicated a crystallinity of 71%, a particle diameter of 20-50 nm and a length of 100-200 nm.

Keywords: nanocrystalline cellulose, NCC, high pressure delignification, bleaching, hydrolysis, agro-waste biomass

Procedia PDF Downloads 250
13442 Density Determination of Liquid Niobium by Means of Ohmic Pulse-Heating for Critical Point Estimation

Authors: Matthias Leitner, Gernot Pottlacher

Abstract:

Experimental determination of critical point data like critical temperature, critical pressure, critical volume and critical compressibility of high-melting metals such as niobium is very rare due to the outstanding experimental difficulties in reaching the necessary extreme temperature and pressure regimes. Experimental techniques to achieve such extreme conditions could be diamond anvil devices, two stage gas guns or metal samples hit by explosively accelerated flyers. Electrical pulse-heating under increased pressures would be another choice. This technique heats thin wire samples of 0.5 mm diameter and 40 mm length from room temperature to melting and then further to the end of the stable phase, the spinodal line, within several microseconds. When crossing the spinodal line, the sample explodes and reaches the gaseous phase. In our laboratory, pulse-heating experiments can be performed under variation of the ambient pressure from 1 to 5000 bar and allow a direct determination of critical point data for low-melting, but not for high-melting metals. However, the critical point also can be estimated by extrapolating the liquid phase density according to theoretical models. A reasonable prerequisite for the extrapolation is the existence of data that cover as much as possible of the liquid phase and at the same time exhibit small uncertainties. Ohmic pulse-heating was therefore applied to determine thermal volume expansion, and from that density of niobium over the entire liquid phase. As a first step, experiments under ambient pressure were performed. The second step will be to perform experiments under high-pressure conditions. During the heating process, shadow images of the expanding sample wire were captured at a frame rate of 4 × 105 fps to monitor the radial expansion as a function of time. Simultaneously, the sample radiance was measured with a pyrometer operating at a mean effective wavelength of 652 nm. 
To increase the accuracy of temperature deduction, spectral emittance in the liquid phase is also taken into account. Due to the high heating rates of about 2 × 108 K/s, longitudinal expansion of the wire is inhibited which implies an increased radial expansion. As a consequence, measuring the temperature dependent radial expansion is sufficient to deduce density as a function of temperature. This is accomplished by evaluating the full widths at half maximum of the cup-shaped intensity profiles that are calculated from each shadow image of the expanding wire. Relating these diameters to the diameter obtained before the pulse-heating start, the temperature dependent volume expansion is calculated. With the help of the known room-temperature density, volume expansion is then converted into density data. The so-obtained liquid density behavior is compared to existing literature data and provides another independent source of experimental data. In this work, the newly determined off-critical liquid phase density was in a second step utilized as input data for the estimation of niobium’s critical point. The approach used, heuristically takes into account the crossover from mean field to Ising behavior, as well as the non-linearity of the phase diagram’s diameter.
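The density evaluation described above reduces to a simple relation: with longitudinal expansion inhibited, the wire volume scales with the squared diameter ratio, so rho(T) = rho_0 * (d_0 / d(T))^2. A minimal sketch, with illustrative diameter values (the room-temperature density of niobium, about 8570 kg/m^3, is the only literature value used):

```python
# Density from radial expansion of a pulse-heated wire, assuming no
# longitudinal expansion: rho(T) = rho_0 * (d_0 / d(T))**2.

RHO_0 = 8570.0   # room-temperature density of niobium, kg/m^3

def liquid_density(d_cold, d_hot, rho_0=RHO_0):
    """Density from cold and hot wire diameters (any common unit, e.g. FWHM pixels)."""
    return rho_0 * (d_cold / d_hot) ** 2

# Example: a hypothetical 12% radial expansion relative to the cold wire
print(round(liquid_density(100.0, 112.0), 1))
```

In practice d_hot comes from the full width at half maximum of each shadow-image intensity profile, giving one density point per frame.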

Keywords: critical point data, density, liquid metals, niobium, ohmic pulse-heating, volume expansion

Procedia PDF Downloads 206
13441 Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings

Authors: Sorin Valcan, Mihail Gaianu

Abstract:

Labeling is a costly and time-consuming process that generates the datasets needed to train neural networks across several functionalities and projects. For driver monitoring system projects, the need for labeled images has a significant impact on the budget and the distribution of effort. This paper presents the modifications made to an algorithm used for the generation of ground truth data for 2D eye location on infrared images of drivers, in order to improve the quality of the data and the performance of the trained neural networks. The algorithm's restrictions become stricter, which makes it more accurate but less consistent. The resulting dataset becomes smaller and shall not be altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in a much better performance of the trained neural networks.

Keywords: labeling automation, infrared camera, driver monitoring, eye detection, convolutional neural networks

Procedia PDF Downloads 98
13440 Leaching of Copper from Copper Ore Using Sulphuric Acid in the Presence of Hydrogen Peroxide as an Oxidizing Agent: An Optimized Process

Authors: Hilary Rutto

Abstract:

Acid leaching is the most common method used to extract copper ions from copper ores. It is important that the process conditions are optimized to improve the leaching efficiency. In the present study, the effects of pH, oxidizing agent (hydrogen peroxide), stirring speed, solid-to-liquid ratio and acid concentration on the leaching of copper ions from the ore were investigated using a pH-stat apparatus. Copper ions were analyzed at the end of each experiment using an atomic absorption spectroscopy (AAS) machine. Results showed that leaching efficiency improved with an increase in acid concentration, stirring speed, oxidizing agent and pH, and decreased with an increase in the solid-to-liquid ratio.

Keywords: leaching, copper, oxidizing agent, pH stat apparatus

Procedia PDF Downloads 362
13439 Design of the Fiber Lay-Up for the Composite Wind Turbine Blade in VARTM

Authors: Tzai-Shiung Li, Wen-Bin Young

Abstract:

The wind turbine blade sustains various kinds of loading during operating and parked states. Due to the increasing size of wind turbine blades, it is important to arrange the composite materials in a way that achieves optimal utilization of the material strength. In the vacuum assisted resin transfer molding fabrication process, the fiber content of the turbine blade depends on the vacuum pressure. In this study, a design of the fiber lay-up for vacuum assisted resin transfer molding is conducted to achieve efficient utilization of the material strength. The design is for a wind turbine blade consisting of shell skins with or without a spar structure.

Keywords: resin film infiltration, vacuum assisted resin transfer molding process, wind turbine blade, composite materials

Procedia PDF Downloads 369
13438 Statistically Significant Differences of Carbon Dioxide and Carbon Monoxide Emission in Photocopying Process

Authors: Kiurski S. Jelena, Kecić S. Vesna, Oros B. Ivana

Abstract:

Experimental results confirmed the temporal variation of carbon dioxide and carbon monoxide concentrations during the working shift of the photocopying process in a small photocopying shop in Novi Sad, Serbia. The statistically significant differences in the target gases were examined with a two-way analysis of variance without replication, followed by Scheffe's post hoc test. Statistically significant differences were obtained for carbon monoxide emission, indicated by F-values (12.37 and 31.88) greater than Fcrit (6.94), in contrast to carbon dioxide emission (F-values of 1.23 and 3.12 were less than Fcrit). Scheffe's post hoc test indicated that sampling point A (near the photocopier machine) and the second time interval contribute the most to carbon monoxide emission.
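The F-values above come from a two-way ANOVA without replication (rows = sampling points, columns = time intervals). A minimal sketch of that computation follows; the concentration values are invented solely to demonstrate the arithmetic and are not the paper's measurements:

```python
import numpy as np

# Hypothetical CO concentrations: rows = sampling points, columns = time intervals
data = np.array([
    [5.1, 6.9, 6.2],   # sampling point A (near the photocopier)
    [3.0, 4.1, 3.6],   # sampling point B
    [2.8, 3.9, 3.4],   # sampling point C
])
r, c = data.shape
grand = data.mean()

# Sum-of-squares decomposition for two-way ANOVA without replication
ss_rows = c * ((data.mean(axis=1) - grand) ** 2).sum()
ss_cols = r * ((data.mean(axis=0) - grand) ** 2).sum()
ss_tot = ((data - grand) ** 2).sum()
ss_err = ss_tot - ss_rows - ss_cols          # residual (interaction) term

ms_err = ss_err / ((r - 1) * (c - 1))
f_rows = (ss_rows / (r - 1)) / ms_err        # compare against Fcrit
f_cols = (ss_cols / (c - 1)) / ms_err
print(round(f_rows, 2), round(f_cols, 2))
```

Each F-value would then be compared with Fcrit at the chosen significance level, exactly as done for the 12.37 and 31.88 values reported above.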

Keywords: analysis of variance, carbon dioxide, carbon monoxide, photocopying indoor, Scheffe's test

Procedia PDF Downloads 312
13437 Design of a Tool for Generating Test Cases from BPMN

Authors: Prat Yotyawilai, Taratip Suwannasart

Abstract:

Business Process Model and Notation (BPMN) is increasingly important in business process modeling and the creation of functional models; it is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent; however, although most studies use UML models in software testing, few use the BPMN model for creating test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph. Both the component details and the flow graph are used in generating test cases.
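The last step described above, deriving test cases from a flow graph, can be sketched as path enumeration from the start event to the end event. The graph below is a hypothetical BPMN fragment (one exclusive gateway), not the tool's actual data structure:

```python
# Hypothetical flow graph extracted from a BPMN model: each key is a node,
# each value the list of successor nodes.
flow_graph = {
    "start":   ["check"],
    "check":   ["approve", "reject"],   # exclusive gateway: two branches
    "approve": ["end"],
    "reject":  ["end"],
    "end":     [],
}

def generate_test_paths(graph, node="start", path=None):
    """Depth-first enumeration of start-to-end paths; each path is one test case."""
    path = (path or []) + [node]
    if not graph[node]:                 # no successors: end event reached
        return [path]
    cases = []
    for nxt in graph[node]:
        cases.extend(generate_test_paths(graph, nxt, path))
    return cases

for case in generate_test_paths(flow_graph):
    print(" -> ".join(case))
```

A real tool would additionally attach the extracted component details (conditions, data objects) to each path to form executable test cases, and would need loop-bounding for cyclic graphs.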

Keywords: software testing, test case, BPMN, flow graph

Procedia PDF Downloads 544
13436 Efficient Position Based Operation Code Authentication

Authors: Hashim Ali, Sheheryar Khan

Abstract:

Security has always been a keen concern for applications. In general, security means granting access to legitimate users and denying non-authorized access to the system. Shoulder surfing is an observation technique used to hack an account or enter a system: a malicious observer captures or records the fingers of a user entering sensitive inputs (PIN, passwords, etc.) and may thereby observe the user's password credentials. It is very difficult for a novice user to protect himself from shoulder surfing or an unaided observer in a public place while accessing his account. In order to secure user accounts, there are five factors of authentication: “(i) something you have, (ii) something you are, (iii) something you know, (iv) somebody you know, (v) something you process”. A fifth-factor authentication technique, “something you process”, has been developed to provide a novel approach for the user. In this paper, we have applied position-based operation code authentication in a way that is easier and more user-friendly for the user.

Keywords: shoulder surfing, malicious observer, sensitive inputs, authentication

Procedia PDF Downloads 257
13435 The Analysis of Gizmos Online Program as Mathematics Diagnostic Program: A Story from an Indonesian Private School

Authors: Shofiayuningtyas Luftiani

Abstract:

Some private schools in Indonesia have started integrating the online program Gizmos into the teaching-learning process. Gizmos was developed to supplement the existing curriculum by being integrated into instructional programs. The program features inquiry-based simulations, in which students explore using a worksheet while teachers use the teacher guidelines to direct and assess students’ performance. In this study, the discussion of Gizmos highlights its features as an assessment medium for mathematics learning for secondary school students. The discussion is based on a case study and a literature review from the Indonesian context. The purpose of applying Gizmos as an assessment medium refers to diagnostic assessment. As part of the diagnostic assessment, teachers review the student exploration sheet, analyze the students’ difficulties in particular, and consider the findings in planning the future learning process. This assessment becomes important since the teacher needs data about students’ persistent weaknesses. Additionally, the program also helps build students’ understanding through its interactive simulations. Currently, the assessment over-emphasizes the students’ answers in the worksheet against the provided answer keys, while students exercise their skills in translating the question, running the simulation and answering the question. The assessment should instead involve multiple perspectives on, and sources of, students’ performance, since teachers should adjust instructional programs to the complexity of students’ learning needs and styles. Consequently, an approach to improving the assessment components is selected to challenge the current assessment. The purpose of this challenge is to involve not only cognitive diagnosis but also the analysis of skills and errors. Concerning the selected setting for this diagnostic assessment, which combines cognitive diagnosis, skills analysis and error analysis, the teachers should create an assessment rubric. The rubric plays an important role as a guide providing a set of criteria for the assessment. Without a precise rubric, the teacher may ineffectively document and follow up the data about students at risk of failure. Furthermore, teachers who employ Gizmos for diagnostic assessment might encounter some obstacles. Based on the conditions of assessment in the selected setting, the obstacles involve time constraints, reluctance toward a higher teaching burden and the students’ behavior. Consequently, the teacher who chooses Gizmos with those approaches has to plan, implement and evaluate the assessment. The main point of this assessment is not the result of the students’ worksheets; rather, the diagnostic assessment is a two-stage process: prompting, and then effectively following up, both individual weaknesses and those of the learning process. Ultimately, the discussion of Gizmos as a medium of diagnostic assessment refers to the effort to improve the mathematical learning process.

Keywords: diagnostic assessment, error analysis, Gizmos online program, skills analysis

Procedia PDF Downloads 171
13434 Place Branding and the Sense of Place in the Italian UNESCO World Heritage Site of Vicenza

Authors: A. Chtourou, K. Ben Youssef, M. Friel, T. Leicht

Abstract:

Place attributes and destination images associated with tourism destinations are often crucially important for tourists’ travel decisions and choice behavior. Understanding the interactions between them is fundamental for developing sustainable place brands. Despite their extensive use on empirical grounds, little research has analyzed the constructs that determine the sense of place in the marketing of cultural heritage sites, or how tourist experiences at such places influence tourists’ motivations to revisit destinations. By referring to the Italian city of Vicenza, internationally renowned for its gold jewelry production and for the Palladian architecture and buildings recognized as World Heritage by UNESCO, the paper aims to identify how destination image, place familiarity and travel satisfaction influence tourists’ motivations to revisit Vicenza. After an introduction and literature review, the paper investigates the importance of the core constructs that determine the sense of place in tourist practice. In accordance with previous research, the results provide evidence that favorable travel experiences positively influence revisit intentions. Managerial implications and recommendations for the city of Vicenza are discussed.

Keywords: consumer behavior, heritage tourism, sense of place, place branding, territorial marketing

Procedia PDF Downloads 397
13433 The Determinants of Co-Production for Value Co-Creation: Quadratic Effects

Authors: Li-Wei Wu, Chung-Yu Wang

Abstract:

Recently, interest has been generated in the search for a new reference framework for value creation that is centered on the co-creation process. Co-creation implies cooperative value creation between service firms and customers and requires the building of experiences, as well as the resolution of problems, through the combined effort of the parties in the relationship. For customers, value is always co-created through their participation in services, and customers ultimately determine the value of the service in use. This new approach emphasizes that a customer’s participation in the service process is indispensable to value co-creation. An important feature of service in the context of exchange is co-production, which implies that a certain amount of participation is needed from customers to co-produce a service and hence co-create value. Co-production no doubt helps customers better understand and take charge of their own roles in the service process. Thus, this proposal is to encourage co-production, thereby facilitating value co-creation that is reflected in both customers and service firms. Four determinants of co-production are identified in this study, namely, commitment, trust, asset specificity, and decision-making uncertainty. Commitment is an essential dimension that directly results in successful cooperative behaviors. Trust helps establish a relational environment that is fundamental to cross-border cooperation. Asset specificity motivates co-production because this determinant may enhance the return on asset investment. Decision-making uncertainty prompts customers to collaborate with service firms in making decisions. In other words, customers adjust their roles and become increasingly engaged in co-production when commitment, trust, asset specificity, and decision-making uncertainty are enhanced. When these determinants are excessive, however, customers will not engage in the co-production process. Although studies have examined the preceding effects, to the best of our knowledge, none has empirically examined the simultaneous effects of all the curvilinear relationships in a single study. In brief, we suggest that the relationships of commitment, trust, asset specificity, and decision-making uncertainty with co-production are curvilinear, that is, inverse U-shaped. These new forms of curvilinear relationships have not been identified in the existing literature on co-production; therefore, they complement extant linear approaches. Most importantly, we aim to consider both the bright and the dark sides of the determinants of co-production.
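An inverse U-shaped relationship of the kind hypothesized above is typically tested by adding a squared term to the regression and checking that its coefficient is negative. A minimal sketch on synthetic data (the variables, coefficients, and sample are invented for illustration, not the study's survey data):

```python
import numpy as np

# Synthetic data: co-production increases with trust up to a turning point,
# then declines (inverse U). True turning point here is 1.5 / (2 * 0.12) = 6.25.
rng = np.random.default_rng(1)
trust = rng.uniform(0, 10, 200)
coprod = 2.0 + 1.5 * trust - 0.12 * trust**2 + rng.normal(0, 0.3, 200)

# Quadratic regression: coprod ~ b0 + b1*trust + b2*trust^2
X = np.column_stack([np.ones_like(trust), trust, trust**2])
b, *_ = np.linalg.lstsq(X, coprod, rcond=None)

turning_point = -b[1] / (2 * b[2])   # vertex of the fitted parabola
print(np.round(b, 2), round(turning_point, 2))
```

A significantly negative b2, with the turning point inside the observed range of the determinant, is the standard empirical signature of the "excessive determinant" effect described above.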

Keywords: co-production, commitment, trust, asset specificity, decision-making uncertainty

Procedia PDF Downloads 177
13432 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing

Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig

Abstract:

Empirical mode decomposition (EMD), a relatively new data-driven method of time-series decomposition, has the advantage of not assuming that a time series is linear or stationary, as is implicitly done in Fourier decomposition. However, EMD suffers from the mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal causing the EMD mode mixing problem, namely a signal containing an intermittency. On an artificial example, the solution shows superior performance in coping with the mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is completely avoided, and the computational load is reduced roughly six-fold compared with EEMD at an ensemble number of 50.
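The sifting process that the paper optimizes iteratively subtracts the local mean of the upper and lower extrema envelopes from the signal. One sifting iteration can be sketched as follows; for brevity this uses linear envelopes, whereas standard EMD uses cubic-spline envelopes and repeats the step until a stopping criterion holds:

```python
import numpy as np

# Test signal: a 5 Hz tone plus a 40 Hz tone (the kind of mixture EMD separates)
t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

def local_extrema(sig):
    """Indices of interior local maxima and minima via sign changes of the slope."""
    d = np.diff(sig)
    maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
    minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
    return maxima, minima

maxima, minima = local_extrema(x)
# Envelopes interpolated through the extrema (linear here; splines in real EMD)
upper = np.interp(np.arange(len(x)), maxima, x[maxima])
lower = np.interp(np.arange(len(x)), minima, x[minima])
candidate_imf = x - (upper + lower) / 2.0   # signal minus local envelope mean
print(len(maxima), len(minima))
```

Mode mixing arises when an intermittency plants spurious extrema, distorting these envelopes so one intrinsic mode function ends up carrying oscillations of very different scales.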

Keywords: empirical mode decomposition (EMD), mode mixing, sifting process, over-sifting

Procedia PDF Downloads 379
13431 Performance, Scalability and Reliability Engineering: Shift Left and Shift Right Approach

Authors: Jyothirmayee Pola

Abstract:

Ideally, test-driven development (TDD), agile, or any other process should be able to define and implement the performance, scalability, and reliability (PSR) of a product with a high quality of service (QoS), and should be able to fix any PSR issues at lower cost before they hit production. Most PSR test strategies for new product introduction (NPI) include assumptions about production load requirements, but these are never accurate. New product enhancement (NPE) involves assumptions for new features under development, while the workload distribution for older features can be derived by analyzing production transactions. This paper discusses how to shift PSR left, towards the design phase of the release management process, to obtain better QoS with respect to PSR for any product under development. It also explains the return on investment for future customer onboarding, both for service-oriented architectures (SOA) and microservices architectures, and how to define PSR requirements.

Keywords: component PSR, performance engineering, performance tuning, reliability, return on investment, scalability, system PSR

Procedia PDF Downloads 60
13430 The Relationship between Intermediate Input Source and Innovation Performance in Business Group-Affiliated Firms

Authors: M. Fernández, T. Gómez, J. Fleta

Abstract:

Although firm innovation is a crucial factor for enhancing competitive advantage in the current context of globalization, achieving innovation poses a significant challenge because of the degree of expertise required and the associated financial costs. Firms affiliated with business groups can choose whether their purchases of intermediate inputs are domestic (i.e., national source) or from foreign markets (i.e., international source), and whether the supplier firms are affiliated (i.e., internal source) or non-affiliated (i.e., external source). This has led to studies investigating the role of different sources of intermediate inputs in promoting innovation performance. The present study seeks to fill this gap by exploring the relationship between the source of intermediate inputs and innovation performance in firms belonging to Spanish non-MNE groups. For this purpose, we distinguish among three intermediate input sources, international sourcing, domestic external sourcing, and internal sourcing, as their choice could be induced by different causes and have different consequences. Finally, radical and incremental innovation are analyzed as measures of innovation performance because they are closely related to the concept of technological development and reflect different innovation behaviors. The paper uses a sample of around 4,100 firm-year observations of manufacturing firms (non-MNE) belonging to groups located in Spain between 2006 and 2020.

Keywords: intermediate input source, innovation performance, business group affiliated firms, Spain

Procedia PDF Downloads 5
13429 An Enhanced Hybrid Backoff Technique for Minimizing the Occurrence of Collision in Mobile Ad Hoc Networks

Authors: N. Sabiyath Fatima, R. K. Shanmugasundaram

Abstract:

In mobile ad hoc networks (MANETs), every node acts as both transmitter and receiver. The existing backoff models do not accurately predict the performance of the wireless network, and they also suffer from elevated packet collision rates. Every time a collision happens, the station's contention window (CW) is doubled until it reaches its maximum value. The main objective of this paper is to reduce collisions by means of a contention window multiplicative increase decrease backoff (CWMIDB) scheme. The intention of increasing the CW is to reduce the collision probability by spreading transmissions over a larger time interval. Within wireless ad hoc networks, the CWMIDB algorithm dynamically controls the contention window of the nodes experiencing collisions. During packet communication, the backoff counter is uniformly selected from the range [0, CW-1]. Here, CW denotes the contention window, and its value depends on the number of unsuccessful transmissions that have occurred for the packet. On the initial transmission attempt, CW is set to its minimum value (CWmin); if the transmission attempt fails, the value is doubled, and on a successful transmission it is reset to the minimum. CWMIDB is simulated in the NS2 environment, and its performance is compared with the binary exponential backoff algorithm. The simulation results show an improvement in transmission probability compared to the existing backoff algorithm.
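The contention-window rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' NS2 implementation; the abstract does not state the exact CWMIDB increase and decrease factors, so a factor of 2 in both directions is an assumption here.

```python
import random

CW_MIN, CW_MAX = 32, 1024          # illustrative window bounds, in slots

def next_window(cw, collided, inc=2, dec=2):
    """One step of a multiplicative increase/decrease (MID) backoff.

    On a collision the window grows by `inc` (capped at CW_MAX); on a
    successful transmission it shrinks by `dec` (floored at CW_MIN)
    rather than resetting outright as plain binary exponential backoff
    does.  Returns the new window and a backoff counter drawn uniformly
    from [0, cw - 1], as in the scheme described above.
    """
    if collided:
        cw = min(cw * inc, CW_MAX)     # multiplicative increase on failure
    else:
        cw = max(cw // dec, CW_MIN)    # multiplicative decrease on success
    return cw, random.randrange(cw)
```

With `dec` set large enough, the same function reduces to the binary exponential backoff baseline that the paper compares against (double on collision, reset to CWmin on success).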

Keywords: backoff, contention window, CWMIDB, MANET

Procedia PDF Downloads 262
13428 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey that was carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing the prediction of certain quantities of the avalanche flow (e.g. pressure, velocities, flow heights, runout lengths). Because of the highly variable regimes of the flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies are drawn to the fluid dynamical laws of other materials. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there are high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations in an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility. Hence, the model development is compelled to introduce further simplifications and the related uncertainties. In light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, their potential for improvement, and their application in practice.
To address these questions, a survey was conducted among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries. In the questionnaire, special attention is paid to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to what degree a simulation result influences decision making in a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the relatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, among others, which are regarded as essential for the hazard assessment and for validating simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on the modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 315
13427 Transesterification of Jojoba Oil Wax Using Microwave Technique

Authors: Moataz Elsawy, Hala F. Naguib, Hilda A. Aziz, Eid A. Ismail, Labiba I. Hussein, Maher Z. Elsabee

Abstract:

Jojoba oil-wax is extracted from the seeds of the jojoba (Simmondsia chinensis Link Schneider), a perennial shrub that grows in semi-desert areas in Egypt and in some parts of the world. The main uses of jojoba oil-wax are in the cosmetics and pharmaceutical industries, but new uses could arise related to the search for new energy crops. This paper summarizes a process to convert jojoba oil-wax to biodiesel by transesterification with ethanol and a series of aliphatic alcohols, using a more economical and energy-saving method in a domestic microwave. The effects of microwave time and power on the extent of transesterification with ethanol and other aliphatic alcohols have been studied. The separation of the alkyl esters from the fatty-alcohol-rich fraction has been achieved in a single crystallization step at low temperature (−18°C) from low-boiling-point petroleum ether. Gas chromatography has been used to follow the transesterification process. All products have been characterized by spectral analysis.

Keywords: jojoba oil, transesterification, microwave, gas chromatography jojoba esters, jojoba alcohol

Procedia PDF Downloads 449
13426 Study of Biofuel Produced by Babassu Oil Fatty Acids Esterification

Authors: F. A. F. da Ponte, J. Q. Malveira, I. A. Maciel, M. C. G. Albuquerque

Abstract:

In this work, aviation biofuel production by esterification of fatty acids (C6 to C16) was studied. The process variables in heterogeneous catalysis were evaluated using an experimental design. Temperature and reaction time were the studied parameters, and the methyl ester content was the response of the experimental design. An ion-exchange resin was used as the heterogeneous catalyst. The process optimization was carried out using response surface methodology (RSM) with a second-order polynomial model. Results show that the most influential variables on each studied effect were temperature and reaction time. The best methyl ester conversion in the experimental design was obtained under the following conditions: 10 wt% of catalyst, 100 °C, and 4 hours of reaction. The best conversion achieved was 96.5 wt% of biofuel.
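A second-order response-surface model of the kind fitted in such studies takes the form y = b0 + b1·x1 + b2·x2 + b11·x1² + b22·x2² + b12·x1·x2 in coded factors (here x1 = temperature, x2 = reaction time, each scaled to [-1, 1]). The sketch below uses purely illustrative coefficients, not the study's fitted values, to show how the model is evaluated and its optimum located by a simple grid search:

```python
# Illustrative coefficients for a quadratic RSM model; NOT the fitted
# values from the study, which are not given in the abstract.
B = {"b0": 90.0, "b1": 3.0, "b2": 2.0, "b11": -4.0, "b22": -1.5, "b12": 0.5}

def predict(x1, x2, b=B):
    """Predicted response (e.g. methyl-ester conversion, %) at coded x1, x2."""
    return (b["b0"] + b["b1"] * x1 + b["b2"] * x2
            + b["b11"] * x1 ** 2 + b["b22"] * x2 ** 2 + b["b12"] * x1 * x2)

def optimise(step=0.05):
    """Grid-search the coded design space [-1, 1] x [-1, 1] for the maximum.

    Returns (y_max, x1_opt, x2_opt).  A coarse grid is enough here because
    the fitted surface is a smooth quadratic.
    """
    n = int(2 / step) + 1
    return max((predict(i * step - 1, j * step - 1), i * step - 1, j * step - 1)
               for i in range(n) for j in range(n))
```

With negative quadratic coefficients the surface is concave, so the interior maximum found by the grid search corresponds to the optimal temperature/time setting that an RSM study reports.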

Keywords: esterification, ion-exchange resins, response surface methodology, biofuel

Procedia PDF Downloads 483
13425 Development of a Triangular Evaluation Protocol in a Multidisciplinary Design Process of an Ergometric Step

Authors: M. B. Ricardo De Oliveira, A. Borghi-Silva, E. Paravizo, F. Lizarelli, L. Di Thomazzo, D. Braatz

Abstract:

Prototypes are a critical feature in the product development process, as they help the project team visualize early concept flaws, communicate ideas, and enable initial product testing. Involving stakeholders, such as consumers and users, in prototype tests allows the gathering of valuable feedback, contributing to a better product and making the design process more participatory. Even though recent studies have shown that user evaluation of prototypes is valuable, few articles provide a method or protocol on how designers should conduct it. This multidisciplinary study (involving the areas of physiotherapy, engineering, and computer science) aims to develop an evaluation protocol, using an ergometric step prototype as the product prototype to be assessed. The protocol consisted of performing two tests (the 2-Minute Step Test and the Portability Test) to allow users (patients) and consumers (physiotherapists) to have an experience with the prototype. Furthermore, the protocol contained four Likert-scale questionnaires (one for users and three for consumers), which asked participants how they perceived the design characteristics of the product (performance, safety, materials, maintenance, portability, usability, and ergonomics) in their use of the prototype. Additionally, the protocol indicated the need to conduct interviews with the product designers, in order to link their feedback to that from the consumers and users. Both tests and interviews were recorded for further analysis. The participation criteria for the study were gender and age for patients, gender and experience with the 2-Minute Step Test for physiotherapists, and level of involvement in the product development project for designers. The questionnaires' reliability was validated using Cronbach's alpha, and the quantitative data from the questionnaires were analyzed using non-parametric hypothesis tests with a significance level of 0.05 (p < 0.05) and descriptive statistics.
As a result, this study provides a concise evaluation protocol which can assist designers in their development process, collecting quantitative feedback from consumers and users, and qualitative feedback from designers.
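As a rough illustration of the reliability check mentioned above, Cronbach's alpha can be computed directly from the item scores. The function below is a minimal pure-Python sketch of the standard formula, not the statistical software used in the study:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of questionnaire items.

    `items` is a list of equal-length lists, one per item, each holding
    that item's score across respondents.  The standard formula is
        alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    where k is the number of items.
    """
    def var(xs):                                  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Perfectly consistent items yield alpha = 1, and an item that carries no shared variance pulls the value toward 0; a common rule of thumb treats alpha above roughly 0.7 as acceptable internal consistency.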

Keywords: product design, product evaluation, prototypes, step

Procedia PDF Downloads 106
13424 Through Seligman’s Lenses: Creating a Culture of Well-Being in Higher-Education

Authors: Neeru Deep, Kimberly McAlister

Abstract:

Mental health issues have been increasing worldwide for many decades, but the COVID-19 pandemic has brought them into the spotlight. Within higher education, the focus on promoting student well-being has dramatically increased. Northwestern State University of Louisiana opened the Center for Positivity, Well-being, and Hope using the action research process of reflecting, planning, acting, and observing. The study's purpose is two-fold: first, it highlights how to create a collaborative team to reflect, plan, and act to develop a well-being culture in higher education institutions; second, it investigates the efficacy of the center through Seligman's lenses. The researchers shared their experience in the first three phases of the action research process and then applied an identical concurrent mixed methods design. A purposive sample was used to evaluate the efficacy of the center through Seligman's lenses. The researchers administered the PERMA-Profiler Measure, the PERMA-Profiler Measure overview, the CoPWH Evaluation I, and the CoPWH Evaluation II questionnaires to collect qualitative and quantitative data. Thematic analysis of the qualitative data and descriptive statistics for the quantitative data led to the conclusion that the center creates a well-being culture and promotes well-being in college students. In conclusion, this action research shares the successful implementation of the cyclic research process in promoting a well-being culture in higher education, with implications for promoting a well-being culture in various educational settings, workplaces, and communities.

Keywords: action research, mixed methods research design, Seligman, well-being

Procedia PDF Downloads 116
13423 Serious Gaming for Behaviour Change: A Review

Authors: Ramy Hammady, Sylvester Arnab

Abstract:

Significant attention has been directed over many years to adopting game interventions to change certain behaviours in disciplines such as health, education, and psychology. This is due to the intrinsic motivation that games can elicit and the substantial impact games can have on the player. Many review papers have been produced to highlight and measure the effectiveness of game interventions in changing behaviours; however, most of these studies neglected the game design process itself and the game features and elements that can stimulate behaviour change. Therefore, this paper aims to identify the game design mechanics and features that are most influential in changing behaviour during or after game interventions. This paper also sheds light on the theories of behaviour change that can guide the game design process. This study gives directions to game designers on how to spot the most influential game features and mechanics for behaviour-change games in order to exploit them in the same manner.

Keywords: behaviour change, game design, serious gaming, gamification, review

Procedia PDF Downloads 197
13422 Electroremediation of Saturated and Unsaturated Nickel-Contaminated Soils

Authors: Waddah Abdullah, Saleh Al-Sarem

Abstract:

Electrokinetic remediation has been proven to be one of the most efficient techniques for cleaning up soils contaminated with polar charged contaminants (such as heavy metals) and non-polar organic contaminants. It can be efficiently used to clean up low-permeability mud, wastewater, electroplating wastes, sludge, and marine dredgings. This study presents and discusses the results of electrokinetic remediation processes used to clean up soils contaminated with nickel. Two types of electrokinetic cells were used: an open cell and an advanced cylindrical cell. Two types of soil were used for this investigation: the Azraq green clay, which has very low permeability and was taken from the eastern part of Jordan (city of Azraq), and a sandy soil with relatively very high permeability. The clayey soil was spiked with 500 ppm of nickel, and the sandy soil was spiked with 1500 ppm of nickel. Fully saturated and partially saturated clayey soils were used for the clean-up process. Clayey soils were tested under direct currents of 80 mA and 50 mA to study the effect of the electrical current on the remediation process. A chelating agent, disodium ethylenediaminetetraacetic acid (Na-EDTA), was used in both types of soil to enhance the electroremediation process. The effect of the presence of carbonates in the contaminated soils was also investigated by the use of sodium carbonate and calcium carbonate. pH changes in the anode and cathode compartments were controlled by the use of buffer solutions. The results of the investigation showed that the fully saturated clayey soil spiked with nickel had an average removal efficiency of 64%, while the average removal efficiency was 46% for the unsaturated clayey soil. For the sandy soil, the average removal efficiency of nickel was 90%. Test results showed that the presence of carbonates in the remediated soils retarded the clean-up process of nickel-contaminated soils (removal efficiency was reduced from 90% to 60%).
EDTA-enhanced decontamination of nickel-contaminated clayey and sandy soils with carbonates was studied. The average removal efficiency increased from 60% (prior to using EDTA) to more than 90% after using EDTA.

Keywords: buffer solution, EDTA, electroremediation, nickel removal efficiency

Procedia PDF Downloads 171
13421 Optimization and Feasibility Analysis of a PV/Wind/Battery Hybrid Energy Conversion

Authors: Doaa M. Atia, Faten H. Fahmy, Ninet M. A. El-Rahman, Hassan T. Dorra

Abstract:

In this paper, the optimum design of a renewable energy system powering an aquaculture pond was determined. The Hybrid Optimization Model for Electric Renewables (HOMER) software, developed by the U.S. National Renewable Energy Laboratory (NREL), is used for analyzing the feasibility of the stand-alone and hybrid systems in this study. HOMER determines whether the renewable energy resources satisfy the hourly electric demand. The program calculates the energy balance for each of the 8760 hours in a year to simulate operation of the system. This optimization compares the demand for electrical energy in each hour of the year with the energy supplied by the system in that hour, and calculates the relevant energy flows for each component in the model. The essential principle is to minimize the total system cost while ensuring that the load is served. Moreover, the feasibility analysis of the energy system is also studied. Wind speed, solar irradiance, interest rate, and capacity shortage are the parameters taken into consideration. The simulation results indicate that the hybrid system is the best choice in this study, yielding the lowest net present cost. Thus, it provides higher system performance than PV or wind stand-alone systems.
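The hour-by-hour energy balance that HOMER performs can be sketched in simplified form. The function below illustrates the principle only (losses, inverter limits, minimum state of charge, and cost accounting are all omitted); it is not HOMER's actual algorithm:

```python
def unmet_load(load, pv, wind, batt_cap, soc0=0.0):
    """Simplified hourly energy balance for a PV/wind/battery system.

    For each hour, renewable supply (pv + wind) is compared with the
    load: a surplus charges the battery up to `batt_cap`, and a deficit
    is met from the battery.  Returns the total unmet load (capacity
    shortage) over the horizon.  All quantities in kWh; in a real run
    the lists would each hold 8760 hourly values.
    """
    soc, unmet = soc0, 0.0
    for l, p, w in zip(load, pv, wind):
        net = p + w - l                      # surplus (+) or deficit (-)
        if net >= 0:
            soc = min(soc + net, batt_cap)   # charge; excess is spilled
        else:
            draw = min(-net, soc)            # discharge what the battery holds
            soc -= draw
            unmet += -net - draw             # remaining shortage this hour
    return unmet
```

HOMER evaluates a balance of this kind for every candidate component sizing, then ranks the feasible configurations by net present cost; this sketch shows only the feasibility half of that loop.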

Keywords: wind stand-alone system, photovoltaic stand-alone system, hybrid system, optimum system sizing, feasibility, cost analysis

Procedia PDF Downloads 327
13420 Determination of Verapamil Hydrochloride in Tablets and Injection Solutions with the Verapamil-Selective Electrode and Possibilities of Application in Pharmaceutical Analysis

Authors: Faisal A. Salih

Abstract:

Verapamil hydrochloride (Ver) is a drug used in medicine as a calcium channel blocker for arrhythmia, angina, and hypertension. For the quantitative determination of Ver in dosage forms, the HPLC method is most often used. A convenient alternative to the chromatographic method is potentiometry using a Ver-selective electrode, which does not require expensive equipment and can be used without separation from the matrix components, significantly reducing the analysis time; it also avoids toxic organic solvents, making it a "green", environmentally friendly technique. It has been established in this study that a rational choice of the membrane plasticizer and of the preconditioning and measurement algorithms, which prevent non-exchangeable extraction of Ver into the membrane phase, makes it possible to achieve excellent analytical characteristics in Ver-selective electrodes based on commercially available components. In particular, an electrode with the following membrane composition: PVC (32.8 wt%), ortho-nitrophenyl octyl ether (66.6 wt%), and tetrakis(4-chlorophenyl)borate (0.6 wt%, or 0.01 M), has a lower detection limit of 4 × 10⁻⁸ M and a potential reproducibility of 0.15–0.22 mV. Both direct potentiometry (DP) and potentiometric titration (PT) can be used for the determination of Ver in tablets and injection solutions. The masses of Ver per average tablet weight determined by DP and PT for the same set of 10 tablets were (80.4 ± 0.2) and (80.7 ± 0.2) mg, respectively. The masses of Ver in solutions for injection, determined by DP for two ampoules from one set, were (5.00 ± 0.015) and (5.004 ± 0.006) mg. In all cases, good reproducibility and excellent agreement with the declared quantities were observed.
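Direct potentiometry converts a measured electrode potential into a concentration through the electrode's calibration function; for an ideally Nernstian electrode this is the relation sketched below. The function is illustrative only: a real determination uses the measured calibration slope and standard solutions, and activities rather than concentrations at higher ionic strength.

```python
def concentration(e_mv, e0_mv, slope_mv=59.16, z=1):
    """Concentration from a direct-potentiometry reading.

    Assumes an ideal Nernstian response E = E0 + S * log10(c) for an
    ion of charge z, with S = 59.16/z mV per decade at 25 C.  Inverting
    gives c = 10 ** ((E - E0) / S).  E0 and the true slope come from
    calibration against standards in practice.
    """
    return 10 ** ((e_mv - e0_mv) / (slope_mv / z))
```

For a monovalent cation such as the verapamil ion, each 59.16 mV drop in potential below E0 corresponds to one decade lower concentration, which is why a sub-millivolt potential reproducibility (0.15–0.22 mV here) translates into a concentration precision of well under 1%.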

Keywords: verapamil, potentiometry, ion-selective electrode, pharmaceutical analysis

Procedia PDF Downloads 72
13419 Cut-Out Animation as a Technique and Its Development within the Historical Process

Authors: Armagan Gokcearslan

Abstract:

The art of animation has developed very rapidly, in terms of script, sound and music, motion, character design, and the techniques and technological tools being used, from its first years until today. Technical variety attracts particular attention in the art of animation. Perceived as a kind of illusion in the beginning, animations commonly used the flash sketch technique: animation artists using this technique created scenes by drawing them on a blackboard with chalk. The flash sketch technique was used by pioneering animation artists such as Émile Cohl, Winsor McCay, and J. Stuart Blackton. Then devices such as the magic lantern, thaumatrope, phenakistiscope, and zoetrope were developed and used intensely in the first years of the art of animation. Today, on the other hand, the art of animation is shaped by developments in computer technology, and it is possible to create three-dimensional and two-dimensional animations with the help of various computer software. The cut-out technique is among the important techniques used in the art of animation. The cut-out animation technique is based on the art of paper cutting, and examining cut-out animations, it is observed that they technically resemble that art. The art of paper cutting has a deep-rooted history: its oldest examples can be found in China in the period after the 2nd century B.C., when the Chinese invented paper. The most popular artist using the cut-out animation technique is the German artist Lotte Reiniger. This study, titled "Cut-Out Animation as a Technique and Its Development within the Historical Process", embraces the art of paper cutting, the relationship between paper cutting and cut-out animation, its development within the historical process, the animation artists producing artworks in this field, important cut-out animations, and their technical properties.

Keywords: cut-out, paper art, animation, technique

Procedia PDF Downloads 257