Search results for: direct displacement based design
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37810

24850 Streamlining Cybersecurity Risk Assessment for Industrial Control and Automation Systems: Leveraging the National Institute of Standards and Technology’s Risk Management Framework (RMF) Using Model-Based Systems Engineering (MBSE)

Authors: Gampel Alexander, Mazzuchi Thomas, Sarkani Shahram

Abstract:

The cybersecurity landscape is constantly evolving, and organizations must adapt to the changing threat environment to protect their assets. The implementation of the NIST Risk Management Framework (RMF) has become critical in ensuring the security and safety of industrial control and automation systems. However, cybersecurity professionals are facing challenges in implementing RMF, leading to systems operating without authorization and being non-compliant with regulations. The current approach to RMF implementation based on business practices is limited and insufficient, leaving organizations vulnerable to cyberattacks resulting in the loss of personal consumer data and critical infrastructure details. To address these challenges, this research proposes a Model-Based Systems Engineering (MBSE) approach to implementing cybersecurity controls and assessing risk through the RMF process. The study emphasizes the need to shift to a modeling approach, which can streamline the RMF process and eliminate bloated structures that make it difficult to receive an Authorization-To-Operate (ATO). The study focuses on the practical application of MBSE in industrial control and automation systems to improve the security and safety of operations. It is concluded that MBSE can be used to solve the implementation challenges of the NIST RMF process and improve the security of industrial control and automation systems. The research suggests that MBSE provides a more effective and efficient method for implementing cybersecurity controls and assessing risk through the RMF process. The future work for this research involves exploring the broader applicability of MBSE in different industries and domains. The study suggests that the MBSE approach can be applied to other domains beyond industrial control and automation systems.

Keywords: authorization-to-operate (ATO), industrial control systems (ICS), model-based systems engineering (MBSE), risk management framework (RMF)

Procedia PDF Downloads 87
24849 Removal of Heavy Metals from Municipal Wastewater Using Constructed Rhizofiltration System

Authors: Christine A. Odinga, G. Sanjay, M. Mathew, S. Gupta, F. M. Swalaha, F. A. O. Otieno, F. Bux

Abstract:

Wastewater discharged from municipal treatment plants contains an amalgamation of trace metals. The presence of metal pollutants in wastewater poses a huge challenge to the choice and application of the preferred treatment method. Conventional treatment methods are inefficient in the removal of trace metals due to their design approach. This study evaluated the treatment performance of a constructed rhizofiltration system in the removal of heavy metals from municipal wastewater. The study was conducted at an eThekwini municipal wastewater treatment plant in Kingsburgh, Durban, in the province of KwaZulu-Natal. The pilot-scale rhizofiltration unit was constructed with three different layers of substrate consisting of medium stones, coarse gravel and fine sand. The system had one section planted with Phragmites australis L. and Kyllinga nemoralis L., while the other section was unplanted and acted as the control. Influent, effluent and sediment from the system were sampled and assessed for the presence and removal of selected trace heavy metals using standard methods. The efficiency of metal removal was established by gauging the transfer of metals into the leaves, roots and stems of the plants, with calculations based on standard statistical packages. The Langmuir model was used to assess the heavy metal adsorption mechanisms of the plants. Heavy metals accumulated in the entire rhizofiltration system at varying percentages: 96.69% on the planted side and 48.98% on the control side for cadmium. Chromium was 81% and 24%, copper was 23.4% and 1.1%, nickel was 72% and 46.5%, lead was 63% and 31%, while zinc was 76% and 84% in the water and sediment of the planted and control sides of the rhizofilter, respectively. The decrease in metal adsorption efficiencies on the planted side followed the pattern Cd>Cr>Zn>Ni>Pb>Cu, and Ni>Cd>Pb>Cr>Cu>Zn on the control side. Confirmatory analysis using scanning electron microscopy revealed that higher amounts of metals were deposited in the root system, with values of 0.015 mg/kg (Cr), 0.250 mg/kg (Cu) and 0.030 mg/kg (Pb) for P. australis, and 0.055 mg/kg (Cr), 0.470 mg/kg (Cu) and 0.210 mg/kg (Pb) for K. nemoralis, respectively. The system was found to be efficient in removing and reducing metals from wastewater, and further research is necessary to establish the immediate mechanisms that the plants employ to achieve these reductions.
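
The Langmuir analysis mentioned above can be reproduced with a short calculation. The sketch below is illustrative only; the parameter values (q_max, K_L) are hypothetical placeholders, not the study's fitted constants.

```python
# Minimal sketch of the Langmuir isotherm used to describe metal adsorption:
#   q_e = (q_max * K_L * C_e) / (1 + K_L * C_e)
# q_e: equilibrium uptake (mg/g), C_e: equilibrium concentration (mg/L).
# q_max and K_L below are hypothetical values, not the study's fitted constants.

def langmuir_uptake(c_e: float, q_max: float, k_l: float) -> float:
    """Equilibrium uptake predicted by the Langmuir isotherm."""
    return (q_max * k_l * c_e) / (1.0 + k_l * c_e)

if __name__ == "__main__":
    q_max, k_l = 5.0, 0.8          # assumed capacity (mg/g) and affinity (L/mg)
    for c_e in (0.1, 0.5, 1.0, 5.0):
        print(f"C_e = {c_e:4.1f} mg/L -> q_e = {langmuir_uptake(c_e, q_max, k_l):.3f} mg/g")
```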

Keywords: wastewater treatment, Phragmites australis L., Kyllinga nemoralis L., heavy metals, pathogens, rhizofiltration

Procedia PDF Downloads 259
24848 Metabolic Variables and Associated Factors in Acute Pancreatitis Patients Correlate with Health-Related Quality of Life

Authors: Ravinder Singh, Pratima Syal

Abstract:

Background: The rising prevalence and incidence of acute pancreatitis (AP) and its associated metabolic variables, known as the metabolic syndrome (MetS), are common medical conditions with catastrophic consequences and substantial treatment costs. The correlation between MetS and AP, as well as their impact on Health-Related Quality of Life (HRQoL), is uncertain, and because there are so few published studies, further research is needed. As a result, we planned this study to determine the relationship between MetS components and their impact on HRQoL in AP patients. Patients and Methods: A prospective, observational study involving the recruitment of patients with AP with and without MetS was carried out in a tertiary care hospital of North India. Patients were classified as having AP if they met two or more of the following criteria: abdominal pain; serum amylase and lipase levels two or more times normal; and imaging findings on trans-abdominal ultrasound, computed tomography, or magnetic resonance. The National Cholesterol Education Program–Adult Treatment Panel III (NCEP-ATP III) criteria were used to diagnose MetS. The various socio-demographic variables were also taken into consideration for the calculation of statistical significance (P≤.05) in AP patients. Finally, the correlation between AP and MetS, along with their impact on HRQoL, was assessed using Student's t-test, the Pearson correlation coefficient, and the Short Form-36 (SF-36). Results: AP with MetS (n = 100) and AP without MetS (n = 100) patients were divided into two groups. Gender, age, educational status, tobacco use, body mass index (BMI), and waist-hip ratio (WHR) were the socio-demographic parameters found to be statistically significant (P≤.05) in AP patients with MetS. All the metabolic variables except HDL levels were also found to be statistically significant (P≤.05) and increased in patients with AP with MetS as compared to AP without MetS. Using the SF-36 form, a significantly greater decline was observed in the physical component summary (PCS) and mental component summary (MCS) in patients with AP with MetS as compared to patients without MetS (P≤.05). Furthermore, a negative association between all metabolic variables, with the exception of HDL, and AP was found, producing deterioration in PCS and MCS. Conclusion: The study demonstrated that patients with AP with MetS had a worse overall HRQoL than patients with AP without MetS, due to a number of socio-demographic and metabolic variables that directly correlate with and impact the physical and mental health of patients.

Keywords: metabolic disorders, QOL, cost effectiveness, pancreatitis

Procedia PDF Downloads 112
24847 Analytical Model of Locomotion of a Thin-Film Piezoelectric 2D Soft Robot Including Gravity Effects

Authors: Zhiwu Zheng, Prakhar Kumar, Sigurd Wagner, Naveen Verma, James C. Sturm

Abstract:

Soft robots have drawn great interest recently due to the rich range of possible shapes and motions they can take on to address new applications, compared to traditional rigid robots. Large-area electronics (LAE) provides a unique platform for creating soft robots by leveraging thin-film technology to enable the integration of a large number of actuators, sensors, and control circuits on flexible sheets. However, the rich shapes and motions possible, especially when interacting with complex environments, pose significant challenges to forming well-generalized and robust models necessary for robot design and control. In this work, we describe an analytical model for predicting the shape and locomotion of a flexible (steel-foil-based) piezoelectric-actuated 2D robot based on Euler-Bernoulli beam theory. Nominally (unpowered), the robot lies flat on the ground; when powered, its shape is controlled by an array of piezoelectric thin-film actuators. Key features of the model are its ability to incorporate the significant effects of gravity on the shape and to precisely predict the spatial distribution of friction against the contacting surfaces, necessary for determining inchworm-type motion. We verified the model by developing a distributed discrete-element representation of a continuous piezoelectric actuator and by comparing its analytical predictions to discrete-element robot simulations using PyBullet. Without gravity, predicting the shape of a sheet with a linear array of piezoelectric actuators at arbitrary voltages is straightforward. However, gravity significantly distorts the shape of the sheet, causing some segments to flatten against the ground. Our work includes the following contributions: (i) A self-consistent approach was developed to exactly determine which parts of the soft robot are lifted off the ground, and the exact shape of these sections, for an arbitrary array of piezoelectric voltages and configurations. (ii) Inchworm-type motion relies on controlling the relative friction with the ground surface in different sections of the robot. By adding torque balance to our model and analyzing shear forces, the model can determine the exact spatial distribution of the vertical force that the ground exerts on the soft robot. From this, the spatial distribution of friction forces between ground and robot can be determined. (iii) By combining this spatial friction distribution with the shape of the soft robot, as a function of time as the piezoelectric actuator voltages are changed, the inchworm-type locomotion of the robot can be determined. As a practical example, we calculated the performance of a 5-actuator system on a 50-µm-thick steel foil. Piezoelectric properties of commercially available thin-film piezoelectric actuators were assumed. The model predicted inchworm motion of up to 200 µm per step. For independent verification, we also modelled the system using PyBullet, a discrete-element robot simulator. To model a continuous thin-film piezoelectric actuator, we broke each actuator into multiple segments, each of which consisted of two rigid arms with appropriate mass connected with a 'motor' whose torque was set by the applied actuator voltage. Excellent agreement between our analytical model and the discrete-element simulator was shown for both the full deformation shape and the motion of the robot.
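
For readers unfamiliar with how a segment-wise model of this kind produces a robot shape, the sketch below illustrates the general idea: each actuated segment contributes a curvature, and integrating angle and position along the sheet yields the deformed profile. It is a simplified illustration only, not the authors' analytical model; the voltage-to-curvature gain and segment lengths are hypothetical, and the gravity and ground-contact treatment described in the abstract is omitted.

```python
import math

# Illustrative 2D shape reconstruction for a segmented piezo-actuated sheet.
# Each segment bends with a curvature proportional to its applied voltage
# (hypothetical gain); angle and position are integrated along the sheet.
# Gravity and ground contact, central to the paper's model, are not included.

def sheet_profile(voltages, seg_len=0.01, gain=0.5):
    """Return (x, y) coordinates of segment endpoints for the given voltages."""
    x, y, theta = [0.0], [0.0], 0.0
    for v in voltages:
        kappa = gain * v                      # assumed curvature per volt (1/m per V)
        theta += kappa * seg_len              # bending angle accumulated over segment
        x.append(x[-1] + seg_len * math.cos(theta))
        y.append(y[-1] + seg_len * math.sin(theta))
    return x, y

if __name__ == "__main__":
    xs, ys = sheet_profile([0.0, 20.0, 20.0, 0.0, -10.0])   # hypothetical voltages
    for i, (px, py) in enumerate(zip(xs, ys)):
        print(f"node {i}: x = {px * 1e3:6.2f} mm, y = {py * 1e3:6.2f} mm")
```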

Keywords: analytical modeling, piezoelectric actuators, soft robot locomotion, thin-film technology

Procedia PDF Downloads 173
24846 Impact of a Solar System Designed to Improve the Microclimate of an Agricultural Greenhouse

Authors: Nora Arbaoui, Rachid Tadili, Ilham Ihoume

Abstract:

The improvement of agricultural production and food preservation processes requires the introduction of heating and cooling techniques in greenhouses. To develop these techniques, our work proposes the design of an integrated and autonomous solar system for heating, cooling, and conservation of the agricultural production in greenhouses. The hot air produced by the greenhouse effect during the day will be evacuated to compartments annexed to the greenhouse to dry the surplus agricultural production that is not sold on the market. In this paper, we give a description of this solar system and the calculation of the volume of fluid used for heat storage, whose stored heat will be released during the night.
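
A sensible-heat sizing relation of the kind implied here can be sketched as follows; the numbers are hypothetical placeholders, not the paper's design values.

```python
# Sketch of sizing a sensible-heat storage fluid volume:
#   V = Q / (rho * c_p * dT)
# Q: heat to be stored (J), rho: fluid density (kg/m^3),
# c_p: specific heat (J/kg.K), dT: allowed temperature swing (K).
# All numbers below are hypothetical placeholders, not the paper's design values.

def storage_volume(q_joules: float, rho: float, c_p: float, delta_t: float) -> float:
    return q_joules / (rho * c_p * delta_t)

if __name__ == "__main__":
    q = 50e6                     # assumed daytime heat surplus to store, J
    rho, c_p = 1000.0, 4186.0    # water as the storage fluid
    dt = 20.0                    # allowed day-night temperature swing, K
    print(f"required fluid volume ~ {storage_volume(q, rho, c_p, dt):.2f} m^3")
```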

Keywords: solar system, agricultural greenhouse, heating, cooling, storage, drying

Procedia PDF Downloads 99
24845 Factors That Influence Decision Making of Foreign Volunteer Tourists in Thailand

Authors: Paramet Damchoo

Abstract:

The purpose of this study is to examine the factors that influence the decision making of foreign volunteer tourists in Thailand. A sample of 400 was drawn from 10 provinces of Thailand using a cluster sampling method. Factor analysis was used to analyse the data. The findings indicate that volunteer tourism based in Thailand comprised a total of 45 activities, which could be divided into 4 categories. Most of these tourists (54.50 percent) were from Europe, including the UK and Scandinavia. Moreover, more of the tourists were male than female, and 63.50 percent of them were younger than 20 years old. It was also found that 67.00 percent of the tourists used websites to find where the volunteer tourism was based. Finally, the factors that influence the decision making of foreign volunteer tourists in Thailand consist of a wide variety of activities, flexibility in those activities, and low prices.

Keywords: decision making, volunteer tourism, special interest tourism, GAP year

Procedia PDF Downloads 337
24844 River Network Delineation from Sentinel 1 Synthetic Aperture Radar Data

Authors: Christopher B. Obida, George A. Blackburn, James D. Whyatt, Kirk T. Semple

Abstract:

In many regions of the world, especially in developing countries, river network data are outdated or completely absent, yet such information is critical for supporting important functions such as flood mitigation efforts, land use and transportation planning, and the management of water resources. In this study, a method was developed for delineating river networks using Sentinel 1 imagery. Unsupervised classification was applied to multi-temporal Sentinel 1 data to discriminate water bodies from other land covers, then the outputs were combined to generate a single persistent water bodies product. A thinning algorithm was then used to delineate river centre lines, which were converted into vector features and built into a topologically structured geometric network. The complex river system of the Niger Delta was used to compare the performance of the Sentinel-based method against alternative freely available water body products from the United States Geological Survey, the European Space Agency and OpenStreetMap, and against a river network derived from a Shuttle Radar Topography Mission Digital Elevation Model. From both raster-based and vector-based accuracy assessments, it was found that the Sentinel-based river network products were superior to the comparator data sets by a substantial margin. The geometric river network that was constructed permitted a flow routing analysis, which is important for a variety of environmental management and planning applications. The extracted network will potentially be applied for modelling the dispersion of hydrocarbon pollutants in Ogoniland, a part of the Niger Delta. The approach developed in this study holds considerable potential for generating up-to-date, detailed river network data for the many countries where such data are deficient.
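
The processing chain described above (unsupervised water classification, multi-temporal combination, thinning to centrelines) can be sketched roughly as below. This is not the authors' implementation; the two-class k-means step and the 70% persistence threshold are assumptions chosen for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from skimage.morphology import skeletonize

# Rough sketch of the delineation chain: classify each SAR backscatter image
# into water / non-water, keep pixels that are water on most dates
# (persistent water bodies), then thin the mask to river centrelines.
# The k-means classifier and the persistence threshold are illustrative assumptions.

def water_mask(backscatter_db: np.ndarray) -> np.ndarray:
    """Unsupervised 2-class split of one image; water is the darker cluster."""
    flat = backscatter_db.reshape(-1, 1)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(flat)
    means = [flat[labels == k].mean() for k in (0, 1)]
    water_label = int(np.argmin(means))          # lower backscatter = water
    return (labels == water_label).reshape(backscatter_db.shape)

def river_centrelines(images, persistence: float = 0.7) -> np.ndarray:
    """Combine per-date masks into persistent water, then skeletonize."""
    stack = np.stack([water_mask(img) for img in images])
    persistent_water = stack.mean(axis=0) >= persistence
    return skeletonize(persistent_water)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = [rng.normal(-12, 2, (64, 64)) for _ in range(3)]   # synthetic dB images
    centres = river_centrelines(demo)
    print("centreline pixels:", int(centres.sum()))
```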

Keywords: Sentinel 1, image processing, river delineation, large scale mapping, data comparison, geometric network

Procedia PDF Downloads 135
24843 The Performance and the Induced Rebar Corrosion of Acrylic Resins for Injection Systems in Concrete Structures

Authors: C. S. Paglia, E. Pesenti, A. Krattiger

Abstract:

Commercially available methacrylate and acrylamide-based acrylic resins for injection in concrete systems have been tested with respect to their sealing performance and induced rebar corrosion. Among the different resins, a methacrylate-based type of acrylic resin significantly inhibited rebar corrosion. This was mainly caused by the relatively high pH of the resin and of the resin aqueous solution. This resin also exhibited a relatively high sealing performance, in particular after exposing the resin to durability tests. The corrosion inhibition behaviour and the sealing properties after exposure to durability tests were maintained up to one year. The other resins either promoted the corrosion of the rebar and/or exhibited relatively low sealing properties.

Keywords: acrylic resin, sealing performance, rebar corrosion, materials

Procedia PDF Downloads 125
24842 Piezoelectric-Based Passive Vibration Control of a Composite Turbine Blade Using a Shunt Circuit

Authors: Kouider Bendine, Zouaoui Satla, Boukhoulda Farouk Benallel, Shun-Qi Zhang

Abstract:

Turbine blades are subjected to a variety of loads that lead to undesirable vibration. Such vibration can cause serious damage or even lead to a total failure of the blade. The present paper addresses the vibration control of a turbine blade. The study aims to propose a passive vibration control using piezoelectric material. The passive control is effectuated by shunting an RL circuit to the piezoelectric patch in a parallel configuration. To this end, a finite element model of the blade with the piezoelectric patch is implemented in ANSYS APDL. The model is then subjected to a harmonic frequency-based analysis for the cases of control on and off. The results show that the proposed methodology was able to reduce the blade vibration by 18%.
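
For context, an RL shunt on a piezoelectric patch is normally tuned so that its electrical resonance matches the mechanical mode to be damped. The sketch below shows that tuning arithmetic only; the capacitance and target frequency are hypothetical values, not the blade or patch data of the paper, and the exact optimal resistance depends on the coupling coefficient and circuit topology.

```python
import math

# Sketch of the tuning arithmetic behind an RL shunt on a piezoelectric patch:
# the shunt inductance is chosen so that the electrical resonance
#   f_e = 1 / (2*pi*sqrt(L * C_p))
# matches the mechanical mode to be damped. C_p and f_mode are hypothetical.

def shunt_inductance(f_mode_hz: float, c_p_farad: float) -> float:
    """Inductance that places the electrical resonance at f_mode_hz."""
    omega = 2.0 * math.pi * f_mode_hz
    return 1.0 / (omega ** 2 * c_p_farad)

if __name__ == "__main__":
    c_p = 50e-9          # assumed patch capacitance, 50 nF
    f_mode = 120.0       # assumed blade mode to damp, Hz
    L = shunt_inductance(f_mode, c_p)
    z0 = math.sqrt(L / c_p)   # characteristic impedance, a scale for choosing R
    print(f"L ~ {L:.1f} H, sqrt(L/C) ~ {z0 / 1e3:.1f} kOhm")
```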

Keywords: blade, active piezoelectric vibration control, finite element, shunt circuit

Procedia PDF Downloads 96
24841 An Object-Oriented Modelica Model of the Water Level Swell during Depressurization of the Reactor Pressure Vessel of the Boiling Water Reactor

Authors: Rafal Bryk, Holger Schmidt, Thomas Mull, Ingo Ganzmann, Oliver Herbst

Abstract:

Prediction of the two-phase water mixture level during fast depressurization of the Reactor Pressure Vessel (RPV) resulting from an accident scenario is an important issue from the viewpoint of reactor safety. Since the level swell may influence the behavior of some passive safety systems, it has been recognized that an assumption which at the beginning may be considered a conservative one does not necessarily lead to a conservative result. This paper discusses outcomes obtained during simulations of the water dynamics and heat transfer during sudden depressurization of a vessel filled up to a certain level with liquid water under saturation conditions, with the rest of the vessel occupied by saturated steam. In the case of a pressure decrease, e.g. due to a main steam line break, the liquid water evaporates abruptly, thereby causing strong transients in the vessel. These transients and the sudden emergence of void in the region occupied at the beginning by liquid cause elevation of the two-phase mixture. In this work, several models calculating the water collapse and swell levels are presented and validated against experimental data. Each of the models uses a different approach to calculate the void fraction. The object-oriented models were developed with the Modelica modelling language and the OpenModelica environment. The models represent the RPV of the Integral Test Facility Karlstein (INKA), a dedicated test rig for simulation of KERENA, a new Boiling Water Reactor design of Framatome. The models are based on dynamic mass and energy equations. They are divided into several dynamic volumes, in each of which the fluid may be single-phase liquid, steam or a two-phase mixture. The heat transfer between the wall of the vessel and the fluid is taken into account. An additional heat flow rate may be applied to the first volume of the vessel in order to simulate the decay heat of the reactor core in a similar manner as it is simulated at INKA. The comparison of the simulation results against the reference data shows a good agreement.
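
To make the model structure concrete, a single dynamic volume of the kind described can be written as a pair of mass and energy balances. The Python sketch below (the paper's implementation is in Modelica) uses placeholder property data and boundary flows; it shows the equation structure only, not the INKA/KERENA model.

```python
from dataclasses import dataclass

# One dynamic volume as used in level-swell models: mass and energy balances
#   dM/dt = m_in - m_out
#   dU/dt = m_in*h_in - m_out*h_out + Q_wall
# A real model would close these with steam-table properties (e.g. IAPWS) to get
# the void fraction and mixture level from (M, U, V); that step is omitted here.

@dataclass
class VolumeState:
    mass: float      # kg
    energy: float    # J (internal energy)

def step(state: VolumeState, dt: float,
         m_in: float, h_in: float, m_out: float, h_out: float,
         q_wall: float) -> VolumeState:
    """Explicit-Euler update of the mass and energy balances for one volume."""
    dm = (m_in - m_out) * dt
    du = (m_in * h_in - m_out * h_out + q_wall) * dt
    return VolumeState(state.mass + dm, state.energy + du)

if __name__ == "__main__":
    s = VolumeState(mass=500.0, energy=500.0 * 1.0e6)   # hypothetical initial state
    for _ in range(10):
        s = step(s, dt=0.1, m_in=0.0, h_in=0.0, m_out=2.0, h_out=2.7e6, q_wall=1.0e5)
    print(f"mass = {s.mass:.1f} kg, energy = {s.energy / 1e6:.2f} MJ")
```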

Keywords: boiling water reactor, level swell, Modelica, RPV depressurization, thermal-hydraulics

Procedia PDF Downloads 206
24840 Seismic Response of a Structure Using a Three-Degree-of-Freedom Shake Table

Authors: Ketan N. Bajad, Manisha V. Waghmare

Abstract:

Earthquakes are among the biggest threats to civil engineering structures, as every year they cause billions of dollars of losses and thousands of deaths around the world. There are various experimental techniques, such as the pseudo-dynamic test (a nonlinear structural dynamic technique), the real-time pseudo-dynamic test, and the shake table test, that can be employed to verify the seismic performance of structures. A shake table is a device that is used for shaking structural models or building components which are mounted on it. It simulates a seismic event using existing seismic data, nearly truly reproducing earthquake inputs. This paper deals with the use of the shake table test method to check the response of a structure subjected to an earthquake. The various types of shake table are the vertical shake table, horizontal shake table, servo-hydraulic shake table and servo-electric shake table. The goal of this experiment is to perform seismic analysis of a civil engineering structure with the help of a three-degree-of-freedom (i.e. X, Y and Z directions) shake table. A three-DOF shake table is a useful experimental apparatus, as it imitates a desired real-time acceleration vibration signal for evaluating and assessing the seismic performance of a structure. This study proceeds with the design and erection of a 3-DOF shake table by a trial-and-error method. The table is designed to have a capacity of up to 981 newtons. Further, to study the seismic response of a steel industrial building, a proportionately scaled-down model is fabricated and tested on the shake table. An accelerometer is mounted on the model and used for recording the data. The experimental results obtained are further validated with the results obtained from software. It is found that the model can be used to determine how the structure behaves in response to an applied earthquake motion, but the model cannot be used for direct numerical conclusions (such as stiffness, deflection, etc.), as many uncertainties are involved in scaling a small-scale model. The model shows modal forms and gives rough deflection values. The experimental results demonstrate the shake table as the most effective and the best of all methods available for seismic assessment of a structure.

Keywords: accelerometer, three degree of freedom shake table, seismic analysis, steel industrial shed

Procedia PDF Downloads 133
24839 Improvements and Implementation Solutions to Reduce the Computational Load for Traffic Situational Awareness with Alerts (TSAA)

Authors: Salvatore Luongo, Carlo Luongo

Abstract:

This paper discusses implementation solutions to reduce the computational load for the Traffic Situational Awareness with Alerts (TSAA) application, based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology. In 2008, there were 23 total mid-air collisions involving general aviation fixed-wing aircraft, 6 of which were fatal, leading to 21 fatalities. These collisions occurred during visual meteorological conditions, indicating the limitations of the see-and-avoid concept for mid-air collision avoidance as defined by the Federal Aviation Administration (FAA). Commercial aviation aircraft are already equipped with a collision avoidance system called TCAS, which is based on classic transponder technology. This system dramatically reduced the number of mid-air collisions involving air transport aircraft. In general aviation, the same reduction in mid-air collisions has not occurred, so this reduction is the main objective of the TSAA application. The major difference between the original conflict detection application and the TSAA application is that conflict detection is focused on preventing loss of separation in en-route environments, whereas TSAA is devoted to reducing the probability of mid-air collision in all phases of flight. The TSAA application increases the flight crew's traffic situation awareness by providing alerts for traffic detected in conflict with ownship, in support of the see-and-avoid responsibility. Considerable effort has been spent in the design process and the code generation in order to maximize the efficiency and performance in terms of computational load and memory consumption reduction. The TSAA architecture is divided into two high-level systems: the “Threats database” and the “Conflict detector”. The first one receives the traffic data from the ADS-B device and stores each target's data history. The conflict detector module estimates ownship and target trajectories in order to detect possible future losses of separation between ownship and each target. Finally, the alerts are verified by additional conflict verification logic, in order to prevent possible undesirable behaviors of the alert flag. In order to reduce the computational load, a pre-check evaluation module is used. This pre-check is only a computational optimization, so the performance of the conflict detector system is not modified in terms of the number of alerts detected. The pre-check module uses analytical trajectory propagation for both the target and ownship. This allows greater accuracy and avoids the step-by-step propagation, which requires a greater computational load. Furthermore, the pre-check permits excluding targets that are certainly not threats, using an efficient analytical geometrical approach, in order to decrease the computational load for the following modules. This software improvement is not suggested by FAA documents, and so it is the main innovation of this work. The efficiency and efficacy of this enhancement are verified using fast-time and real-time simulations and by execution on a real device in several FAA scenarios. The final implementation also permits FAA software certification in compliance with the DO-178B standard. The computational load reduction allows the installation of the TSAA application also on devices with multiple applications and/or low capacity in terms of available memory and computational capabilities.
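
A geometric pre-check of the kind described can be illustrated with a straight-line closest-point-of-approach (CPA) test: if, under analytical propagation, a target's miss distance can never fall below a protection threshold within the look-ahead time, the target is discarded before the more expensive conflict detection. The sketch below is a generic CPA filter, not the certified TSAA code; the look-ahead time and protection radius are hypothetical.

```python
import math

# Generic closest-point-of-approach (CPA) pre-check, illustrating how targets
# that cannot become a threat within the look-ahead window are excluded early.
# The thresholds below are hypothetical, not TSAA alerting criteria.

def cpa(rel_pos, rel_vel):
    """Time of closest approach (s) and miss distance (m) for straight-line motion."""
    v2 = sum(v * v for v in rel_vel)
    if v2 == 0.0:
        return 0.0, math.dist(rel_pos, (0.0, 0.0, 0.0))
    t_cpa = -sum(p * v for p, v in zip(rel_pos, rel_vel)) / v2
    closest = [p + v * t_cpa for p, v in zip(rel_pos, rel_vel)]
    return t_cpa, math.dist(closest, (0.0, 0.0, 0.0))

def may_be_threat(rel_pos, rel_vel, look_ahead_s=120.0, protect_radius_m=1500.0):
    """Keep the target only if it can come within the protection radius in time."""
    t_cpa, d_cpa = cpa(rel_pos, rel_vel)
    if 0.0 <= t_cpa <= look_ahead_s:
        return d_cpa < protect_radius_m
    # Closest approach outside the window: check current and end-of-window distances.
    end = [p + v * look_ahead_s for p, v in zip(rel_pos, rel_vel)]
    return min(math.dist(rel_pos, (0.0, 0.0, 0.0)),
               math.dist(end, (0.0, 0.0, 0.0))) < protect_radius_m

if __name__ == "__main__":
    print(may_be_threat(rel_pos=(8000.0, 3000.0, 300.0), rel_vel=(-70.0, -25.0, 0.0)))
```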

Keywords: traffic situation awareness, general aviation, aircraft conflict detection, computational load reduction, implementation solutions, software certification

Procedia PDF Downloads 277
24838 Characteristics of Sorghum (Sorghum bicolor L. Moench) Flour on the Soaking Time of Peeled Grains and Particle Size Treatment

Authors: Sri Satya Antarlina, Elok Zubaidah, Teti Istiana, Harijono

Abstract:

Sorghum (Sorghum bicolor L. Moench) has potential as a flour for gluten-free food products. Sorghum flour production requires a grain soaking treatment. Soaking can reduce the tannin content, which is an anti-nutrient, and so can increase protein digestibility. A fine particle size decreases the yield of flour, so it is necessary to study various particle sizes to increase the yield. This study aims to determine the characteristics of sorghum flour under different peeled-grain soaking times and particle size treatments. The material was the white sorghum variety KD-4 from farmers in East Java, Indonesia. A factorial randomized design (two factors) with three replications was used: factor I was the grain soaking time (five levels: 0, 12, 24, 36, and 48 hours), and factor II was the particle size of the flour, sifted at fineness levels of 40, 60, 80, and 100 mesh. The method of making sorghum flour consisted of grain peeling, soaking of the peeled grain, oven drying at 60°C, milling, and sieving. Physico-chemical analyses of the sorghum flour were carried out. The results show that there is an interaction between the grain soaking time and the sorghum flour particle size. The interaction affects the flour yield, L* color (brightness level), whiteness index, paste properties, amylose content, protein content, bulk density, and protein digestibility. The method of making sorghum flour through soaking of the peeled grain and the difference in particle size plays an important role in producing the specific physicochemical properties of the flour. Based on the characteristics of the sorghum flour produced, the recommended method is grain soaking for 24 hours with a flour particle size of 80 mesh. This sorghum flour had the following characteristics: 24.88% flour yield, 88.60 L* color (brightness level), 69.95 whiteness index, 3615 cP viscosity, 584.10 g/l bulk density, 24.27% db protein digestibility, 90.02% db starch content, 23.4% db amylose content, 67.45% db amylopectin content, 0.22% db crude fiber content, 0.037% db tannin content, 5.30% db protein content, 0.18% db ash content, 92.88% db carbohydrate content, and 1.94% db fat content. The sorghum flour is recommended for cookie products.

Keywords: characteristic, sorghum (Sorghum bicolor L. Moench) flour, grain soaking, particle size, physicochemical properties

Procedia PDF Downloads 155
24837 STML: Service Type-Checking Markup Language for Services of Web Components

Authors: Saqib Rasool, Adnan N. Mian

Abstract:

Web components are introduced as the latest standard of HTML5 for writing modular web interfaces, ensuring maintainability through the isolated scope of web components. Reusability can also be achieved by sharing plug-and-play web components that can be used as off-the-shelf components by other developers. A web component encapsulates all the required HTML, CSS and JavaScript code as a standalone package, which must be imported for integrating the web component within an existing web interface. This is then followed by the integration of the web component with web services for dynamically populating its content. Since web components are reusable as off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service's behavior can be verified through type checking. This is one of the popular solutions for improving the quality of code in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is to introduce a new extension of HTML called Service Type-checking Markup Language (STML) for adding support for type checking in HTML for JSON-based REST services. STML can be used for defining the expected data types of responses from JSON-based REST services, which will be used for populating the content within the HTML elements of a web component. Although JSON has five data types, viz. string, number, boolean, object and array, STML is made to support only string, number and boolean. This is because both object and array are treated as strings when populated in HTML elements. In order to define the data type of any HTML element, the developer just needs to add the custom STML attributes st-string, st-number and st-boolean for string, number and boolean, respectively. All these STML annotations are used by the developer who is writing a web component, and they enable other developers to use automated type checking for ensuring the proper integration of their REST services with the same web component. Two utilities have been written for developers who are using STML-based web components. One of these utilities is used for automated type checking during the development phase. It uses the browser console to show an error description if the integrated web service does not return a response with the expected data type. The other utility is a Gulp-based command-line utility for removing the STML attributes before going to production. This ensures the delivery of STML-free web pages in the production environment. Both of these utilities have been tested to perform type checking of REST services through STML-based web components, and the results have confirmed the feasibility of evaluating service behavior through HTML only. Currently, STML is designed for automated type checking of integrated REST services, but it can be extended to introduce a complete service testing suite based on HTML only, which would transform STML from a Service Type-checking Markup Language to a Service Testing Markup Language.
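
The type-checking idea can be illustrated independently of HTML with a small validator that compares a JSON response against the types declared per element, in the spirit of the st-string, st-number and st-boolean attributes. This is an illustrative re-implementation, not the STML utilities themselves; the element-to-field mapping shown is hypothetical.

```python
import json

# Illustrative validator in the spirit of STML's st-string / st-number / st-boolean
# attributes: each bound element declares the JSON type it expects, and the
# response is checked before the element is populated. Field names are hypothetical.

EXPECTED = {                     # element id -> (JSON field, declared STML type)
    "user-name": ("name",   "st-string"),
    "user-age":  ("age",    "st-number"),
    "is-active": ("active", "st-boolean"),
}

PY_TYPES = {"st-string": str, "st-number": (int, float), "st-boolean": bool}

def type_check(response_json: str) -> list:
    """Return a list of error descriptions (an empty list means the service conforms)."""
    data, errors = json.loads(response_json), []
    for element, (field, stml_type) in EXPECTED.items():
        value = data.get(field)
        # bool is a subclass of int in Python, so exclude it from st-number checks
        if stml_type == "st-number" and isinstance(value, bool):
            ok = False
        else:
            ok = isinstance(value, PY_TYPES[stml_type])
        if not ok:
            errors.append(f"<{element}>: field '{field}' is not {stml_type}")
    return errors

if __name__ == "__main__":
    print(type_check('{"name": "Ada", "age": "31", "active": true}'))
```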

Keywords: REST, STML, type checking, web component

Procedia PDF Downloads 249
24836 Analysis of Autonomous Orbit Determination for Lagrangian Navigation Constellation with Different Dynamical Models

Authors: Gao Youtao, Zhao Tanran, Jin Bingyu, Xu Bo

Abstract:

Global navigation satellite systems (GNSS) can deliver navigation information for spacecraft orbiting in low Earth orbits and medium Earth orbits. However, GNSS cannot effectively navigate spacecraft in high Earth orbits or deep space probes. With deep space exploration becoming a hot spot of aerospace, the demand for a deep space satellite navigation system is becoming increasingly prominent. Many researchers have discussed the feasibility and performance of a satellite navigation system on periodic orbits around the Earth-Moon libration points, which can be called a Lagrangian point satellite navigation system. Autonomous orbit determination (AOD) is an important capability for a Lagrangian point satellite navigation system. With this ability, the Lagrangian point satellite navigation system can reduce its dependency on ground stations. AOD can also greatly reduce the total system cost and assure mission continuity. As the elliptical restricted three-body problem can describe the Earth-Moon system more accurately than the circular restricted three-body problem, we study the autonomous orbit determination of a Lagrangian navigation constellation using only crosslink ranges, based on the elliptical restricted three-body problem. An extended Kalman filter is used in the autonomous orbit determination. In order to compare the autonomous orbit determination results based on the elliptical restricted three-body problem to those based on the circular restricted three-body problem, we give the autonomous orbit determination position errors of a navigation constellation of four satellites based on the circular restricted three-body problem. The simulation results show that the Lagrangian navigation constellation can achieve long-term precise autonomous orbit determination using only crosslink ranges. In addition, the type of libration point orbit will influence the autonomous orbit determination accuracy.
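
As a reminder of the estimator structure, an extended Kalman filter driven only by crosslink-range measurements alternates a dynamics-based prediction with a range-measurement update whose Jacobian is the unit line-of-sight vector between the two satellites. The sketch below shows that update step with a placeholder prediction; it is not the elliptical restricted three-body implementation of the paper, and all numbers are hypothetical.

```python
import numpy as np

# Skeleton of an EKF measurement update for a crosslink range between two
# satellites. The state here is the position/velocity of satellite A only, and the
# prediction step is a generic placeholder; the paper propagates the full
# constellation under the elliptical restricted three-body problem.

def predict(x, P, F, Q):
    """Generic EKF prediction with a (placeholder) state-transition matrix F."""
    return F @ x, F @ P @ F.T + Q

def range_update(x, P, pos_other, z_range, sigma_range):
    """Update satellite A's state with one crosslink-range measurement."""
    rel = x[:3] - pos_other
    rho = np.linalg.norm(rel)                  # predicted range
    H = np.zeros((1, x.size))
    H[0, :3] = rel / rho                       # Jacobian: unit line-of-sight vector
    R = np.array([[sigma_range ** 2]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x + (K @ np.array([z_range - rho])).ravel()
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new

if __name__ == "__main__":
    x = np.array([7000e3, 100e3, 0.0, 0.0, 1.0e3, 0.0])   # hypothetical state [m, m/s]
    P = np.diag([1e6, 1e6, 1e6, 1.0, 1.0, 1.0])
    x, P = range_update(x, P, pos_other=np.array([6800e3, -50e3, 10e3]),
                        z_range=252e3, sigma_range=10.0)
    print(np.round(x[:3] / 1e3, 2), "km")
```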

Keywords: extended Kalman filter, autonomous orbit determination, quasi-periodic orbit, navigation constellation

Procedia PDF Downloads 278
24835 In the Eye of the Beholder: Customer Experience Journey with Airbnb

Authors: Nisreen N. Bahnan

Abstract:

This exploratory research is designed to inform the design of a Customer Journey Map for the vacation rental platform Airbnb. Through the collection of exploratory survey data regarding consumer experience with the brand, the key customer touchpoints during each consumption stage were identified. The paper maps a customer journey and corresponding concrete efforts to enhance the customer experience with the brand at each important touchpoint. Strategic initiatives and service innovation strategies for each touchpoint are proposed. Further research, in collaboration with Airbnb management, hosts and guests, is required to develop more expansive recommendations for enhancing the Airbnb customer experience at each of these touchpoints.

Keywords: Airbnb, customer experience, customer journey map, service touchpoints

Procedia PDF Downloads 15
24834 Growth Performance, Haematology and Serum Biochemistry of Broilers Fed Graded Levels of Cocoyam (Xanthosoma sagittifolium)

Authors: Urom Scholastica Mgbo, Ifeanyichukwu, Vivian, Anaba, Uchemadu Martins, Arusiaba, Nelson Chijioke

Abstract:

The study was conducted to determine the growth performance, haematological and serum biochemistry indices of broilers fed graded levels of cocoyam (Xanthosoma sagittifolium). One hundred and twenty (120) day-old broiler chicks of the Anak strain were used for the study. The birds were randomly divided into 4 treatment groups of 30 birds per group, and each group was further divided into 3 replicates of 10 birds per replicate. Cooked cocoyam was used to formulate diets at an inclusion level of 0.00% for T1 (control), while T2, T3 and T4 contained 10.00%, 20.00% and 30.00% inclusion of cocoyam in partial replacement of maize, in a Completely Randomized Design (CRD). At the end of the research, the haematological indices of the broilers showed that the packed cell volume (PCV) of birds fed diets 1 (42.26%) and 3 (42.42%) was significantly (p<0.05) higher than that of birds fed diets 2 (39.72%) and 4 (38.78%). The haemoglobin (Hb) of birds fed diets 3 (12.58 g/dl) and 4 (12.26 g/dl) was significantly (p<0.05) higher than that of birds fed diets 1 (11.60 g/dl) and 2 (11.42 g/dl). The white blood cell (WBC) values of the broiler chickens placed on the cocoyam diets increased significantly (P<0.05) compared with the values obtained in the control (T1). The serum protein value for birds fed diet 1 (5.45 g/dl) was statistically (P>0.05) similar to those fed diets 2 (5.10 g/dl) and 3 (5.38 g/dl) but differed significantly (P<0.05) from diet 4 (4.97 g/dl), which had the least protein value. The final weight of the birds showed that diet 4 (2370.85 g) had the highest (P<0.05) value, followed closely by diet 3 (2225.55 g), while birds fed diet 1 (2165.70 g) and diet 2 (2145.00 g) recorded the least values. A similar pattern was observed in the weight gain of the birds: birds fed diet 4 (2270.30 g) had the higher (P<0.05) value, followed by birds on diet 3 (2125.45 g), while birds fed diets 1 (2065.15 g) and 2 (2044.90 g) had the least values. This study showed that birds fed diet 3 (50.60 g) and diet 4 (54.05 g) gave significantly (P<0.05) higher weight than the control diet (49.17 g). There was a significant (P<0.05) difference among the treatments for feed conversion ratio (FCR), where birds fed diet 4 (1.74) performed best, having the lowest feed conversion ratio. The economics of broiler production showed that the cost/kg of feed favoured diet 4 (₦158.65), followed by diets 3 (₦165.95), 2 (₦178.52) and the control diet 1 (₦197.14). From the results, the higher weight recorded in T4 showed that cocoyam meal can successfully replace maize up to 30% in the diet of broiler chickens. The lower cost recorded for the cocoyam-based diets showed that these diets were more economical and beneficial compared to the control diet 1. Therefore, feeding diet 4 (30% cocoyam meal) as a replacement of maize in broiler chickens is recommended.
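
For readers outside animal nutrition, the two performance metrics quoted above are simple ratios; the sketch below shows the arithmetic with the reported weight gain and FCR for diet 4 (the feed intake is back-calculated and therefore only approximate, not a value reported in the study).

```python
# Feed conversion ratio (FCR) = feed consumed / weight gained (lower is better).
# Using the reported diet 4 figures: weight gain 2270.30 g and FCR 1.74,
# the implied feed intake is back-calculated here; it is an approximation only.

weight_gain_g = 2270.30
fcr = 1.74
feed_intake_g = fcr * weight_gain_g
print(f"implied feed intake per bird ~ {feed_intake_g / 1000:.2f} kg")

# Feed cost per kg of gain combines FCR with the reported cost of feed per kg.
cost_per_kg_feed_naira = 158.65          # diet 4, as reported
cost_per_kg_gain = fcr * cost_per_kg_feed_naira
print(f"feed cost per kg of gain ~ N{cost_per_kg_gain:.0f}")
```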

Keywords: cocoyam, growth, haematology, serum biochemistry

Procedia PDF Downloads 108
24833 Impacts of Urban Morphologies on Air Pollutants Dispersion in Porto's Urban Area

Authors: Sandra Rafael, Bruno Vicente, Vera Rodrigues, Carlos Borrego, Myriam Lopes

Abstract:

Air pollution is an environmental and social issue at different spatial scales, especially in a climate change context, with an expected decrease of air quality. Air pollution results from a combination of high emissions and unfavourable weather conditions, where wind speed and wind direction play a key role. The urban design (location and structure of buildings and trees) can either promote the dispersion of air pollutants or promote their retention within the urban area. Today, most urban areas are applying measures to adapt to future extreme climatic events. Most of these measures are grounded in nature-based solutions, namely green roofs and green areas. In this sense, studies are required to evaluate how the implementation of these actions will influence the wind flow within the urban area and, consequently, how this will influence air pollutant dispersion. The main goal of this study was to evaluate the influence of a set of urban morphologies on the wind conditions and on the dispersion of air pollutants in a built-up area in Portugal. For that, two pollutants were analysed (NOx and PM10) and four scenarios were developed: i) a baseline scenario, which characterizes the current status of the study area; ii) an urban green scenario, which implies the implementation of a green area inside the domain; iii) a green roof scenario, which consists of the implementation of green roofs in a specific area of the domain; and iv) a 'grey' scenario, with an absence of vegetation. Two models were used, namely the Weather Research and Forecasting model (WRF) and the CFD model VADIS (pollutant dispersion in the atmosphere under variable wind conditions). The WRF model was used to initialize the CFD model, while the latter was used to perform the set of numerical simulations on an hourly basis. The implementation of the green urban area promoted a reduction of air pollutant concentrations, 16% on average, related to the increase in the wind flow, which promotes air pollutant dispersion; the application of green roofs, in contrast, showed an increase of concentrations (reaching 60% during specific time periods). Overall, the results showed that a strategic placement of vegetation in cities has the potential to make an important contribution to increasing air pollutant dispersion and so promote the improvement of air quality and the sustainability of urban environments.

Keywords: air pollutants dispersion, wind conditions, urban morphologies, road traffic emissions

Procedia PDF Downloads 343
24832 Hot Carrier Photocurrent as a Candidate for an Intrinsic Loss in a Single Junction Solar Cell

Authors: Jonas Gradauskas, Oleksandr Masalskyi, Ihor Zharchenko

Abstract:

The advancement in improving the efficiency of conventional solar cells toward the Shockley-Queisser limit seems to be slowing down or reaching a point of saturation. The challenges hindering the reduction of this efficiency gap can be categorized into extrinsic and intrinsic losses, with the former being theoretically avoidable. Among the five intrinsic losses, two of them, the below-Eg loss (resulting from the non-absorption of photons with energy below the semiconductor bandgap) and the thermalization loss, contribute to approximately 55% of the overall lost fraction of solar radiation at energy bandgap values corresponding to silicon and gallium arsenide. Efforts to minimize the disparity between theoretically predicted and experimentally achieved efficiencies in solar cells necessitate the integration of innovative physical concepts. Hot carriers (HC) present a contemporary approach to addressing this challenge. The significance of hot carriers in photovoltaics is not fully understood. Although their excess energy is thought to indirectly impact a cell's performance through thermalization loss (the excess energy heats the lattice, leading to efficiency loss), evidence suggests the presence of hot carriers in solar cells. Despite their exceptionally brief lifespan, tangible benefits arise from their existence. The study highlights direct experimental evidence of the hot carrier effect induced by both below- and above-bandgap radiation in a single-junction solar cell. The photocurrent flowing across silicon and GaAs p-n junctions is analyzed. The photoresponse consists, on the whole, of three components caused by electron-hole pair generation, hot carriers, and lattice heating. The last two components counteract the conventional electron-hole-generation current required for successful solar cell operation. Also, a model based on the temperature coefficient of the voltage change of the current–voltage characteristic is used to obtain the hot carrier temperature. The distribution of cold and hot carriers is analyzed with regard to the potential barrier height of the p-n junction. These findings contribute to a better understanding of hot carrier phenomena in photovoltaic devices and are likely to prompt a reevaluation of intrinsic losses in solar cells.
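
The temperature-coefficient method mentioned for extracting the hot-carrier temperature can be read as a one-line estimate: if the junction voltage shifts by ΔU under illumination and the I–V characteristic shifts by dU/dT per kelvin of heating, then ΔT ≈ ΔU / (dU/dT). The sketch below is only a schematic reading of that idea with hypothetical numbers, not the authors' calibrated model.

```python
# Schematic estimate of carrier heating from the temperature coefficient of the
# voltage across the p-n junction: delta_T ~ delta_U / (dU/dT).
# Numbers are hypothetical illustrations, not measured values from the paper.

delta_u_volts = 2.0e-3        # assumed hot-carrier-induced voltage change, V
du_dt_volts_per_k = 2.0e-3    # assumed temperature coefficient magnitude, V/K
lattice_temperature_k = 300.0

delta_t = delta_u_volts / du_dt_volts_per_k
print(f"carrier heating ~ {delta_t:.1f} K above the lattice "
      f"(~{lattice_temperature_k + delta_t:.0f} K)")
```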

Keywords: solar cell, hot carriers, intrinsic losses, efficiency, photocurrent

Procedia PDF Downloads 61
24831 Usability in E-Commerce Websites: Results of Eye Tracking Evaluations

Authors: Beste Kaysı, Yasemin Topaloğlu

Abstract:

Usability is one of the most important quality attributes for web-based information systems. For e-commerce applications specifically, usability becomes even more prominent. In this study, we aimed to explore the features that experienced users seek in e-commerce applications. We used the eye tracking method in the evaluations. Eye movement data were obtained with the eye-tracking method and analyzed based on task completion time and number of fixations, as well as heat map and gaze plot measures. The results of the analysis show that the participants' eye movements are too static in certain areas and that their areas of interest are scattered across many different places. It has been determined that this causes users to fail to complete their transactions. According to the findings, we outlined the issues affecting the usability of e-commerce websites and then propose solutions to the identified issues. In this way, it is expected that e-commerce sites will be developed that make experienced users more satisfied.

Keywords: e-commerce websites, eye tracking method, usability, website evaluations

Procedia PDF Downloads 178
24830 Practical Application of Simulation of Business Processes

Authors: Markéta Gregušová, Vladimíra Schindlerová, Ivana Šajdlerová, Petr Mohyla, Jan Kedroň

Abstract:

Company managers are always looking for more and more opportunities to succeed in today's fiercely competitive market. Maintaining a place among the successful companies on the market today, or coming up with a revolutionary business idea, is much more difficult than before. Each new or improved method, tool, or approach that can improve the functioning of business processes or even of the entire system is worth checking and verifying. The use of simulation in the design of manufacturing systems and their management in practice is one way, without increased risk, to find the optimal parameters of manufacturing processes and systems. The paper presents an example of the use of simulation for the solution of a bottleneck problem in a specific company.

Keywords: practical applications, business processes, systems, simulation

Procedia PDF Downloads 539
24829 Voices of Dissent: Case Study of a Digital Archive of Testimonies of Political Oppression

Authors: Andrea Scapolo, Zaya Rustamova, Arturo Matute Castro

Abstract:

The “Voices in Dissent” initiative aims at collecting and making available in digital format testimonies, letters, and other narratives produced by victims of political oppression from different geographical spaces across the Atlantic. By recovering silenced voices behind the official narratives, this open-access online database will provide indispensable tools for rewriting the history of authoritarian regimes from the margins as memory debates continue to provoke controversy among academic and popular transnational circles. In providing an extensive database of non-hegemonic discourses in a variety of political and social contexts, the project will complement existing European and Latin-American studies and invite further interdisciplinary and trans-national research. This digital resource will be available to academic communities and the general audience and will be organized geographically and chronologically. “Voices in Dissent” will offer a first comprehensive study of these personal accounts of persecution and repression against their specific historical backgrounds and of their impact on collective memory formation in contemporary societies. The digitalization of these texts will allow us to run metadata analyses and adopt comparatist approaches for a broad range of research endeavors. Most of the testimonies included in our archive are testimonies of trauma: the trauma of exile, imprisonment, torture, humiliation, censorship. Research on trauma has now reached critical mass and offers a broad spectrum of critical perspectives. By putting together testimonies from different geographical and historical contexts, our project will provide readers and scholars with an extraordinary opportunity to investigate how culture shapes individual and collective memories and provides or denies resources to make sense of and cope with trauma. For scholars dealing with the epistemological and rhetorical analysis of testimonies, an online open-access archive will prove particularly beneficial for testing theories on truth status and the formation of belief, as well as for studying the articulation of discourse. An important aspect of this project is also its pedagogical applications, since it will contribute to the creation of Open Educational Resources (OER) to support students and educators worldwide. Through collaborations with our Library System, the archive will form part of the Digital Commons database. The texts collected in this online archive will be made available in the original languages as well as in English translation. They will be accompanied by a critical apparatus that will contextualize them historically by providing relevant background information and bibliographical references. All these materials can serve as a springboard for a broad variety of educational projects and classroom activities. They can also be used to design specific content courses or modules. In conclusion, the desirable outcomes of the “Voices in Dissent” project are: 1. the collection and digitalization of testimonies of political dissent; 2. the building of a network of scholars, educators, and learners involved in the design, development, and sustainability of the digital archive; 3. the integration of the content of the archive in both research and teaching endeavors, such as the publication of scholarly articles, the design of new upper-level courses, and the integration of the materials in existing courses.

Keywords: digital archive, dissent, open educational resources, testimonies, transatlantic studies

Procedia PDF Downloads 102
24828 A Machine Learning-Assisted Crime and Threat Intelligence Hunter

Authors: Mohammad Shameel, Peter K. K. Loh, James H. Ng

Abstract:

Cybercrime is a new category of crime which poses a different challenge for crime investigators and incident responders. Attackers can mask their identities using a suite of tools and with the help of the deep web, which makes them difficult to track down. Scouring the deep web manually takes time and is inefficient. There is a growing need for a tool to scour the deep web to obtain useful evidence or intel automatically. In this paper, we will explain the background and motivation behind the research, present a survey of existing research on related tools, describe the design of our own crime/threat intelligence hunting tool prototype, demonstrate its capability with some test cases and lastly, conclude with proposals for future enhancements.

Keywords: cybercrime, deep web, threat intelligence, web crawler

Procedia PDF Downloads 169
24827 Navigating through Organizational Change: TAM-Based Manual for Digital Skills and Safety Transitions

Authors: Margarida Porfírio Tomás, Paula Pereira, José Palma Oliveira

Abstract:

Robotic grasping is advancing rapidly, but transferring techniques from rigid to deformable objects remains a challenge. Deformable and flexible items, such as food containers, demand nuanced handling due to their changing shapes. Bridging this gap is crucial for applications in food processing, surgical robotics, and household assistance. AGILEHAND, a Horizon project, focuses on developing advanced technologies for sorting, handling, and packaging soft and deformable products autonomously. These technologies serve as strategic tools to enhance flexibility, agility, and reconfigurability within the production and logistics systems of European manufacturing companies. Key components include intelligent detection, self-adaptive handling, efficient sorting, and agile, rapid reconfiguration. The overarching goal is to optimize work environments and equipment, ensuring both efficiency and safety. As new technologies emerge in the food industry, there will be implications for the labour force, for safety, and for the acceptance of the new technologies. To address these implications, AGILEHAND emphasizes the integration of social sciences and humanities, for example through the application of the Technology Acceptance Model (TAM). The project aims to create a change management manual that will outline strategies for developing digital skills and managing health and safety transitions. It will also provide best practices and models for organizational change. Additionally, AGILEHAND will design effective training programs to enhance employee skills and knowledge. This information will be obtained through a combination of case studies, structured interviews, questionnaires, and a comprehensive literature review. The project will explore how organizations adapt during periods of change and identify factors influencing employee motivation and job satisfaction. This project received funding from the European Union’s Horizon 2020/Horizon Europe research and innovation program under grant agreement No101092043 (AGILEHAND).

Keywords: change management, technology acceptance model, organizational change, health and safety

Procedia PDF Downloads 39
24826 Economic Empowerment before Political Participation: Peacebuilding from the Perspective of Women Activists in the Post-Yugoslav Area

Authors: Emilie Fort

Abstract:

Two major pitfalls emerge at the intersection of the gender and peacebuilding literature: the conception of women as a homogeneous category and a focus on women's participation in formal peace processes and state structures. However, women belong (and identify) to distinct ethnic, religious, or social groups, and the variety of their social locations impacts their ability to mobilize, their participation in peace processes, and the way they envision peace. This study is based on interviews conducted (remotely) with women activists from the post-Yugoslav area. It shows that women's economic empowerment and education are central issues that must be addressed for women's political participation to be effective. This has implications for peace projects (their priorities, scales of implementation, etc.) and for the allocation of civil society's funds.

Keywords: ex-Yugoslavia, gender-based issues, peacebuilding, women activism

Procedia PDF Downloads 190
24825 What Constitutes Pre-School Mathematics and What Does It Look Like in the Classroom?

Authors: Chako G. Chako

Abstract:

This study reports on ongoing research that explores pre-school mathematics. Participants in the study include three pre-school teachers and their pre-school learners from one school in Gaborone. The school was purposefully selected based on its performance in Botswana's 2019 national examinations. Specifically, the study is interested in teachers' explanations of mathematics concepts embedded in pre-school mathematics tasks. The interest in explanations was informed by the view that the mathematics learners get to learn resides in teachers' explanations. Recently, Botswana's basic education system has integrated pre-school education into mainstream public primary school education. This move is part of the government's drive to elevate Botswana to a knowledge-based economy. It is believed that the provision of pre-school education to all Batswana children will contribute immensely towards a knowledge-based economy. Since pre-school is a new phenomenon in our education system, there is limited research at this level of education in Botswana. In particular, there is limited knowledge about what is taught and how the teaching is conducted in pre-schools in Botswana. Hence, the study seeks to gain insight into what constitutes mathematics in the tasks that learners are given, and how concepts are made accessible to pre-school learners. The research question of interest for this study is stated as: what is the nature of pre-school teachers' explanations of mathematics concepts embedded in tasks given to learners? Casting some light onto what pre-school mathematics tasks are and how they are enacted is critical for policy and pre-school teacher professional development. The sociocultural perspective framed the research. Adler and Ronda's (2014) notions of exemplification and explanatory communication are used to analyze the tasks given to learners and the teachers' explanations, respectively.

Keywords: classroom, explanation, mathematics, pre-school, tasks

Procedia PDF Downloads 154
24824 A Review of Travel Data Collection Methods

Authors: Muhammad Awais Shafique, Eiji Hato

Abstract:

Household trip data is of crucial importance for managing present transportation infrastructure as well as for planning and designing future facilities. It also provides a basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed with the passage of time, starting with conventional face-to-face or paper-and-pencil interviews and moving towards the recent approach of employing smartphones. This study summarizes this step-wise evolution in travel data collection methods and provides a comprehensive review of the topic for readers interested in the changing trends in the data collection field.

Keywords: computer, smartphone, telephone, travel survey

Procedia PDF Downloads 307
24823 COSMO-RS Prediction for Choline Chloride/Urea Based Deep Eutectic Solvent: Chemical Structure and Application as Agent for Natural Gas Dehydration

Authors: Tayeb Aissaoui, Inas M. AlNashef

Abstract:

In recent years, green solvents named deep eutectic solvents (DESs) have been found to possess significant properties and to be applicable in several technologies. Choline chloride (ChCl) mixed with urea in a 1:2 ratio at 80 °C was the first DES to be discovered. In this article, the chemical structure and combination mechanism of the ChCl:urea based DES were investigated. Moreover, the implementation of this DES for water removal from natural gas is reported. Dehydration of natural gas by ChCl:urea shows significant absorption efficiency compared to triethylene glycol. All of the above calculations were carried out with the COSMOthermX software. This article confirms the potential application of DESs in the gas industry.

Keywords: COSMO-RS, deep eutectic solvents, dehydration, natural gas, structure, organic salt

Procedia PDF Downloads 288
24822 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach

Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika

Abstract:

Both society and education emphasize good communication as a means of building up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or with misunderstanding. Misunderstanding produces language errors when people interact for the first time and do not yet know each other because of the distance between them. The movie “The Space between Us” tells a love-adventure story between a boy from Mars and a girl from Earth. Many of their conversations break down because of their different climates and environments. Moviegoers must also focus on the subtitles in order to enjoy the movie fully. Furthermore, the Indonesian subtitles and the English dialogue in the movie still show overlapping understanding in the translation. The translation here consists of the source language (SL, the English dialogue) and the target language (TL, the Indonesian subtitles). This research gap is formulated in the research question of how language errors occur in the movie and how they affect translation quality, which is analysed in depth through a translation study with a discourse analysis approach. The research goal is to map the language errors and their translation quality in order to create a good atmosphere in movie media. The study uses an embedded research design within a qualitative approach. The research locus comprises the setting, participants, and events as its determined boundary. The data sources are the movie “The Space between Us” and informants (translation quality raters). The sampling is criterion-based (purposive) sampling. Data collection techniques consist of content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis comprises domain, taxonomic, componential, and cultural theme analyses. The findings show that the language errors in the movie are referential, register, society, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors. The discussion of their effects on translation quality concentrates on the translation techniques applied to these findings: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.

Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments

Procedia PDF Downloads 217
24821 Focus Group Study Exploring Researchers’ Perspectives on Open Science Policy

Authors: E. T. Svahn

Abstract:

Knowledge about the factors that influence the exchange between research and society is of the utmost importance for developing collaboration between different actors, especially in future science policy development and the creation of support structures for researchers. This includes how researchers view the surrounding open science policy environment and what conditions and attitudes they have for interacting with it. This paper examines Finnish researchers' attitudes towards open science policies in 2020. Open science is an integrated part of researchers' daily lives and supports not only the effectiveness of research outputs but also the quality of research. In the ideal situation, open science policy is seen as a supporting structure that enables the exchange between research and society, but in other situations it can end up as red tape that generates obstacles and hinders the efficient conduct of science. The data for this study were collected through focus group interviews. This qualitative method was selected because it aims at understanding the phenomenon under study. In addition, focus group interviews produce diverse and rich material that would not be available with other research methods. Focus group interviews have well-established applications in social science, especially for understanding the perspectives and experiences of research subjects. In this study, focus groups were used to study the mindset and actions of researchers. Each group consisted of 4 to 10 people, and the aim was to bring out different perspectives on the subject. The interviewer enabled the presentation of different perceptions and opinions, and the focus group interviews were recorded and transcribed. The material was analysed using the grounded theory method. The results are presented as thematic areas, a theoretical model, and direct quotations. Attitudes towards open science policy can vary greatly depending on the research area. This study shows that open science policy demands differ somewhat between medicine, technology, and the natural sciences on the one hand and the social sciences, educational sciences, and the humanities on the other. The variation in attitudes between research areas can thus be largely explained by the fact that research outputs and ethical codes vary significantly between subjects. This study aims to increase understanding of the extent to which open science policies should be tailored to different disciplines and research areas.

Keywords: focus group interview, grounded theory, open science policy, science policy

Procedia PDF Downloads 152