Search results for: cloud service models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10572

9312 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, including "reskilling and upskilling" employees and establishing robust data management training programs, also plays an integral role in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), artificial intelligence (AI), and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. The present study concludes that by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 71
9311 Predict Suspended Sediment Concentration Using Artificial Neural Networks Technique: Case Study Oued El Abiod Watershed, Algeria

Authors: Adel Bougamouza, Boualam Remini, Abd El Hadi Ammari, Feteh Sakhraoui

Abstract:

The assessment of sediments being carried by a river is important for the planning and design of various water resources projects. In this study, artificial neural network techniques are used to estimate the daily suspended sediment concentration from the corresponding daily discharge flow upstream of the Foum El Gherza dam, Biskra, Algeria. FFNN, GRNN, and RBNN models are established for estimating current suspended sediment values. Statistics including RMSE and R2 were used to evaluate the performance of the applied models. The comparison of the three AI models showed that the RBNN model performed better than the FFNN and GRNN models, with R2 = 0.967 and RMSE = 5.313 mg/l. The ANN approach was therefore capable of modeling the nonlinear relationship between discharge flow and suspended sediment with reasonable precision.
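The radial basis network (RBNN) that performed best above can be illustrated with a minimal sketch. This is not the authors' implementation: it uses an exact-interpolation training scheme (one Gaussian centre per training sample), a hypothetical kernel width, and synthetic discharge/sediment pairs purely for illustration.

```python
import math

def rbf_train(xs, ys, sigma=1.0):
    """Exact-interpolation RBF network: one Gaussian centre per training point."""
    n = len(xs)
    # Kernel matrix: phi[i][j] = exp(-(x_i - c_j)^2 / (2 * sigma^2))
    phi = [[math.exp(-((xs[i] - xs[j]) ** 2) / (2 * sigma ** 2)) for j in range(n)]
           for i in range(n)]
    # Solve phi @ w = ys by Gaussian elimination with partial pivoting.
    a = [row[:] + [ys[i]] for i, row in enumerate(phi)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (a[r][n] - sum(a[r][c] * w[c] for c in range(r + 1, n))) / a[r][r]
    return w

def rbf_predict(x, centres, w, sigma=1.0):
    """Weighted sum of Gaussian basis functions."""
    return sum(w[j] * math.exp(-((x - centres[j]) ** 2) / (2 * sigma ** 2))
               for j in range(len(centres)))
```

With this exact-interpolation variant, predictions at the training discharges reproduce the training concentrations; a practical model would use fewer centres and regularisation.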

Keywords: artificial neural network, Oued Abiod watershed, feedforward network, generalized regression network, radial basis network, sediment concentration

Procedia PDF Downloads 420
9310 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving data analysis technologies play a large role in understanding the operation of modern healthcare systems and their characteristics. One of the key tasks in urban healthcare today is optimizing resource allocation; the application of data analysis in medical institutions to solve optimization problems therefore determines the significance of this study. The purpose of this research was to establish the dependence between indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because its databases contain the complete, open information necessary for research tasks in the field of public health; in addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study, with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration, to identify groups of similar countries, and to construct separate regression models for them.
Therefore, the original time series were used as the objects of clustering, and the k-medoids clustering algorithm was applied. Sampled objects served as the centers of the resulting clusters, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis it was possible to significantly improve the predictive power of the models: in one of the clusters, for example, the MAPE error was only 0.82%, which makes it possible to conclude that the forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the provision of the hospital with medical personnel. The research reveals strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Data analysis currently has huge potential to significantly improve health services, and medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
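The clustering-plus-error-metric workflow described above can be sketched as follows. This is an illustrative toy, not the study's code: it uses an exhaustive k-medoids search (feasible only for a handful of series), Euclidean distance between the raw time series, and the usual MAPE definition; the series values are invented.

```python
import itertools
import math

def dist(a, b):
    """Euclidean distance between two equal-length time series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def k_medoids(series, k):
    """Exhaustive k-medoids: choose the k series that minimise the total
    distance from every series to its nearest medoid."""
    best, best_cost = None, float("inf")
    for medoids in itertools.combinations(range(len(series)), k):
        cost = sum(min(dist(s, series[m]) for m in medoids) for s in series)
        if cost < best_cost:
            best, best_cost = medoids, cost
    # Label each series with the index of its nearest medoid.
    labels = [min(best, key=lambda m: dist(s, series[m])) for s in series]
    return best, labels

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)
```

For a real panel of country time series, a PAM-style iterative search would replace the exhaustive enumeration.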

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 145
9309 Hybrid Velocity Control Approach for Tethered Aerial Vehicle

Authors: Lovesh Goyal, Pushkar Dave, Prajyot Jadhav, Gonna Yaswanth, Sakshi Giri, Sahil Dharme, Rushika Joshi, Rishabh Verma, Shital Chiddarwar

Abstract:

With the rising need for human-robot interaction, researchers have proposed and tested multiple models with varying degrees of success. A few of these models operate on aerial platforms and are commonly known as tethered aerial systems. These aerial vehicles can be powered continuously through a tether cable, which addresses the predicament of the short battery life of quadcopters. Such systems find applications in industrial, medical, agricultural, and service settings, minimizing human effort. However, a significant challenge in employing them is that it necessitates attaining smooth and secure robot-human interaction while ensuring that the forces from the tether remain within the range that is comfortable for humans. To tackle this problem, a hybrid control method is implemented that can switch between two control techniques: a constant control input and a steady-state solution. The constant control approach is applied when the person is far from the target location and the error is assumed to be eventually constant. The controller switches to the steady-state approach when the person comes within a specific range of the goal position. Both strategies take human velocity feedback into account. This hybrid technique improves the outcome by assisting the person in reaching the desired location while reducing unwanted disturbance to the human throughout the process, thereby keeping the interaction between the robot and the subject smooth.
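The switching logic described above can be sketched in a few lines. All gains, the switching radius, and the human-velocity weighting below are invented for illustration and are not the authors' tuned values.

```python
def hybrid_velocity_command(error, v_human, switch_radius=0.5,
                            u_const=1.0, k_ss=2.0, k_h=0.1):
    """Hybrid controller: constant input far from the goal, proportional
    steady-state law near it, plus human velocity feedback."""
    if abs(error) > switch_radius:
        u = u_const if error > 0 else -u_const   # constant-input mode
    else:
        u = k_ss * error                          # steady-state mode
    return u + k_h * v_human                      # human velocity feedback
```

Iterating `position += dt * command` drives the error to zero: the constant mode covers the long approach, then the proportional mode removes the residual error smoothly.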

Keywords: unmanned aerial vehicle, tethered system, physical human-robot interaction, hybrid control

Procedia PDF Downloads 98
9308 Animal Models of Surgical or Other External Causes of Trauma Wound Infection

Authors: Ojoniyi Oluwafeyekikunmi Okiki

Abstract:

Notwithstanding advances in traumatic wound care and control, infections remain a major cause of mortality, morbidity, and financial disruption in tens of millions of wound sufferers around the world. Animal models have become popular tools for analyzing a wide variety of external traumatic wound infections and for testing new antimicrobial techniques. This review covers experimental infections in animal models of surgical wounds, skin abrasions, burns, lacerations, excisional wounds, and open fractures. Animal models of external traumatic wound infections reported by different investigators vary in the animal species used, the microorganism strains, the quantity of microorganisms applied, the sizes of the wounds, and, for burn infections, the length of time the heated object or liquid is in contact with the skin. As antibiotic resistance continues to grow, new antimicrobial approaches are urgently needed. These should be tested using standard protocols for infections in external traumatic wounds in animal models.

Keywords: surgical wounds, animals, wound infections, burns, wound models, colony-forming units, lacerated wounds

Procedia PDF Downloads 14
9307 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models, focusing on three aspects: (i) model assumptions and statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission's proposed Regulation on Artificial Intelligence.
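A KPI such as PoCE can be illustrated generically. The paper defines its own KPIs; the sketch below shows only one plausible reading, counting the fraction of instances for which an explainer reproduces a reference feature-importance ordering, with invented feature names.

```python
def poce(reference_orders, explainer_orders):
    """Percentage of Correct Explanations: share of instances where the
    explainer's feature-importance order matches the reference order."""
    matches = sum(1 for ref, exp in zip(reference_orders, explainer_orders)
                  if ref == exp)
    return 100.0 * matches / len(reference_orders)
```

A per-feature variant, as in the proposed framework, would score each feature's rank and sign separately instead of the whole ordering.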

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 95
9306 Taiwanese Pre-Service Elementary School EFL Teachers’ Perception and Practice of Station Teaching in English Remedial Education

Authors: Chien Chin-Wen

Abstract:

Collaborative teaching has different teaching models, and station teaching is one type of collaborative teaching. Station teaching is not commonly practiced in elementary school English education, nor is it commonly introduced in language teacher education programs in Taiwan. In station teaching, each teacher takes a small part of the instructional content and works with a small number of students. Students rotate between stations, where they receive assignments and instruction from different teachers. The teachers provide the same content to each group, but the instructional method can vary based upon the needs of each group of students. This study explores thirty-four Taiwanese pre-service elementary school English teachers' knowledge about station teaching and the competence they demonstrated in designing activities for, and delivering, station teaching in an English remedial education program for six sixth graders in a local elementary school in northern Taiwan. The participants were simultaneously enrolled in an Elementary School English Teaching Materials and Methods class, part of an elementary school teacher education program in a northern Taiwan city. The instructor of this class (Jennifer, pseudonym) collaborated with an English teacher (Olivia, pseudonym) at Maureen Elementary School (pseudonym), an urban elementary school in a northwestern Taiwan city. Four male and two female sixth graders among Olivia's students needed remedial English education; Olivia chose these six students because they were in the lowest 5% of their class in terms of English proficiency. The thirty-four pre-service English teachers signed up for and took turns teaching these six sixth graders every Thursday afternoon from four to five o'clock for twelve weeks. The participants taught in teams of three, except for the last team, which consisted of only two pre-service teachers.
Each team designed a 40-minute lesson plan on the given language focus (words, sentence patterns, dialogue, phonics) of the assigned unit. Data in this study included the KWLA chart, activity designs, and semi-structured interviews. Data collection lasted for four months, from September to December 2014. Data were analyzed as follows. First, all the notes were read and marked with appropriate codes (e.g., I don’t know, co-teaching etc.). Second, tentative categories were labeled (e.g., before, after, process, future implication, etc.). Finally, the data were sorted into topics that reflected the research questions on the basis of their relevance. This study has the following major findings. First of all, the majority of participants knew nothing about station teaching at the beginning of the study. After taking the course Elementary School English Teaching Materials and Methods and after designing and delivering the station teaching in an English remedial education program to six sixth graders, they learned that station teaching is co-teaching, and that it includes activity designs for different stations and students’ rotating from station to station. They demonstrated knowledge and skills in activity designs for vocabulary, sentence patterns, dialogue, and phonics. Moreover, they learned to interact with individual learners and guided them step by step in learning vocabulary, sentence patterns, dialogue, and phonics. However, they were still incompetent in classroom management, time management, English, and designing diverse and meaningful activities for elementary school students at different English proficiency levels. 
Hence, language teacher education programs are recommended to integrate station teaching to help pre-service teachers be equipped with eight knowledge and competences, including linguistic knowledge, content knowledge, general pedagogical knowledge, curriculum knowledge, knowledge of learners and their characteristics, pedagogical content knowledge, knowledge of education content, and knowledge of education’s ends and purposes.

Keywords: co-teaching, competence, knowledge, pre-service teachers, station teaching

Procedia PDF Downloads 428
9305 Probabilistic Models to Evaluate Seismic Liquefaction in Gravelly Soil Using Dynamic Penetration Test and Shear Wave Velocity

Authors: Nima Pirhadi, Shao Yong Bo, Xusheng Wan, Jianguo Lu, Jilei Hu

Abstract:

Although gravels and gravelly soils are often assumed to be non-liquefiable because of their high conductivity and small modulus, the occurrence of liquefaction in these soils during historical earthquakes, especially the recent 2008 Wenchuan (Mw = 7.9), 2014 Cephalonia, Greece (Mw = 6.1), and 2016 Kaikoura, New Zealand (Mw = 7.8) events, has prompted essential consideration of risk assessment and hazard analysis for seismic gravelly soil liquefaction. Due to the limitations in sampling and laboratory testing of this type of soil, in situ tests and site exploration of case histories are the most accepted procedures. Of all in situ tests, the dynamic penetration test (DPT), well known as the Chinese dynamic penetration test, and the shear wave velocity (Vs) test have demonstrated high performance in evaluating seismic gravelly soil liquefaction. However, the lack of a sufficient number of case histories is an essential limitation for developing new models. This study first investigates recent earthquakes that caused liquefaction in gravelly soils in order to collect new data. It then adds these data to the dataset available in the literature to extend it, and finally develops new models to assess seismic gravelly soil liquefaction. To validate the presented models, their results are compared to those of other available models. The results show the reasonable performance of the proposed models and the critical effect of gravel content (GC, %) on the assessment.
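Probabilistic liquefaction models of this kind are typically logistic functions of the in situ measurements. The sketch below is purely illustrative: the coefficients are invented placeholders, not the paper's fitted values, and only the signs (liquefaction probability falling with DPT blow count and Vs, rising with gravel content per the authors' finding) are meant to be indicative.

```python
import math

def liquefaction_probability(n_dpt, vs, gc,
                             b0=4.0, b1=-0.15, b2=-0.01, b3=0.03):
    """Hypothetical logistic model: probability of liquefaction from DPT
    blow count n_dpt, shear wave velocity vs (m/s), and gravel content gc (%).
    Coefficients b0..b3 are placeholders, not fitted values."""
    z = b0 + b1 * n_dpt + b2 * vs + b3 * gc
    return 1.0 / (1.0 + math.exp(-z))
```

Fitting such a model to case histories would estimate b0..b3 by maximum likelihood over liquefied/non-liquefied site records.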

Keywords: liquefaction, gravel, dynamic penetration test, shear wave velocity

Procedia PDF Downloads 201
9304 Juxtaposing South Africa’s Private Sector and Its Public Service Regarding Innovation Diffusion, to Explore the Obstacles to E-Governance

Authors: Petronella Jonck, Freda van der Walt

Abstract:

Despite the benefits of innovation diffusion in the South African public service, implementation thereof seems to be problematic, particularly with regard to e-governance which would enhance the quality of service delivery, especially accessibility, choice, and mode of operation. This paper reports on differences between the public service and the private sector in terms of innovation diffusion. Innovation diffusion will be investigated to explore identified obstacles that are hindering successful implementation of e-governance. The research inquiry is underpinned by the diffusion of innovation theory, which is premised on the assumption that innovation has a distinct channel, time, and mode of adoption within the organisation. A comparative thematic document analysis was conducted to investigate organisational differences with regard to innovation diffusion. A similar approach has been followed in other countries, where the same conceptual framework has been used to guide document analysis in studies in both the private and the public sectors. As per the recommended conceptual framework, three organisational characteristics were emphasised, namely the external characteristics of the organisation, the organisational structure, and the inherent characteristics of the leadership. The results indicated that the main difference in the external characteristics lies in the focus and the clientele of the private sector. With regard to organisational structure, private organisations have veto power, which is not the case in the public service. Regarding leadership, similarities were observed in social and environmental responsibility and employees’ attitudes towards immediate supervision. Differences identified included risk taking, the adequacy of leadership development, organisational approaches to motivation and involvement in decision making, and leadership style. 
Due to the organisational differences observed, it is recommended that differentiated strategies be employed to ensure effective innovation diffusion, and ultimately e-governance. It is recommended that the results of this research be used to stimulate discussion on ways to improve collaboration between the mentioned sectors, to capitalise on the benefits of each sector.

Keywords: E-governance, ICT, innovation diffusion, comparative analysis

Procedia PDF Downloads 355
9303 Predictive Models for Compressive Strength of High Performance Fly Ash Cement Concrete for Pavements

Authors: S. M. Gupta, Vanita Aggarwal, Som Nath Sachdeva

Abstract:

This paper reports experimental work conducted on high performance concrete (HPC) with superplasticizer, with the aim of developing models suitable for predicting the compressive strength of HPC mixes. In this study, the effect of varying proportions of fly ash (0% to 50%, at 10% increments) on the compressive strength of high performance concrete has been evaluated. The mix designs studied were M30, M40, and M50, to compare the effect of fly ash addition on the properties of these concrete mixes. In all, eighteen concrete mixes were designed: three as conventional concretes for the three grades under discussion and fifteen as HPC with varying percentages of fly ash. The concrete mix design was done in accordance with the Indian standard recommended guidelines, IS: 10262. All concrete mixes were studied in terms of compressive strength at 7 days, 28 days, 90 days, and 365 days. All materials were kept the same throughout the study to allow a direct comparison of results. The models for compressive strength prediction were developed using linear regression (LR), artificial neural networks (ANN), and leave-one-out validation (LOOV).
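The leave-one-out validation (LOOV) of a linear strength model can be sketched as follows, here for a single predictor (fly ash percentage) and invented strength values; the study's actual models use more inputs.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b * x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def loo_rmse(xs, ys):
    """Leave-one-out validation: refit on all-but-one point,
    predict the held-out point, and report the RMSE."""
    sq_errs = []
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        sq_errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return (sum(sq_errs) / len(sq_errs)) ** 0.5
```

A low leave-one-out RMSE indicates the regression generalises beyond the mixes it was fitted on.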

Keywords: high performance concrete, fly ash, concrete mixes, compressive strength, strength prediction models, linear regression, ANN

Procedia PDF Downloads 446
9302 A Case Study on Smart Energy City of the UK: Based on Business Model Innovation

Authors: Minzheong Song

Abstract:

The purpose of this paper is to examine the case of smart energy evolution in the UK, along with government projects and the smart city project 'Smart London Plan (SLP)' of 2013, through the logic of business model innovation (BMI). To this end, it discusses the theoretical logic and formulates a research framework for the evolution of smart energy from silo to integrated system. The starting point is the silo system with no connections; the second stage comprises private investment in smart meters, smart grid implementation, the energy-water nexus, adaptive smart grid systems, and the building of marketplaces with platform leadership. The results show that the UK's smart energy sector has evolved from smart meter installation through smart grids to new business models such as the water-energy nexus and microgrid services within the smart energy city system.

Keywords: smart city, smart energy, business model, business model innovation (BMI)

Procedia PDF Downloads 162
9301 Evaluating the Suitability and Performance of Dynamic Modulus Predictive Models for North Dakota’s Asphalt Mixtures

Authors: Duncan Oteki, Andebut Yeneneh, Daba Gedafa, Nabil Suleiman

Abstract:

Most agencies lack the equipment required to measure the dynamic modulus (|E*|) of asphalt mixtures, necessitating the use of predictive models. This study compared measured |E*| values for nine North Dakota asphalt mixes with predictions from the original Witczak, modified Witczak, and Hirsch models. The influence of temperature on the |E*| models was investigated, and Pavement ME simulations were conducted using both measured |E*| values and predictions from the most accurate |E*| model. The results revealed that the original Witczak model yielded the lowest Se/Sy and highest R² values, indicating the lowest bias and highest accuracy, while the poorest overall performance was exhibited by the Hirsch model. Using predicted |E*| as input to Pavement ME generated conservative distress predictions compared to using measured |E*|. The original Witczak model is recommended for predicting |E*| for low-reliability pavements in North Dakota.
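The Se/Sy and R² criteria used to rank the models can be computed as below. This is a generic goodness-of-fit sketch with invented measured/predicted pairs, and the degrees-of-freedom convention for Se may differ from the study's.

```python
def goodness_of_fit(measured, predicted):
    """Return (Se/Sy, R^2) for predicted versus measured values.
    Se: standard error of estimate; Sy: standard deviation of measurements."""
    n = len(measured)
    mean_m = sum(measured) / n
    sse = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    sst = sum((m - mean_m) ** 2 for m in measured)
    se = (sse / (n - 1)) ** 0.5
    sy = (sst / (n - 1)) ** 0.5
    return se / sy, 1.0 - sse / sst
```

A lower Se/Sy and a higher R² mean less bias and more accuracy, the criteria by which the original Witczak model was preferred here.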

Keywords: asphalt mixture, binder, dynamic modulus, MEPDG, pavement ME, performance, prediction

Procedia PDF Downloads 49
9300 Model of a Context-Aware Middleware for Mobile Workers

Authors: Esraa Moustafa, Gaetan Rey, Stephane Lavirotte, Jean-Yves Tigli

Abstract:

With the development of the Internet of Things and the Web of Things, computing becomes more pervasive, invisible, and present everywhere. In our environment, we are surrounded by multiple devices that deliver (web) services meeting the needs of users. However, the mobility of these devices, as well as of the users, has important repercussions that challenge the software design of these applications, because the variability of the environment cannot be anticipated at design time. It is therefore desirable to dynamically discover the environment and adapt the application during execution to new contextual conditions. We propose a model of a context-aware middleware that addresses this issue through a monitoring service capable of reasoning and observation channels capable of calculating the context at runtime. The monitoring service evaluates the pre-defined X-Query predicates in the context manager and uses Prolog to deduce the services needed in response. An independent observation channel for each distinct predicate is then dynamically generated by the monitoring service, depending on the current state of the environment. Each channel sends its result directly to the context manager, which consequently calculates the context based on all the predicates' results while preserving the reactivity of the self-adaptive system.
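The predicate-to-channel flow can be caricatured as follows. This toy replaces X-Query and Prolog with plain Python callables over a context dictionary; all names and structure are invented for illustration only.

```python
class MonitoringService:
    """Toy stand-in for the middleware: each registered predicate gets its
    own dynamically created observation channel, and channel results are
    fed to a context dictionary standing in for the context manager."""

    def __init__(self):
        self.predicates = {}  # name -> callable(environment) -> bool
        self.channels = {}    # name -> list of observed results

    def register(self, name, predicate):
        self.predicates[name] = predicate

    def evaluate(self, environment):
        context = {}
        for name, predicate in self.predicates.items():
            # One independent channel per predicate, generated on demand.
            channel = self.channels.setdefault(name, [])
            result = predicate(environment)
            channel.append(result)   # channel reports to the context manager
            context[name] = result
        return context
```

Re-evaluating on every environment change keeps the context current, mimicking the reactivity the middleware aims to preserve.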

Keywords: auto-adaptation, context-awareness, middleware, reasoning engine

Procedia PDF Downloads 251
9299 Approaches to Ethical Hacking: A Conceptual Framework for Research

Authors: Lauren Provost

Abstract:

The digital world remains increasingly vulnerable, making the development of effective cybersecurity approaches ever more critical to supporting the success of the digital economy and national security. Although approaches to cybersecurity have shifted and improved in the last decade with new models, especially with cloud computing and mobility, a record number of high-severity vulnerabilities was recorded in the National Institute of Standards and Technology's (NIST) National Vulnerability Database (NVD) in 2020. This is due, in part, to the increasing complexity of cyber ecosystems. Security must be approached with a more comprehensive, multi-tool strategy that addresses the complexity of cyber ecosystems, including the human factor. Ethical hacking has emerged as such an approach: a more effective, multi-strategy, comprehensive approach to cybersecurity's most pressing needs, especially understanding the human factor. Research on ethical hacking, however, is limited in scope. The two main objectives of this work are to (1) provide highlights of case studies in ethical hacking and (2) provide a conceptual framework for research in ethical hacking that embraces and addresses both technical and nontechnical security measures. Recommendations include an improved conceptual framework for research centered on ethical hacking that addresses the many factors and attributes of significant attacks threatening computer security, and a more robust, integrative, multi-layered framework embracing the complexity of cybersecurity ecosystems.

Keywords: ethical hacking, literature review, penetration testing, social engineering

Procedia PDF Downloads 220
9298 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling

Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai

Abstract:

Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, meeting these obligations besides working a job can feel exhausting and overwhelming. Working mostly at home and not having to commute to the company can thus save valuable time and stress. This study aims to show the influence that the working model has on the job satisfaction of employees with care responsibilities, in comparison to employees who do not have such obligations. Using structural equation modeling (SEM), the three work models, "work from home", "working remotely", and a hybrid model, have been analyzed based on 13 constructs that influence job satisfaction. These 13 factors have been further summarized into three groups: "classic influencing factors", "influencing factors changed by remote working", and "new remote working influencing factors". Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is "identification with the work" (β = 0.540), followed by "appreciation" (β = 0.151), "compensation" (β = 0.124), "work-life balance" (β = 0.116), and "communication and exchange of information" (β = 0.105).
While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that among employees with care responsibilities, the higher the proportion of working from home relative to working from the office, the more satisfied the employees are with their job. Since work models that accommodate comprehensive care responsibilities led to higher job satisfaction among employees with such obligations, adapting to these private obligations can be crucial to a company's sustained success. Conversely, the satisfaction level with the office-based working model is higher for workers without caregiving responsibilities.
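The Cronbach's alpha reliability check reported above can be reproduced generically; the scores below are invented, not the survey data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    items: one list of scores per item, all over the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Per-respondent totals across all items of the construct.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(it) for it in items) / var(totals))
```

Values around 0.7 or above are conventionally read as acceptable reliability, which is presumably the "suitable" threshold meant above.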

Keywords: care responsibilities, home office, job satisfaction, structural equation modeling

Procedia PDF Downloads 84
9297 Circular Economy Maturity Models: A Systematic Literature Review

Authors: Dennis Kreutzer, Sarah Müller-Abdelrazeq, Ingrid Isenhardt

Abstract:

Resource scarcity, energy transition and the planned climate neutrality pose enormous challenges for manufacturing companies. In order to achieve these goals and a holistic sustainable development, the European Union has listed the circular economy as part of the Circular Economy Action Plan. In addition to a reduction in resource consumption, reduced emissions of greenhouse gases and a reduced volume of waste, the principles of the circular economy also offer enormous economic potential for companies, such as the generation of new circular business models. However, many manufacturing companies, especially small and medium-sized enterprises, do not have the necessary capacity to plan their transformation. They need support and strategies on the path to circular transformation, because this change affects not only production but also the entire company. Maturity models offer an approach, as they enable companies to determine the current status of their transformation processes. In addition, companies can use the models to identify transformation strategies and thus promote the transformation process. While maturity models are established in other areas, e.g. IT or project management, only a few circular economy maturity models can be found in the scientific literature. The aim of this paper is to analyse the identified maturity models of the circular economy through a systematic literature review (SLR) and, besides other aspects, to check their completeness as well as their quality. Since the terms "maturity model" and "readiness model" are often used to assess the transformation process, this paper considers both types of models to provide a more comprehensive result. For this purpose, circular economy maturity models at the company (micro) level were identified from the literature, compared, and analysed with regard to their theoretical and methodological structure. 
A specific focus was placed, on the one hand, on the analysis of the business units considered in the respective models and, on the other hand, on the underlying metrics and indicators used to determine the maturity level of the entire company. The results of the literature review show, for instance, significant differences in how holistic the assessment frameworks are. Only a few models cover the entire company, including supporting areas outside the value-creating core process, e.g., strategy and vision. Additionally, there are large differences in the number and type of indicators as well as in their metrics. For example, most models rely mainly on subjective indicators and use very few objective indicators in their surveys. It was also found that well-founded thresholds between maturity levels are rare. Based on these results, concrete ideas and proposals for a research agenda in the field of circular economy maturity models are made.

Keywords: maturity model, circular economy, transformation, metric, assessment

Procedia PDF Downloads 114
9296 Assessing the Impact of Covid-19 Pandemic on Waste Management Workers in Ghana

Authors: Mensah-Akoto Julius, Kenichi Matsui

Abstract:

This paper examines the impact of COVID-19 on waste management workers in Ghana. A questionnaire survey was conducted among 60 waste management workers in the Accra metropolis, the capital region of Ghana, to understand the impact of the COVID-19 pandemic on waste generation, workers’ safety in collecting solid waste, and service delivery. To identify correlations between the pandemic and the safety of waste management workers, a regression analysis was used. Regarding waste generation, the results show that the pandemic led to the highest annual per capita solid waste generation, or 3,390 tons, in 2020. Regarding the safety of workers, the regression analysis shows a significant and inverse association between COVID-19 and waste management services, meaning that contaminated waste may infect field workers with COVID-19 through direct exposure. A rise in new infection cases would negatively affect the safety and service delivery of the workers. The results also show that an increase in economic activities negatively impacts waste management workers. The analysis, however, finds no statistical relationship between workers’ service delivery and their salaries. The study then discusses how municipal waste management authorities can ensure safe and effective waste collection during the pandemic.
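The regression step described above can be sketched in a few lines. The figures below are invented for illustration (the study's actual survey data are not reproduced here), but the shape of the analysis is the same idea: ordinary least squares relating new infection cases to a service-delivery measure, with the sign of the slope indicating the direction of the association.

```python
import numpy as np

# Hypothetical illustration of the regression step: ordinary least squares
# relating monthly new COVID-19 cases (predictor) to a waste-service
# delivery score (response). All values below are invented.
cases = np.array([120, 340, 560, 810, 1020, 1450], dtype=float)
service_score = np.array([8.1, 7.6, 7.0, 6.2, 5.9, 5.1])  # higher = better delivery

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(cases), cases])
(intercept, slope), *_ = np.linalg.lstsq(X, service_score, rcond=None)

print(f"slope = {slope:.5f}")  # a negative slope: more cases, worse service
```

A negative fitted slope on real survey data would correspond to the "significant and inverse association" reported in the abstract.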

Keywords: Covid-19, waste management worker, waste collection, Ghana

Procedia PDF Downloads 205
9295 First Systematic Review on Aerosol Bound Water: Exploring the Existing Knowledge Domain Using the CiteSpace Software

Authors: Kamila Widziewicz-Rzonca

Abstract:

The presence of PM-bound water as an integral chemical component of suspended aerosol particles (PM) has become one of the hottest issues in recent years. The UN climate summits (COP24) indicate that PM of anthropogenic origin (released mostly from coal combustion) is directly responsible for climate change. Chemical changes at the particle-water interface determine many phenomena occurring in the atmosphere, such as visibility, cloud formation, and precipitation intensity. Since water-soluble particles such as nitrates, sulfates, or sea salt easily become cloud condensation nuclei, they affect the climate, for example, by increasing cloud droplet concentration. Aerosol water is a master component of atmospheric aerosols and the medium that enables all aqueous-phase reactions occurring in the atmosphere. A thorough bibliometric analysis conducted with the CiteSpace software made it possible to identify past trends and possible future directions in measuring aerosol-bound water. This work does not aim to review the existing literature on the topic; rather, it is an in-depth bibliometric analysis exploring existing gaps and new frontiers in PM-bound water research. To assess the major scientific areas related to PM-bound water and clearly define which among them are the most active, we searched the Web of Science databases from 1996 to 2018. We answer the question of which authors, countries, institutions, and aerosol journals most influenced PM-bound water research. The results indicate that the paper with the greatest citation burst was Tang I. N. and Munkelwitz H. R., 'Water activities, densities, and refractive indices of aqueous sulfates and sodium nitrate droplets of atmospheric importance', 1994. The largest number of articles in this specific field was published in Atmospheric Chemistry and Physics.
Among all research institutions, the absolute leader in the quantity of publications is the National Aeronautics and Space Administration (NASA). Meteorology and atmospheric sciences is the category with the most studies in this field. Very few studies on PM-bound water quantitatively measure its presence in ambient particles or its origin; most articles instead treat PM-bound water as an artifact in organic carbon and ion measurements, without any chemical analysis of its content. This scientometric study presents the most current literature regarding particulate-bound water.

Keywords: systematic review, aerosol-bound water, PM-bound water, CiteSpace, knowledge domain

Procedia PDF Downloads 124
9294 Pre-Service Science Teachers’ Attitudes about Teaching Science Courses at the Faculty of Education, Lebanese University: An Exploratory Case Study

Authors: Suzanne El Takach

Abstract:

The research study explored pre-service teachers’ attitudes towards six courses taught in the 3rd to 6th semesters at the Faculty of Education, Lebanese University, during the academic year 2015-2016. The participants assessed science teaching courses that are essential in preparing teachers of science at the primary and elementary levels: Action Research I and II in Teaching Science, New Trends in Teaching Science, Teaching Science I and II for the Elementary Level, and Teaching Science for Early Childhood Education. Qualitative and quantitative data were gathered from (a) a survey questionnaire of 23 closed-ended items, some of Likert-scale type, that collected students’ opinions on the courses in terms of teaching, assessment, and class interaction (N=102 respondents), and (b) a second questionnaire of 10 questions disseminated to a sample of 39 students in their last semester in Science and Mathematics, to learn more about the skills students had gained and their suggestions for new courses and improvements. Students were satisfied with the science teaching courses and reported gaining good pedagogical content knowledge, covering, for example, lesson planning, students’ misconceptions, and the use of various teaching and assessment strategies.

Keywords: assessment in higher education, LMD program, pre-service teachers’ attitudes, pre-PCK skills

Procedia PDF Downloads 148
9293 Design of Data Management Software System Supporting Rendezvous and Docking with Various Spaceships

Authors: Zhan Panpan, Lu Lan, Sun Yong, He Xiongwen, Yan Dong, Gu Ming

Abstract:

The space lab data management system realizes the docking network function between two spacecraft: communication with, and control of, a docking target for various spaceships. To solve the problems of the complex data communication modes between the space lab and various spaceships, and of poor software reuse caused by non-standard protocols, a data management software system supporting rendezvous and docking with various spaceships has been designed. The software system is based on the CCSDS Spacecraft Onboard Interface Services (SOIS). It consists of a Software Driver Layer, a Middleware Layer, and an Application Layer. The Software Driver Layer hides the various device interfaces behind a uniform device driver framework. The Middleware Layer is divided into three layers: a transfer layer, an application support layer, and a system business layer. Communication over the space lab platform bus and the docking bus is realized in the transfer layer. The application support layer provides inter-task communication and unified time management for the software system. The data management functions are realized in the system business layer, which contains the telemetry management service, telecontrol management service, flight status management service, rendezvous and docking management service, and so on. The Application Layer accomplishes the tasks defined by the space lab data management system using the standard interfaces supplied by the Middleware Layer. On the basis of this layered architecture, the rendezvous and docking tasks and the rendezvous and docking management service are independent within the software system. The rendezvous and docking tasks are activated and executed according to the different spaceships. In this way, the communication management functions in the independent flight mode, the combination mode with a manned spaceship, and the combination mode with a cargo spaceship are achieved separately.
The software architecture defines standard application interfaces for the services in each layer. Different requirements of the space lab can be supported through the use of these standard services per layer, effectively improving the scalability and flexibility of the data management software. The system can also dynamically scale to the number of visiting spaceships and adapt to their protocols. The software system has been applied in the data management subsystem of the space lab and has been verified in flight. The research results of this paper can provide a basis for the design of the data management system of a future space station.
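The layered design described above, with a uniform driver interface at the bottom and per-spaceship docking services registered in the business layer, can be illustrated with a minimal sketch. This is not the flight software; all class, service, and frame names below are hypothetical.

```python
# Minimal sketch (not the flight software) of the layered idea: a uniform
# driver interface hides device differences, and docking services are
# registered per visiting-spaceship type so new ships plug in without
# changing the business layer. All names are illustrative.

class BusDriver:
    """Uniform device-driver interface (Software Driver Layer)."""
    def send(self, frame: bytes) -> None:
        raise NotImplementedError

class DockingBusDriver(BusDriver):
    """A concrete driver that records frames sent over the docking bus."""
    def __init__(self):
        self.log = []
    def send(self, frame: bytes) -> None:
        self.log.append(frame)

# System business layer: docking services keyed by spaceship type.
SERVICES = {}

def register(ship_type):
    def deco(fn):
        SERVICES[ship_type] = fn
        return fn
    return deco

@register("manned")
def manned_docking(driver):
    driver.send(b"TM:crew-status")
    return "manned rendezvous service active"

@register("cargo")
def cargo_docking(driver):
    driver.send(b"TM:cargo-manifest")
    return "cargo rendezvous service active"

def activate(ship_type, driver):
    # The matching docking service is activated per visiting ship,
    # leaving the rest of the system untouched.
    return SERVICES[ship_type](driver)

driver = DockingBusDriver()
print(activate("cargo", driver))
```

The registry mirrors the paper's point that rendezvous and docking tasks are independent of the rest of the system: supporting a new spaceship type means registering one more service, not modifying the layers below.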

Keywords: space lab, rendezvous and docking, data management, software system

Procedia PDF Downloads 368
9292 A Study of Issues and Mitigations on Distributed Denial of Service and Medical Internet of Things Devices

Authors: Robin Singh, Jing-Chiou Liou

Abstract:

Internet of Things (IoT) devices are used heavily as part of our everyday routines. Through improved communication and automated procedures, their popularity has helped users raise the quality of their work. In healthcare, these devices are used to better collect patient data for treatment, and they are generally considered safe and secure. However, loopholes may exist that manufacturers need to identify before hackers take advantage of them. For this study, we focused on two medical IoT devices: pacemakers and hearing aids. The aim of this paper is to identify the likelihood of these medical devices being hijacked and used as a botnet in Distributed Denial-of-Service (DDoS) attacks. Moreover, some mitigation strategies are proposed to better secure these devices.

Keywords: cybersecurity, DDoS, IoT, medical devices

Procedia PDF Downloads 86
9291 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria

Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova

Abstract:

Ambient air pollution with fine particulate matter (PM10) is a persistent problem in many countries around the world. The accumulation of a large number of measurements of both PM10 concentrations and the accompanying atmospheric factors allows for statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria, over a period of 5 years are used. Predictors in the models are seven meteorological variables and time variables, as well as PM10 and some meteorological variables lagged by 1 or 2 days with respect to the initial time series. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days beyond the last date in the modeling procedure and show very accurate results.
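The modeling approach can be sketched with scikit-learn's CART implementation. The synthetic series below stands in for the Pleven measurements (which are not reproduced here); the point is the construction of lagged predictors and the two-day-ahead forecast.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's data: daily PM10 driven by temperature
# and yesterday's PM10 (1-day lag), plus noise. Coefficients are invented.
n = 400
temp = rng.normal(10, 8, n)
pm10 = np.empty(n)
pm10[0] = 40.0
for t in range(1, n):
    pm10[t] = 0.6 * pm10[t - 1] - 1.2 * temp[t] + 30 + rng.normal(0, 3)

# Build lagged predictors (here only a 1-day PM10 lag; the study also
# uses 2-day lags and several meteorological variables).
X = np.column_stack([temp[1:], pm10[:-1]])   # [temperature, PM10 lagged 1 day]
y = pm10[1:]

# Fit a CART regressor on all but the last two days, then forecast them.
model = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X[:-2], y[:-2])
forecast = model.predict(X[-2:])             # two-days-ahead style forecast
print(forecast.round(1))
```

On real data, hyperparameters such as tree depth would be chosen by cross-validation, as the paper's keywords suggest.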

Keywords: cross-validation, decision tree, lagged variables, short-term forecasting

Procedia PDF Downloads 196
9290 JaCoText: A Pretrained Model for Java Code-Text Generation

Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri

Abstract:

Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged in automatic programming language code generation, the task of translating natural language instructions into source code. Although well-known pretrained language generation models have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network architecture that aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we build on findings from the state of the art to (1) initialize our model from powerful pretrained models, (2) explore additional pretraining on our Java dataset, (3) conduct experiments combining unimodal and bimodal data in training, and (4) scale the input and output length during the fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.

Keywords: java code generation, natural language processing, sequence-to-sequence models, transformer neural networks

Procedia PDF Downloads 288
9289 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and properties. Generally, fire severity is calculated from the Normalized Burn Ratio (NBR) index. This is traditionally performed manually, by comparing pre-fire and post-fire images: the dNBR is calculated as the bitemporal difference of the preprocessed satellite images, and the area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using the classification levels proposed by the USGS, which comprise seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of automating the above-mentioned process in a tool named the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI is chosen for regular burnt-area severity mapping with a medium-spatial-resolution sensor (15 m). The tool uses machine learning classification techniques to identify burnt areas using the NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns in fire severity mapping; within WWSAT, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool, which includes a Graphical User Interface (GUI) to make it user-friendly.
The advantage of this tool is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate its performance. The Blue Mountains National Park forest affected by the 2019-2020 Australian fire season is used to describe the workflow of WWSAT. At this site, the tool detected a burnt area of more than 7,809 km² using Sentinel-2 data, with an error below 6.5% compared with the area measured in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out: high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak Fire in 2020, were chosen for the second case study. It was found that around 983 km² had burnt out: high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These areas can also be verified by visual inspection of the cloud-free images generated by WWSAT. The tool is cost-effective in calculating the burnt area, since satellite images are free and the cost of field surveys is avoided.
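The NBR/dNBR computation that WWSAT automates can be sketched outside Earth Engine with plain arrays. The band roles and the toy reflectance values below are assumptions for illustration, and only the simple two-class rule from the text is applied (the full USGS scheme subdivides the burnt class into severity levels).

```python
import numpy as np

# Sketch of the NBR/dNBR computation (plain NumPy instead of the Earth
# Engine API). NBR contrasts a near-infrared (NIR) band against a
# shortwave-infrared (SWIR) band.
def nbr(nir, swir):
    return (nir - swir) / (nir + swir)

def classify_dnbr(dnbr):
    # Simplified two-class rule from the text: burnt if dNBR >= 0.1.
    return np.where(dnbr >= 0.1, "burnt", "unburnt")

# Toy pre-/post-fire reflectances for two pixels (invented values).
pre_nir,  pre_swir  = np.array([0.45, 0.40]), np.array([0.15, 0.14])
post_nir, post_swir = np.array([0.20, 0.39]), np.array([0.30, 0.15])

# dNBR is the bitemporal difference: pre-fire NBR minus post-fire NBR.
dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
print(classify_dnbr(dnbr))  # first pixel burnt, second unchanged
```

In the tool itself, the same difference is computed over cloud-free pre- and post-fire composites rather than individual pixels.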

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 238
9288 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm

Authors: Zachary Huffman, Joana Rocha

Abstract:

Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) developed over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be derived from the pressure fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those of Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS equations for the pressure fluctuations and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol’yakov, and Rackl and Weston models) were derived by modifying these early models or from physical principles. Overall, these models have had varying levels of accuracy: in general, they are most accurate at the specific Reynolds and Mach numbers for which they were developed, and less accurate under other flow conditions. Despite this, research into alternative methods for deriving such models has been rather limited. More recent studies have demonstrated that an artificial neural network model can be more accurate than traditional models and applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R, using TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through the process of feature selection), and it is computationally faster than machine learning.
The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
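The study itself works in R on wind-tunnel data. As an illustration of the same idea, forward stepwise feature selection can be sketched in Python on synthetic data, where the response depends on only two of four candidate predictors and the irrelevant ones should be filtered out automatically.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic analogue of the setup (the study used R and wind-tunnel PSD
# data): y depends on features 0 and 2; features 1 and 3 are irrelevant
# and should be dropped by the selection step.
X = rng.normal(size=(200, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Forward stepwise selection: add one feature at a time, keeping the
# candidate that most improves cross-validated fit.
selector = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=2, direction="forward"
).fit(X, y)
print(selector.get_support())  # mask of the features that survived
```

The overfitting risk noted above is mitigated here by the cross-validation built into the selection loop; on a small dataset a held-out comparison, as the paper performs, remains essential.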

Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations

Procedia PDF Downloads 135
9287 Evaluation of Free Technologies as Tools for Business Process Management

Authors: Julio Sotomayor, Daniel Yucra, Jorge Mayhuasca

Abstract:

The article presents an evaluation of free technologies for business process automation, with emphasis only on tools compatible with the GNU General Public License (GPL). The compendium of technologies was assembled to promote a service-oriented enterprise architecture (SOA) and the establishment of a business process management system (BPMS). The methodology for the selection of tools was Agile UP. This proposal allows businesses to achieve technological sovereignty and independence, in addition to promoting service orientation and the development of free, component-based software.

Keywords: BPM, BPMS suite, open-source software, SOA, enterprise architecture, business process management

Procedia PDF Downloads 291
9286 Determinants of Conference Service Quality as Perceived by International Attendees

Authors: Shiva Hashemi, Azizan Marzuki, S. Kiumarsi

Abstract:

In recent years, conference destinations have become highly competitive; it is therefore necessary to understand the behaviour of conference participants, such as their decision-making process and their assessment of perceived conference quality. A conceptual research framework based on the Theory of Planned Behaviour model is presented in this research to better understand the factors that influence this behaviour. The study highlights key factors, identified in previous studies, through which the quality of a conference affects participants' behavioural intentions. It therefore suggests that conference participants should be encouraged to contribute to the quality of the conference and, in turn, to behavioural intention.

Keywords: conference, attendees, service quality, perceived value, trust, behavioural intention

Procedia PDF Downloads 318
9285 Getting Out: A Framework for Exiting/Escaping Sex Trafficking

Authors: Amanda Noble

Abstract:

The process of exiting/escaping situations of sex trafficking can be arduous and fraught with numerous barriers. In this paper the results of a national Canadian study on escaping situations of sex trafficking is discussed. Surveys and focus groups were conducted with 201 stakeholders in 8 cities, including 50 survivors of sex trafficking, service providers, health care providers and police. The results show that survivors are both vulnerable to being exploited and experience barriers to exiting as a result of structural factors such as colonialism, poverty, and discrimination based on race and gender. Survivors also face numerous barriers within various systems such as child welfare and the legal system. In addition, survivors contend with multiple psychological and psychosocial factors when exiting including the trauma bond, complex trauma and mental health concerns, substance use, isolation, and adjusting to ‘mainstream’ life. In light of these factors, the service needs of survivors escaping sex trafficking are discussed, and promising practices, such as trauma-informed practice and working from a stages of change model are outlined. This paper is useful for service providers that work with survivors, policy makers, or anyone who has ever wondered why survivors that are not being physically detained don’t ‘just leave’ or escape their exploitative situations.

Keywords: barriers, exiting, promising practices, sex trafficking

Procedia PDF Downloads 97
9284 Extension of a Competitive Location Model Considering a Given Number of Servers and Proposing a Heuristic for Solving

Authors: Mehdi Seifbarghy, Zahra Nasiri

Abstract:

The competitive location problem deals with locating new facilities to provide a service (or goods) to the customers of a given geographical area in which other facilities (competitors) offering the same service are already present. The new facilities have to compete with the existing facilities for market share. This paper proposes a new model to maximize market share, in which customers choose facilities based on traveling time, waiting time, and attractiveness. The attractiveness of a facility is treated as a parameter of the model. A heuristic is proposed to solve the problem.
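One common way to formalize the capture mechanism described, customers choosing facilities by travel time, waiting time, and attractiveness, is a Huff-style gravity model; the sketch below uses that form, though the paper's exact formulation may differ, and all numbers are invented.

```python
import numpy as np

# Huff-style sketch of market-share capture: each customer patronizes
# facilities in proportion to utility = attractiveness / (travel time +
# waiting time). The paper's exact choice rule may differ.
def market_share(travel, waiting, attract):
    """travel: (customers x facilities) times; waiting, attract: per facility."""
    utility = attract / (travel + waiting)
    probs = utility / utility.sum(axis=1, keepdims=True)  # choice probabilities
    return probs.mean(axis=0)  # average share captured per facility

travel  = np.array([[5.0, 10.0], [8.0, 4.0]])  # two customers, two facilities
waiting = np.array([2.0, 2.0])
attract = np.array([1.0, 2.0])

print(market_share(travel, waiting, attract).round(3))
```

A location heuristic such as the one proposed would then search over candidate sites for the new facilities, evaluating each candidate set with a share function of this kind.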

Keywords: competitive location, market share, facility attractiveness, heuristic

Procedia PDF Downloads 524
9283 Human Resource Utilization Models for Graceful Ageing

Authors: Chuang-Chun Chiou

Abstract:

In this study, a systematic framework for graceful ageing is used to explore possible human resource utilization models for the purpose of graceful ageing. The framework is based on Chinese culture and defines a 'Nine-old' target: ageing gracefully with feeding, accomplishment, usefulness, learning, entertainment, care, protection, dignity, and termination. This study focuses on two of these areas: accomplishment and usefulness. We examine current initiatives and laws promoting labor participation, that is, how to increase the labor force participation rate of the middle-aged as well as the elderly, and how to help the elderly achieve graceful ageing. We then present possible models that support graceful ageing.

Keywords: human resource utilization model, labor participation, graceful ageing, employment

Procedia PDF Downloads 390