Search results for: project lead time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24419

20069 Increased Circularity in Metals Production Using the Ausmelt TSL Process

Authors: Jacob Wood, David Wilson, Stephen Hughes

Abstract:

The Ausmelt Top Submerged Lance (TSL) Process has been widely applied for the processing of both primary and secondary copper, nickel, lead, tin, and zinc-bearing feed materials. Continual development and evolution of the technology over more than 30 years have resulted in a more intense smelting process with higher energy efficiency, improved metal recoveries, lower operating costs, and reduced fossil fuel consumption. This paper covers a number of recent advances in the technology, highlighting their positive impacts on smelter operating costs, environmental performance, and contribution towards increased circularity in metals production.

Keywords: Ausmelt TSL, smelting, circular economy, energy efficiency

Procedia PDF Downloads 224
20068 The Impact of Artificial Intelligence on Journalism and Mass Communication

Authors: Saad Zagloul Shokri Melika

Abstract:

The London College of Communication is one of the few universities in the world to offer a lifestyle journalism master's degree. A hybrid originally constructed largely out of a generic journalism program crossed with numerous cultural studies approaches, the degree has developed into a leading lifestyle journalism education attracting students worldwide. This research project seeks to present a framework for structuring the degree as well as to understand how students in this emerging field of study value the program. While some researchers have addressed questions about journalism and higher education, none have looked specifically at the increasingly important genre of lifestyle journalism, which Folker Hanusch defines as including notions of consumerism and critique, among other identifying traits. Lifestyle journalism, itself poorly researched by scholars, can relate to topics including travel, fitness, and entertainment, and as such, a lifestyle journalism degree should arguably prepare students to engage with these topics. This research uses the existing Master of Arts in Lifestyle Journalism at the London College of Communication as a case study to examine the school's approach. Furthering Hanusch's original definition, this master's program attempts to characterize lifestyle journalism by a specific voice or approach, as reflected in the diversity of students' final projects. This framework echoes the ethos and ideas of the university, which focuses on creativity, design, and experimentation. By analyzing the current degree as well as student feedback, this research aims to assist future educators in pursuing the often neglected field of lifestyle journalism. Through a discovery of the unique mix of practical coursework, theoretical lessons, and the broad scope of student work presented in this degree program, the researchers strive to develop a framework for lifestyle journalism education, referring to Mark Deuze's ten questions for journalism education development. While Hanusch began the discussion to legitimize the study of lifestyle journalism, this project strives to go one step further and open up a discussion about the teaching of lifestyle journalism at the university level.

Keywords: journalism, accountability, education, television, public, dearth, investigative journalism, Nigeria, journalism education, lifestyle, university

Procedia PDF Downloads 22
20067 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students

Authors: Samah Senbel

Abstract:

Database design is a fundamental part of the computer science and information technology curricula in any school, as well as in the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor, one to seven graduate students and one to 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and all were employed in computer-related or management-related jobs. The textbook used was 'Database Systems: Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first six weeks covered the design aspect of a database, followed by a paper exam. The next six weeks covered the implementation aspect of the database using SQL, followed by a lab exam. Since the undergraduate students are on a 16-week semester, we spent the last three weeks of the course covering NoSQL; this part of the course was not included in this study. After the course was over, we analyzed the results of the two groups of students. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% for the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and that of the undergraduates was 75%. Based on these results, we concluded that having both classes follow the same time schedule was not beneficial, and an adjustment was needed: the graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of the time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the exact same exams as the previous year. This resulted in an improvement in their average grade on the design part from 77% to 83%, and also in their implementation average grade, from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that seems to be the part they struggle with, even though they have a higher understanding of the design component of databases.
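
As a quick arithmetic check, the reported overall averages are consistent with an equal weighting of the two part exams. A minimal sketch (illustrative only; the equal weighting is an assumption, not stated in the abstract):

```python
# Check how the reported part grades relate to the reported overall
# averages for the two 2018 cohorts, assuming equal exam weighting.
grades_2018 = {
    "graduate":      {"design": 92, "implementation": 65, "overall": 78},
    "undergraduate": {"design": 77, "implementation": 73, "overall": 75},
}

for cohort, g in grades_2018.items():
    mean_of_parts = (g["design"] + g["implementation"]) / 2
    print(f"{cohort}: mean of parts = {mean_of_parts:.1f}%, "
          f"reported overall = {g['overall']}%")
# graduate: mean of parts = 78.5%, reported overall = 78%
# undergraduate: mean of parts = 75.0%, reported overall = 75%
```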

Keywords: computer science education, database design, graduate and undergraduate students, pedagogy

Procedia PDF Downloads 110
20066 Concept, Design and Implementation of Power System Component Simulator Based on Thyristor Controlled Transformer and Power Converter

Authors: B. Kędra, R. Małkowski

Abstract:

This paper presents information on the Power System Component Simulator, a device designed for the LINTE^2 laboratory of Gdansk University of Technology in Poland. We first provide introductory information on the Power System Component Simulator and its capabilities. Then, the concept of the unit is presented: the requirements for the unit are described, and the proposed and implemented functions are listed. Implementation details are given. The hardware structure is presented and described, along with information about the communication interface used, the data maintenance and storage solution, and the Simulink Real-Time features employed. A list and description of all measurements is provided, and the potential for modifications of the laboratory setup is evaluated. Lastly, the results of experiments performed using the Power System Component Simulator are presented, including simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, and time characteristics of groups of different load units in a chosen area.

Keywords: power converter, Simulink Real-Time, Matlab, load, tap controller

Procedia PDF Downloads 229
20065 Community Participation in Health Related Activities in Ignié-Ngabé-Mayama Health District, Brazzaville, Republic of Congo

Authors: Tebeu Pierre Marie

Abstract:

Introduction: WHO defines community participation as a process in which the local population takes responsibility for planning for their health, participates in developing strategies for their implementation, and thereby gains access to physical, moral, and social well-being. For the purpose of dealing with health, community participation is organized through a health center committee leader (HCCL/COSA) for each integrated health center and a district hospital committee leader (DHCL/COGES) for the district hospital. Little is known about the effective participation of the community in health-related activities in the Ignié-Ngabé-Mayama health district. Objective: This study aimed at assessing the involvement of the community in the running of the health system in the Ignié-Ngabé-Mayama health district. Methods: This was a qualitative cross-sectional study conducted in the Ignié-Ngabé-Mayama health district from 15 December 2020 to 30 April 2021. The study population consisted of 10 HCCL and one DHCL. Data were collected using a pretested questionnaire validated by the investigating team. The variables of interest were the effective existence of the HCCL/DHCL, their involvement in health-related activities, their financial management, their planning of activities, and their leadership. Results: A total of 11 participants were interviewed, including 10 HCCL and 1 DHCL. The sex ratio was 9/11; 6/11 had primary-level education, and most (6/11) were farmers. Regarding the involvement of the HCCL/DHCL in health promotion and preventive activities, this was effective for only two of them (2/11). Regarding the barriers to their involvement, the leaders reported the lack of financial support from the state and the lack of NGO support. Additionally, they reported having been very active when there was a Performance-Based Financing (PBF) project in the district. Conclusion: Only two of the 11 committees (HCCL/DHCL) were really functioning. The reported barriers to their running were the lack of state/NGO support and the ending of the PBF project. There is a need to organize a tripartite forum including the state, NGOs, and the community to boost community participation in health-related activities in the Ignié-Ngabé-Mayama health district.

Keywords: health district committee, health Centre committee, community participation, Brazzaville, Congo

Procedia PDF Downloads 155
20064 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Based on the fact that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modelling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behaviour of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected Central and East European countries. The empirical analysis demonstrates that the Markov regime-switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
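
For reference, the regime-switching variance equation underlying such a model can be sketched as follows (standard notation; the two-state specification is an assumption, not necessarily the authors' exact one):

```latex
\sigma_t^2 = \omega_{s_t} + \alpha_{s_t}\,\varepsilon_{t-1}^2 + \beta_{s_t}\,\sigma_{t-1}^2,
\qquad s_t \in \{1, 2\},
\qquad p_{ij} = P\left(s_t = j \mid s_{t-1} = i\right),
```

so each regime carries its own persistence α_s + β_s, which is what removes the spuriously high persistence that a uni-regime GARCH exhibits under structural breaks.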

Keywords: central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities

Procedia PDF Downloads 216
20063 An Approach to Highly Scalable Production Capacity by Adaptation of the Concept 'Everything as a Service'

Authors: Johannes Atug, Stefan Braunreuther, Gunther Reinhart

Abstract:

Volatile markets, as well as increasing global competition in manufacturing, lead to a high demand for flexible and agile production systems. These advanced production systems, in turn, entail high capital expenditure along with high investment risks. Developments in production regarding digitalization and cyber-physical systems are resulting in a merger of information technology and operational technology. The approach of this paper is to benefit from this merger and present a framework for a production network with scalable production capacity and low capital expenditure by adapting the IT concept 'everything as a service' to the production environment.

Keywords: digital manufacturing system, everything as a service, reconfigurable production, value network

Procedia PDF Downloads 329
20062 Combining Multiscale Patterns of Weather and Sea States into a Machine Learning Classifier for Mid-Term Prediction of Extreme Rainfall in North-Western Mediterranean Sea

Authors: Pinel Sebastien, Bourrin François, De Madron Du Rieu Xavier, Ludwig Wolfgang, Arnau Pedro

Abstract:

Heavy precipitation constitutes a major meteorological threat in the western Mediterranean. Research has investigated the relationship between the states of the Mediterranean Sea and the atmosphere and precipitation over short temporal windows. However, at a larger temporal scale, the precursor signals of heavy rainfall in the sea and atmosphere have drawn little attention. Moreover, despite ongoing improvements in numerical weather prediction, the medium-term forecasting of rainfall events remains a difficult task. Here, we aim to investigate the influence of early-spring environmental parameters on the following autumnal heavy precipitation. Hence, we develop a machine learning model to predict extreme autumnal rainfall with a 6-month lead time over the Spanish Catalan coastal area, based on i) the sea pattern (main current, LPC, and Sea Surface Temperature, SST) at the mesoscale, ii) four European weather teleconnection patterns (NAO, WeMo, SCAND, MO) at the synoptic scale, and iii) the hydrological regime of the main local river (the Rhône River). The accuracy of the developed classifier is evaluated via statistical analysis based on classification accuracy, logarithmic loss, and the confusion matrix, by comparison with rainfall estimates from rain gauges and satellite observations (CHIRPS-2.0). Sensitivity tests are carried out by changing the model configuration, such as the sea SST, sea LPC, river regime, and synoptic atmosphere configuration. The sensitivity analysis suggests a negligible influence of the hydrological regime, unlike SST, LPC, and specific teleconnection weather patterns. Finally, this study illustrates how public datasets can be integrated into a machine learning model for heavy rainfall prediction and may be of interest to local policy-makers for management purposes.
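
A minimal sketch of such a classifier pipeline (the feature set mirrors the predictors listed above, but the model choice, synthetic data, and variable names are illustrative assumptions, not the authors' setup):

```python
# Sketch: spring predictors -> binary "extreme autumn rainfall" label,
# evaluated with the metrics named in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, log_loss, confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200  # hypothetical number of samples
X = np.column_stack([
    rng.normal(size=n),   # spring SST anomaly (mesoscale)
    rng.normal(size=n),   # LPC main-current index
    rng.normal(size=n),   # NAO
    rng.normal(size=n),   # WeMo
    rng.normal(size=n),   # SCAND
    rng.normal(size=n),   # MO
    rng.normal(size=n),   # Rhone River discharge anomaly
])
# synthetic target: extremes loosely tied to SST and WeMo for illustration
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("log loss:", log_loss(y_te, clf.predict_proba(X_te)))
print(confusion_matrix(y_te, pred))
```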

Keywords: extreme hazards, sensitivity analysis, heavy rainfall, machine learning, sea-atmosphere modeling, precipitation forecasting

Procedia PDF Downloads 117
20061 A Lifeline Vulnerability Study of Constantine, Algeria

Authors: Mounir Ait Belkacem, Mehdi Boukri, Omar Amellal, Nacim Yousfi, Abderrahmane Kibboua, Med Naboussi Farsi, Mounir Naili

Abstract:

The north of Algeria is located in a seismic zone, so earthquakes are probably the most likely natural disaster that would lead to major lifeline disruption. The adequate operation of lifelines is vital for the economic development of regions under moderate to high seismic activity. After an earthquake, the proper operation of all vital systems is necessary, for instance, hospitals for medical attention to the wounded and highways for communication and assistance to victims. In this work, we apply the knowledge of pipeline vulnerability to the water supply system, the sanitary sewer (wastewater) pipelines, and the telephone network in Constantine (Algeria).

Keywords: lifeline, earthquake, vulnerability, pipelines

Procedia PDF Downloads 549
20060 Performance of the CALPUFF Dispersion Model for Investigating the Dispersion of Pollutants Emitted from an Industrial Complex, Daura Refinery, to an Urban Area in Baghdad

Authors: Ramiz M. Shubbar, Dong In Lee, Hatem A. Gzar, Arthur S. Rood

Abstract:

Air pollution is one of the biggest environmental problems in Baghdad, Iraq. The Daura refinery, located near the center of Baghdad, represents the largest industrial area and emits enormous amounts of pollutants; therefore, studying the gaseous pollutants and particulate matter is very important for the environment and for the health of the workers in the refinery and the people who live in the areas around it. Some studies investigated this area before, but they depended on the basic Gaussian equation implemented in simple computer programs; that kind of work was very useful and important at the time, but during the last two decades new large production units were added to the Daura refinery, such as PU_3 (Power unit_3 (Boiler 11&12)), CDU_1 (Crude Distillation unit_70000 barrel_1), and CDU_2 (Crude Distillation unit_70000 barrel_2). It is therefore necessary to use a new, advanced model to study air pollution in the region for the current years and to calculate the monthly emission rates of pollutants from the actual amounts of fuel consumed in each production unit; this may lead to accurate concentration values of the pollutants and of their dispersion or transport behavior in the study area. In this study, to the best of the authors' knowledge, the CALPUFF model was used and examined for the first time in Iraq. CALPUFF, an advanced non-steady-state meteorological and air quality modeling system, was applied to investigate the concentrations of the pollutants SO2, NO2, CO, and PM1-10μm in areas adjacent to the Daura refinery in the center of Baghdad. The CALPUFF modeling system includes three main components: CALMET (a diagnostic 3-dimensional meteorological model), CALPUFF (an air quality dispersion model), and CALPOST (a post-processing package), together with an extensive set of preprocessing programs produced to interface the model to standard, routinely available meteorological and geophysical datasets. The targets of this work are the modeling and simulation of the four pollutants (SO2, NO2, CO, and PM1-10μm) emitted from the Daura refinery within one year. Emission rates of these pollutants were calculated for twelve units comprising thirty plants and 35 stacks, using the monthly average fuel consumption of these production units. The performance of the CALPUFF model in this study was assessed to determine whether it is appropriate and yields predictions of good accuracy compared with the available pollutant observations. The CALPUFF model was investigated under three stability classes (stable, neutral, and unstable) to indicate the dispersion of the pollutants under different meteorological conditions. The simulations showed different kinds of dispersion of these pollutants in the region depending on the stability conditions and the environment of the study area; monthly and annual averages of the pollutants were used to view their dispersion in contour maps. High values of pollutants were noticed in the area; therefore, this study recommends further investigation and analysis of the pollutants, reducing the emission rates by using modern techniques and natural gas, increasing the stack heights of the units, and increasing the exit gas velocity from the stacks.
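
As an illustration of the emission-rate step, a monthly emission rate can be derived from fuel consumption as in the sketch below (the fuel figure and emission factor are hypothetical placeholders, not the refinery's data):

```python
# Estimate a mean monthly emission rate for one production unit from its
# fuel consumption and a pollutant emission factor.
FUEL_CONSUMED_T = 1500.0   # hypothetical monthly fuel oil use, tonnes
EF_SO2_KG_PER_T = 40.0     # hypothetical SO2 emission factor, kg per tonne

def monthly_emission_rate_g_per_s(fuel_t: float, ef_kg_per_t: float) -> float:
    """Convert a monthly emitted mass into a mean rate in g/s."""
    seconds_per_month = 30 * 24 * 3600
    return fuel_t * ef_kg_per_t * 1000.0 / seconds_per_month

print(f"SO2: {monthly_emission_rate_g_per_s(FUEL_CONSUMED_T, EF_SO2_KG_PER_T):.2f} g/s")
```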

Keywords: CALPUFF, Daura refinery, Iraq, pollutants

Procedia PDF Downloads 185
20059 Current Harvesting Methods for Jatropha curcas L.

Authors: Luigi Pari, Alessandro Suardi, Enrico Santangelo

Abstract:

In the last decade, Jatropha curcas L. (an oleaginous crop native to Central America and part of South America) has raised particular interest owing to its properties and uses. Its capsules may contain up to 40% oil and can be used as feedstock for biodiesel production. The harvesting phase is made difficult by the physiological traits of the species, because the fruits grow in bunches and do not ripen simultaneously. Three harvesting methodologies are currently in use, differing in the level of mechanization applied: manual picking, semi-mechanical harvesting, and mechanical harvesting. Manual picking is the most common in developing countries, but it is also the most time-consuming and inefficient. Mechanical harvesting carried out with modified grape harvesters has the highest productivity, but it is very costly as an initial investment and requires appropriate cultivation schemes. The semi-mechanical harvesting method uses shaker tools to facilitate fruit detachment. This system proved much cheaper than the fully mechanized one and quite flexible for small- and medium-scale applications, but it still requires adjustments to improve its productive performance. CRA-ING, within the European project Jatromed (http://www.jatromed.aua.gr), has carried out preliminary studies on the applicability of this approach, adapting an olive shaker to harvest Jatropha fruits. This work surveys the harvesting methods currently available for Jatropha, shows the pros and cons of each system, and highlights the criteria to be considered in choosing one over another. The harvesting of Jatropha curcas L. remains a major constraint on the spread of the species as an energy crop. The approach pursued by CRA-ING can be considered a good compromise between fully mechanized harvesters and exclusively manual intervention. It is an attempt to promote a sustainable mechanization suited to the social context of developing countries by encouraging the concrete involvement of local populations.

Keywords: Jatropha curcas, energy crop, harvesting, Central America, South America

Procedia PDF Downloads 375
20058 Cognitive Footprints: Analytical and Predictive Paradigm for Digital Learning

Authors: Marina Vicario, Amadeo Argüelles, Pilar Gómez, Carlos Hernández

Abstract:

In this paper, the Computer Research Network of the National Polytechnic Institute of Mexico proposes a paradigmatic model for the inference of cognitive patterns in digital learning systems. This model leads to a metadata architecture useful for analysis and prediction in online learning systems, especially MOOC architectures. The model is in the design phase and is expected to be tested through an institutional course project to be developed for a MOOC.

Keywords: cognitive footprints, learning analytics, predictive learning, digital learning, educational computing, educational informatics

Procedia PDF Downloads 465
20057 Use and Effects of Kanban Board from the Perspective of Brothers Furniture Limited

Authors: Kazi Rizvan, Yamin Rekhu

Abstract:

Due to high competitiveness in industries throughout the world, every industry is trying hard to utilize all its resources to keep productivity as high as possible. Many tools are used to ensure the smoother flow of an operation: to balance tasks, to maintain proper schedules and sequences for tasks, and to reduce unproductive time. All of these tools serve to augment productivity within an industry. The Kanban board is one of them, and one of the important tools of the lean production system. A Kanban board is a visual depiction of the status of tasks: it shows their actual status and conveys their progress and issues as well. Using a Kanban board, tasks can be distributed among workers, and operation targets can be represented to them visually. In this paper, an example of a Kanban board from Brothers Furniture Limited is examined: how the Kanban board system was implemented, how the board was designed, and how it was made easily understandable for less literate or illiterate workers. The Kanban board was designed for the packing section of Brothers Furniture Limited. It was implemented to represent the flow of tasks to the workers and to mitigate the time that was wasted while workers wondered which task to start after finishing one. The Kanban board comprised seven columns, including a column for comments on any problems that occurred while working on the tasks. The board was helpful to the workers because it showed the urgency of the tasks. It was also helpful for the store section, which could see which products, and how many of them, could be delivered to the store at any given time. The Kanban board centralized all the information, which sped up the workflow and minimized idle time. Although many workers were illiterate or less literate, the Kanban board was still comprehensible to them because the Kanban cards were colored. Since the significance of colors is easy to interpret, the colored cards helped a great deal, and the workers did not have to spend time wondering about the meaning of the cards. Even when the workers were not told the significance of the colored cards, they could develop a sense of their meaning, as colors readily trigger an understanding of the situation. As a result, the board made clear to the workers what was required of them, when to do it, and what to do next. The Kanban board reduced excessive time between tasks by setting a day plan for the targeted tasks, and it also reduced time during tasks, as the workers were informed of the forthcoming tasks for the day. Being very specific about the tasks, the Kanban board helped the workers stay focused and do their jobs more accurately. As a result, the Kanban board helped achieve an 8.75% increase in productivity compared to the productivity before it was implemented.
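
A minimal sketch of the board as a data structure (the column names and color coding are assumptions for illustration; the paper does not list them):

```python
# A seven-column Kanban board with colored cards, mirroring the layout
# described for the packing section.
from dataclasses import dataclass, field

COLUMNS = ["Backlog", "Planned", "In Progress", "Quality Check",
           "Packed", "To Store", "Comments"]  # hypothetical seven columns

@dataclass
class Card:
    task: str
    color: str  # e.g. "red" = urgent, "green" = normal (assumed coding)

@dataclass
class KanbanBoard:
    columns: dict = field(default_factory=lambda: {c: [] for c in COLUMNS})

    def add(self, column: str, card: Card) -> None:
        self.columns[column].append(card)

    def move(self, task: str, src: str, dst: str) -> None:
        card = next(c for c in self.columns[src] if c.task == task)
        self.columns[src].remove(card)
        self.columns[dst].append(card)

board = KanbanBoard()
board.add("Planned", Card("Pack chair set A", "red"))
board.move("Pack chair set A", "Planned", "In Progress")
```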

Keywords: color, Kanban board, lean tool, literacy, packing, productivity

Procedia PDF Downloads 221
20056 Weight Regulation Mechanism on Bridges

Authors: S. Siddharth, Saravana Kumar

Abstract:

Metros across the world tend to have a large number of bridges, and there have been concerns about the safety of these bridges. As the traffic in most cities in India is heterogeneous, trucks and heavy vehicles traverse our roads on an everyday basis, which leads to structural damage in the long run. All bridges are designed with a maximum load limit, and this limit is seldom checked. We have therefore come up with the idea of checking the load of every vehicle entering a bridge and blocking the bridge with barricades if the vehicle surpasses the maximum load; this is done to catch hold of the perpetrators (a minimal sketch of the check logic is given below). By doing this, we can avoid further structural damage and also provide an effective way to enforce the law. If our solution were put in place, structural damage and accidents would be reduced a great deal, and the job of law enforcement would also be made easier.
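
A minimal sketch of the proposed check, assuming a weigh-in-motion reading at the bridge entrance and a hypothetical design limit (neither value is given in the abstract):

```python
# Close the barricade when the measured vehicle load exceeds the
# bridge's design load limit.
MAX_LOAD_KG = 40_000  # hypothetical design load limit for the bridge

def barricade_should_close(vehicle_load_kg: float,
                           max_load_kg: float = MAX_LOAD_KG) -> bool:
    """Return True when an overweight vehicle must be stopped."""
    return vehicle_load_kg > max_load_kg

for load in (12_000, 38_500, 52_300):  # simulated weigh-in readings, kg
    action = "CLOSE barricade" if barricade_should_close(load) else "allow"
    print(f"vehicle load {load} kg -> {action}")
```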

Keywords: heterogeneous, structural, load, law, heavy, vehicles

Procedia PDF Downloads 433
20055 3-Dimensional Contamination Conceptual Site Model: A Case Study Illustrating the Multiple Applications of Developing and Maintaining a 3D Contamination Model during an Active Remediation Project on a Former Urban Gasworks Site

Authors: Duncan Fraser

Abstract:

A 3-dimensional (3D) conceptual site model was developed in the Leapfrog Works® platform, utilising a comprehensive historical dataset for a large former gasworks site in Fitzroy, Melbourne. The gasworks had been constructed across two fractured geological units with varying hydraulic conductivities: a Newer Volcanic (basaltic) outcrop covered approximately half of the site, overlying a fractured Melbourne Formation (siltstone) bedrock that outcropped over the remaining portion. During the investigative phase of works, a dense non-aqueous phase liquid (DNAPL) plume (coal tar) was identified within both geological units in the subsurface, originating from multiple sources, including gasholders, tar wells, condensers, and leaking pipework. The first stage of model development was undertaken to determine the horizontal and vertical extents of the coal tar in the subsurface and to assess the potential causal links between sources, plume location, and site geology. Concentrations of key contaminants of interest (COIs) were also interpolated within Leapfrog to refine the distribution of contaminated soils. The model was subsequently used to develop a robust soil remediation strategy and achieve endorsement from an Environmental Auditor. A change in project scope, following the removal and validation of the three former gasholders, necessitated the additional excavation of a significant volume of residual contaminated rock to allow for the future construction of two-storey underground basements. To assess the financial liabilities associated with the offsite disposal or thermal treatment of material, the 3D model was updated with three years of additional analytical data from the active remediation phase of works. Chemical concentrations and the residual tar plume within the rock fractures were modelled to pre-classify the in-situ material and enhance separation strategies, preventing the unnecessary treatment of material and reducing costs.

Keywords: 3D model, contaminated land, Leapfrog, remediation

Procedia PDF Downloads 118
20054 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Different literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given an adequate span of attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? The use of time series is required for the detection of abnormalities in the delays. The efficiency of the method is evaluated on measurements obtained from a real test platform: an office at the Grenoble Institute of Technology equipped with 30 sensors.
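
A minimal sketch of one plausible reading of the approach (assumed, not the authors' algorithm): derive a per-sensor delay threshold from the sensor's own inter-sample delays and flag longer delays as data gaps:

```python
import pandas as pd

# hypothetical irregular timestamps for one sensor
t = pd.Series(pd.to_datetime([
    "2024-01-01 00:00", "2024-01-01 00:10", "2024-01-01 00:21",
    "2024-01-01 00:30", "2024-01-01 02:05", "2024-01-01 02:15",
]))
delays = t.diff().dropna()

# automatic per-sensor threshold, e.g. a high quantile of observed delays
threshold = delays.quantile(0.95)

for i, d in delays.items():
    if d > threshold:
        print(f"data gap of {d} detected, ending at {t[i]}")
```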

Keywords: building system, time series, diagnosis, outliers, delay, data gap

Procedia PDF Downloads 234
20053 Investigation of Some Flotation Parameters and the Role of Dispersants in the Flotation of Chalcopyrite

Authors: H. A. Taner, V. Önen

Abstract:

A suitable choice of flotation parameters and reagents has a strong effect on the effectiveness of the flotation process. The objective of this paper is to give an overview of the flotation of chalcopyrite under different conditions and with different dispersants. Flotation parameters such as grinding time, pH, and the type and dosage of dispersant were investigated. In order to understand the interaction of the dispersants, sodium silicate, sodium hexametaphosphate, and sodium polyphosphate were used. The optimum results were obtained at a pH of 11.5 and a grinding time of 10 minutes. Under optimum rougher flotation conditions with sodium silicate, a copper concentrate was produced assaying 29.85% CuFeS2 with a flotation recovery of 65.97%.

Keywords: chalcopyrite, dispersant, flotation, reagent

Procedia PDF Downloads 173
20052 Effect of Cement Amount on California Bearing Ratio Values of Different Soils

Authors: Ayse Pekrioglu Balkis, Sawash Mecid

Abstract:

Due to the continued growth and rapid development of road construction worldwide, and because road sub-layers consist of soil layers, the identification and recognition of the type of soil and of soil behavior under different conditions help us select soil according to specifications and engineering characteristics and, if necessary, stabilize the soil and treat undesirable properties by adding materials such as bitumen, lime, cement, etc. If the soil beneath the road does not conform to the standards, construction will need more time: a large part of the soil must be removed, transported, and sometimes deposited, and then purchased sand and gravel are transported to the site, filled to full depth, and compacted. Stabilization with cement or other treatments gives an opportunity to use the existing soil as a base material instead of removing it and purchasing and transporting better fill materials. Classification of soil according to the AASHTO system and the USCS helps engineers anticipate soil behavior and select the best treatment method. In this study, soil classification and the relation between soil classification and stabilization method are discussed; cement stabilization at different percentages was selected for soil treatment based on NCHRP. There are different parameters to define the strength of soil; in this study, the CBR is used. Cement at 0%, 3%, 7%, and 10% was added to the soil to evaluate the effect of the added cement on the CBR of the treated soil. Implementing the stabilization process with different cement contents helps engineers select an economic cement amount for the stabilization process according to the project specifications and characteristics. The stabilization process at optimum moisture content (OMC) and the mixing rate affect the strength of the soil; laboratory tests and field construction operations were performed to observe the improvement in strength and plasticity. Cement stabilization is quicker than a universal method such as removing and replacing field soils. Cement addition increased the CBR values of the different soil types by 22-69%.

Keywords: California Bearing Ratio, cement stabilization, clayey soil, mechanical properties

Procedia PDF Downloads 382
20051 A Current Problem for Steel Bridges: Fatigue Assessment of Seams' Repair

Authors: H. Pasternak, A. Chwastek

Abstract:

The paper describes the results of a research project on the repair of welds. The repair was carried out by grinding out the flawed seams and re-welding them. The main task was to determine the FAT classes of the original state and of the state after seam repair according to the assessment procedures, namely the nominal, structural, and effective notch stress approaches. The first part presents the results of the tests; the second part contains the numerical analysis and the evaluation of the results to determine the fatigue strength classes according to the three assessment procedures.

Keywords: cyclic loading, fatigue crack, post-weld treatment, seams’ repair

Procedia PDF Downloads 248
20050 Control of Oil Content of Fried Zucchini Slices by Partial Predrying and Process Optimization

Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner

Abstract:

The main concern about deep-fat-fried food materials is their high final oil content, absorbed during the frying process and/or the cooling period, since a diet including a high content of oil is considered unhealthy by consumers. Different methods have been evaluated to decrease the oil content of fried foodstuffs. One promising method is partial drying of the food material before frying. In the present study, the aim was to control and decrease the final oil content of zucchini slices by means of partial drying and to optimize the process conditions. Conventional oven drying was used to decrease the moisture content of the zucchini slices to a certain extent. Process performance in terms of oil uptake was evaluated by comparing the oil content of pre-dried and then fried zucchini slices with that determined for directly fried ones. For the predrying and frying processes, the controlled variables were oven temperature and weight loss, and frying oil temperature and time, respectively. Zucchini slices were also directly fried for sensory evaluations revealing the preferred properties of the final product in terms of surface color, moisture content, texture, and taste. These properties of the directly fried zucchini slices taking the highest score at the end of the sensory evaluation were determined and used as targets in the optimization procedure. Response surface methodology (RSM) was used for process optimization: the properties determined in the sensory evaluation were selected as targets, while the oil content was to be minimized. Results indicated that the final oil content of zucchini slices could be reduced from 58% to 46% by controlling the conditions of the predrying and frying processes. As a result, it is suggested that predrying could be one option for reducing the oil content of fried zucchini slices for a healthy diet. This project (113R015) has been supported by TUBITAK.
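
A minimal sketch of the RSM step (an assumed formulation with synthetic data; the factors, levels, and coefficients are illustrative, not the study's): fit a full second-order model to the response by least squares:

```python
# Fit a quadratic response surface for oil content as a function of two
# coded factors (e.g. predrying weight loss x1 and frying time x2).
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, 30)   # coded predrying weight loss
x2 = rng.uniform(-1, 1, 30)   # coded frying time
# synthetic oil-content response (%), for illustration only
y = 50 - 4*x1 + 2*x2 + 1.5*x1*x2 + 3*x1**2 + rng.normal(0, 0.5, 30)

# design matrix for the full second-order model
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```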

Keywords: health process, optimization, response surface methodology, oil uptake, conventional oven

Procedia PDF Downloads 359
20049 The Overseas Promotion of National Identity by France and Japan for Global Outreach: A Comparative and Discursive Analysis of Their Narratives on Public Diplomacy since the End of the Cold War

Authors: Natsuko D'Aprile

Abstract:

The construction of Nation-States is a historical process that produces a type of national identity and culture that States nowadays mobilise for global outreach. National culture, as a set of norms and values influencing individuals' actions and decisions, produces a type of policy-making of various strategies that shape how a Nation is promoted overseas. The 1990s were marked by a resurgence of debates on national identity; this period is believed to have paved the way for nationalism and witnessed increased attention to analytical approaches to identity. Public diplomacy is a concrete example of how national culture is mobilised to project a favourable image of a Nation abroad, especially through the narratives on national identity mobilised by diplomatic actors. Public diplomacy is understood as providing tools for States to build and project strategic narratives that represent events and identities in an attempt to influence audiences, be they domestic or foreign. France and Japan have received little attention on the matter. This research hence aims to investigate how France and Japan have mobilised narratives on national identity since the 1990s in the context of their public diplomacy. To understand how identities are framed, qualitative and quantitative discourse analysis was performed on a corpus of speeches delivered by French and Japanese political actors in which they present their diplomacy goals, as well as official documents provided by both Ministries of Foreign Affairs. This analysis showed that the French discourse integrates a narrative on France's universal vocation, relying on the expression of a Nation whose model is universally applicable and which has the legitimacy to influence international decisions. The Japanese discourse does not concretely emphasise Japanese or Asian values, except for some narratives integrating Confucian and Shintō values; it rather revolves around the need for Japan to ensure its citizens' security and prosperity, hence the need for the Government to contribute to peace in the Asia-Pacific region and the world.

Keywords: comparative politics, culture, discourse analysis, narratives, public diplomacy

Procedia PDF Downloads 66
20048 Modification of Rubber Swab Tool with Brush to Reduce Rubber Swab Fraction Fishing Time

Authors: T. R. Hidayat, G. Irawan, F. Kurniawan, E. H. I. Prasetya, Suharto, T. F. Ridwan, A. Pitoyo, A. Juniantoro, R. T. Hidayat

Abstract:

Swabbing is an activity in which fluid is lifted from inside the well using a sand line, either to determine the fluid influx after perforation or to lower the fluid level so as to create a difference between the formation pressure and the hydrostatic pressure in the well for underbalanced perforation. During swab activities, problems frequently occur with the rubber swab: it often breaks and becomes a fish inside the well. This rubber swab fishing activity makes the rig operation take longer, delays the swab result data, and creates potential losses of well operation for the company. The average time needed for fishing the rubber swab fractions plus the swab work is 42 hours. The innovation made for this problem is a modification of the rubber swab tool. The tool is modified by providing a series of brushes at its end with a threaded connection in order to improve work safety; when the rubber swab breaks, the broken swab is lifted by the brush underneath, which reduces the time lost on rubber swab fishing. This tool has been applied, and it is proven that with this modification the rig operation becomes more efficient, because no rubber swab fishing activity needs to be carried out: the fish fractions of the rubber swab are lifted to the surface, saving fuel costs and preserving the well production potential. The average time to do the swab work after the application of this modified tool is 8 hours.

Keywords: rubber swab, swab modification, brush, rubber swab fishing, cost saving

Procedia PDF Downloads 159
20047 A Team-Based Learning Game Guided by a Social Robot

Authors: Gila Kurtz, Dan Kohen Vacs

Abstract:

Social robotics (SR) is an emerging field striving to deploy computers capable of resembling human shapes and mimicking human movements, gestures, and behaviors. The evolving capability of social robots to interact with humans offers groundbreaking opportunities for learning and training. Studies show that social robots can offer instructional experiences that foster creativity, entertainment, enjoyment, and curiosity. These added values are essential for empowering instructional opportunities as gamified learning experiences. We present our project, focused on an activity to be experienced in an escape room and aimed at team-based learning scaffolded by a social robot, NAO. An escape room is a well-known approach for gamified activities focused on a simulated scenario experienced by team-based participants. Usually, the simulation takes place in a physical environment where participants must complete a series of challenges in a limited amount of time; during this experience, players learn something about the assigned topic of the room. In the current learning simulation, students must 'save the nation' by locating sensitive information that has been stolen and stored in a vault with four locks. Team members have to look for hints and solve riddles mediated by NAO. Each solution provides a unique code for opening one of the four locks. NAO is also used to provide ongoing feedback on the team's performance. We recorded the proceedings of our activity and used the recording to conduct an evaluation study among ten experts in related areas. The experts were interviewed on their overall assessment of the learning activity and their perception of the added value of the robot. The results were very encouraging regarding the feasibility of NAO serving as a motivational tutor in adults' collaborative game-based learning. We believe that this study marks the first step toward a template for developing innovative team-based training using escape rooms supported by a humanoid robot.

Keywords: social robot, NAO, learning, team-based activity, escape room

Procedia PDF Downloads 59
20046 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a good amount of computational time and memory space with traditional approaches. In our approach, we have explored the concurrent computational ability of general-purpose graphics processing units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated by the inter-frame variations of these corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and the Quadro K2000. The results are encouraging, with a maximum speedup of 10X compared to the sequential approach.
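
A minimal sketch of GMM background subtraction using a standard library routine (OpenCV's MOG2; note this is the stock per-pixel GMM, not the authors' adaptive weighted-kernel variant, and the input file name is a placeholder):

```python
import cv2

cap = cv2.VideoCapture("input.avi")  # hypothetical input video
mog2 = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                          detectShadows=False)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = mog2.apply(frame)               # per-pixel GMM update + classify
    fg = cv2.bitwise_and(frame, frame, mask=fg_mask)
    cv2.imshow("foreground", fg)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```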

Keywords: connected components, embrace threads, local weighted kernel, structuring elements

Procedia PDF Downloads 423
20045 Genetically Encoded Tool with Time-Resolved Fluorescence Readout for the Calcium Concentration Measurement

Authors: Tatiana R. Simonyan, Elena A. Protasova, Anastasia V. Mamontova, Eugene G. Maksimov, Konstantin A. Lukyanov, Alexey M. Bogdanov

Abstract:

Here, we describe two variants of calcium indicators based on the GCaMP sensitive core and the BrUSLEE fluorescent protein (GCaMP-BrUSLEE and GCaMP-BrUSLEE-145). In contrast to the conventional GCaMP6-family indicators, these fluorophores are characterized by a well-marked responsiveness of their fluorescence decay kinetics to the external calcium concentration, both in vitro and in cellulo. Specifically, we show that purified GCaMP-BrUSLEE and GCaMP-BrUSLEE-145 exhibit three-component fluorescence decay kinetics, with the amplitude-normalized lifetime component (τ₃·A₃) of GCaMP-BrUSLEE-145 changing four-fold (500-2000 a.u.) in response to a Ca²⁺ concentration shift in the range of 0-350 nM. Time-resolved fluorescence microscopy of live cells displays a two-fold change of the GCaMP-BrUSLEE-145 mean lifetime upon histamine-stimulated calcium release. The aforementioned Ca²⁺ dependence calls for considering GCaMP-BrUSLEE-145 as a prospective Ca²⁺ indicator with signal read-out in the time domain.
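
For reference, the three-component decay model invoked above has the standard form (generic symbols, not the authors' notation):

```latex
I(t) = A_1\,e^{-t/\tau_1} + A_2\,e^{-t/\tau_2} + A_3\,e^{-t/\tau_3},
\qquad
\langle \tau \rangle = \frac{A_1\tau_1 + A_2\tau_2 + A_3\tau_3}{A_1 + A_2 + A_3},
```

with the amplitude-weighted term τ₃·A₃ of the third component serving as the calcium-responsive read-out and ⟨τ⟩ as the amplitude-weighted mean lifetime.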

Keywords: calcium imaging, fluorescence lifetime imaging microscopy, fluorescent proteins, genetically encoded indicators

Procedia PDF Downloads 140
20044 Removal of Vanadium from Industrial Effluents by Natural Ion Exchanger

Authors: Shashikant R. Kuchekar, Haribhau R. Aher, Priti M. Dhage

Abstract:

The removal of vanadium from aqueous solution using a natural exchanger was investigated. The effects of pH, contact time, and exchanger dose were studied at ambient temperature (25 °C ± 2 °C). The equilibrium process was described by the Langmuir isotherm model with an adsorption capacity for vanadium. The natural exchanger, i.e. tamarind (Tamarindus indica) seed powder, was treated with formaldehyde and sulfuric acid to increase its adsorptivity for metals. The maximum exchange level attained was 80.1% at pH 3 with an exchanger dose of 5 g and a contact time of 60 min. The method was applied to the removal of vanadium from industrial effluents.
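
For reference, the Langmuir model named above takes the standard form (generic symbols):

```latex
q_e = \frac{q_{\max}\,K_L\,C_e}{1 + K_L\,C_e},
```

where q_e is the amount of vanadium adsorbed at equilibrium, C_e the equilibrium concentration in solution, q_max the monolayer adsorption capacity, and K_L the Langmuir constant.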

Keywords: industrial effluent, natural ion exchange, Tamarindus indica, vanadium

Procedia PDF Downloads 238
20043 From Biowaste to Biobased Products: Life Cycle Assessment of VALUEWASTE Solution

Authors: Andrés Lara Guillén, José M. Soriano Disla, Gemma Castejón Martínez, David Fernández-Gutiérrez

Abstract:

The worldwide population is increasing exponentially, which causes a rising demand for food, energy, and non-renewable resources. These demands must be addressed from a circular economy point of view. Under this approach, obtaining strategic products from biowaste is crucial if society is to keep its current lifestyle while reducing the environmental and social issues linked to the linear economy. This is the main objective of the VALUEWASTE project. VALUEWASTE is about valorizing urban biowaste into proteins for food and feed and into biofertilizers, closing the loop of this waste stream. In order to achieve this objective, the project validates three value chains, which begin with the anaerobic digestion of the biowaste. From the anaerobic digestion, three by-products are obtained: i) methane, which is used by microorganisms that are then transformed into microbial proteins; ii) digestate, which is used by the black soldier fly, producing insect proteins; and iii) a nutrient-rich effluent, which is transformed into biofertilizers. VALUEWASTE is an innovative solution that combines different technologies to valorize biowaste entirely. However, it is also necessary to demonstrate that the solution is greener than traditional technologies (the baseline systems). On one hand, the proteins from microorganisms and insects are compared with reference protein production systems (gluten, whey, and soybean). On the other hand, the biofertilizers are compared to the production of mineral fertilizers (ammonium sulphate and synthetic struvite). Therefore, the aim of this study is to show that biowaste valorization can reduce the environmental impacts linked to both traditional protein manufacturing processes and mineral fertilizers, not only at pilot scale but also at industrial scale. In the present study, both the baseline systems and the VALUEWASTE solution are evaluated through Environmental Life Cycle Assessment (E-LCA), based on the ISO 14040 and 14044 standards; the Environmental Footprint methodology was used to evaluate the environmental impacts. The results for the baseline cases show that food proteins from whey have the highest environmental impact on ecosystems compared to the other protein sources: 7.5 and 15.9 times higher than soybean and gluten, respectively. Comparing feed soybean and gluten, soybean has an environmental impact on human health 195.1 times higher. In the case of the fertilizers, synthetic struvite has higher impacts than ammonium sulphate: 15.3-fold (ecosystems) and 11.8-fold (human health), respectively. These results will be used as a reference to demonstrate the better environmental performance of the bio-based products obtained through the VALUEWASTE solution. The E-LCA performed in the VALUEWASTE project also has direct implications for investment and policy. On one hand, better environmental performance, backed by the E-LCA, will serve to remove the barriers linked to these kinds of technologies and boost investment. On the other hand, it will be a seed for designing new policies fostering these types of solutions to achieve two key targets of the European Community: being self-sustainable and carbon neutral.

Keywords: anaerobic digestion, biofertilizers, circular economy, nutrients recovery

Procedia PDF Downloads 81
20042 An Aptasensor Based on Magnetic Relaxation Switch and Controlled Magnetic Separation for the Sensitive Detection of Pseudomonas aeruginosa

Authors: Fei Jia, Xingjian Bai, Xiaowei Zhang, Wenjie Yan, Ruitong Dai, Xingmin Li, Jozef Kokini

Abstract:

Pseudomonas aeruginosa is a Gram-negative, aerobic, opportunistic human pathogen that is present in soil, water, and food. This microbe has been recognized as a representative food-borne spoilage bacterium that can lead to many types of infections. Considering the casualties and property loss caused by P. aeruginosa, the development of a rapid and reliable technique for its detection is crucial. The whole-cell aptasensor, an emerging biosensor that uses an aptamer as a capture probe to bind to the whole cell, has attracted much attention for food-borne pathogen detection due to its convenience and high sensitivity. Here, a low-field magnetic resonance imaging (LF-MRI) aptasensor for the rapid detection of P. aeruginosa was developed. The basic detection principle of the magnetic relaxation switch (MRSw) nanosensor lies in the 'T₂-shortening' effect of magnetic nanoparticles in NMR measurements. Briefly, the transverse relaxation time (T₂) of neighboring water protons is shortened when magnetic nanoparticles cluster, due to cross-linking upon the recognition and binding of biological targets, or simply when the concentration of the magnetic nanoparticles increases. Such shortening is related to both the state change (aggregation or dissociation) and the concentration change of the magnetic nanoparticles and can be detected using NMR relaxometry or MRI scanners. In this work, two different sizes of magnetic nanoparticles, 10 nm (MN₁₀) and 400 nm (MN₄₀₀) in diameter, were first immobilized separately with anti-P. aeruginosa aptamer through 1-Ethyl-3-(3-dimethylaminopropyl) carbodiimide (EDC)/N-hydroxysuccinimide (NHS) chemistry, to capture and enrich the P. aeruginosa cells. When incubated with the target, a 'sandwich' (MN₁₀-bacteria-MN₄₀₀) complex is formed, driven by the binding of MN₄₀₀ to P. aeruginosa through aptamer recognition, as well as the aggregation of the MN₁₀ conjugates on the surface of P. aeruginosa. Due to the different magnetic performance of MN₁₀ and MN₄₀₀ in a magnetic field, caused by their different saturation magnetization, the MN₁₀-bacteria-MN₄₀₀ complex, as well as the unreacted MN₄₀₀ in the solution, can be quickly removed by magnetic separation, so that only unreacted MN₁₀ remain in the solution. The remaining MN₁₀, which are superparamagnetic and stable in a low-field magnetic field, serve as the signal readout for the T₂ measurement. Under optimum conditions, the LF-MRI platform provides both image analysis and quantitative detection of P. aeruginosa, with a detection limit as low as 100 cfu/mL. The feasibility and specificity of the aptasensor are demonstrated by detecting real food samples and validated by plate counting methods. With only two steps and less than 2 hours needed for the detection procedure, this robust aptasensor can detect P. aeruginosa over a wide linear range from 3.1 × 10² cfu/mL to 3.1 × 10⁷ cfu/mL, which is superior to the conventional plate counting method and other molecular biology assays. Moreover, the aptasensor has the potential to detect other bacteria or toxins by switching to suitable aptamers. Considering its excellent accuracy, feasibility, and practicality, the whole-cell aptasensor provides a promising platform for the quick, direct, and accurate determination of food-borne pathogens at the cell level.
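
For reference, the T₂ read-out rests on the standard relaxivity relation (generic form; the specific relaxivity values of the particles are not given in the abstract):

```latex
\frac{1}{T_2} = \frac{1}{T_{2,0}} + r_2\,C,
```

where T_{2,0} is the relaxation time of the matrix without particles, C the particle (iron) concentration, and r_2 the transverse relaxivity; clustering of the particles raises the effective r_2, which is the 'T₂-shortening' exploited by MRSw sensors.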

Keywords: magnetic resonance imaging, meat spoilage, P. aeruginosa, transverse relaxation time

Procedia PDF Downloads 136
20041 A Mathematical Model of Blood Perfusion Dependent Temperature Distribution in Transient Case in Human Dermal Region

Authors: Yogesh Shukla

Abstract:

Many attempts have been made to study the temperature distribution problem in human tissues under normal environmental and physiological conditions at constant arterial blood temperature. But very few attempts have been made to investigate the temperature distribution in human tissues under varying arterial blood temperature. In view of the above, a finite element model has been developed for the unsteady temperature distribution in the dermal region of the human body. The model has been developed for the one-dimensional unsteady-state case. The variation of parameters like thermal conductivity, blood mass flow, and metabolic activity with respect to position and time has been incorporated in the model. Appropriate boundary conditions have been framed. The central difference approach has been used for the space variable, and the trapezoidal rule has been employed along the time variable. Numerical results have been obtained to study the relationship between temperature and time.
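
For reference, such dermal models are typically built on a one-dimensional Pennes-type bioheat equation (a standard form consistent with the parameters listed above; the author's exact formulation is not given in the abstract):

```latex
\rho c\,\frac{\partial T}{\partial t}
= \frac{\partial}{\partial x}\!\left(K(x,t)\,\frac{\partial T}{\partial x}\right)
+ m_b c_b\,\bigl(T_A(t) - T\bigr) + S(x,t),
```

where K is the thermal conductivity, m_b the blood mass flow rate, c_b the specific heat of blood, T_A the arterial blood temperature (here allowed to vary in time), and S the metabolic heat generation.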

Keywords: rate of metabolism, blood mass flow rate, thermal conductivity, heat generation, finite element method

Procedia PDF Downloads 343
20040 On the Dwindling Supply of the Observable Cosmic Microwave Background Radiation

Authors: Jia-Chao Wang

Abstract:

The cosmic microwave background radiation (CMB) freed during the recombination era can be considered as a photon source of small duration: a one-time event that happened everywhere in the universe simultaneously. If space is divided into concentric shells centered at an observer's location, one can imagine that the CMB photons originating from the nearby shells would reach and pass the observer first, and those in shells farther away would follow as time goes forward. In the Big Bang model, space expands rapidly in a time-dependent manner, as described by the scale factor. This expansion results in an event horizon coincident with one of the shells, and its radius can be calculated using cosmological calculators available online. Using Planck 2015 results, its value during the recombination era at cosmological time t = 0.379 million years (My) is calculated to be Revent = 56.95 million light-years (Mly). The event horizon sets a boundary beyond which the freed CMB photons will never reach the observer. The photons within the event horizon also exhibit a peculiar behavior. Calculated results show that the CMB observed today was freed in a shell located 41.8 Mly away (inside the boundary set by Revent) at t = 0.379 My. These photons traveled 13.8 billion years (Gy) to reach here. Similarly, the CMB reaching the observer at t = 1, 5, 10, 20, 40, 60, 80, 100, and 120 Gy is calculated to have originated in shells at R = 16.98, 29.96, 37.79, 46.47, 53.66, 55.91, 56.62, 56.85, and 56.92 Mly, respectively. The results show that as time goes by, the R value approaches Revent = 56.95 Mly but never exceeds it, consistent with the earlier statement that the freed CMB photons beyond Revent will never reach the observer. The difference Revent − R can be used as a measure of the remaining observable CMB photons. Its value becomes smaller and smaller as R approaches Revent, indicating a dwindling supply of the observable CMB radiation. In this paper, detailed dwindling effects near the event horizon are analyzed with the help of online cosmological calculators based on the lambda cold dark matter (ΛCDM) model. It is demonstrated in the literature that, assuming the CMB to be a blackbody at recombination (about 3000 K), it will remain so over time under cosmological redshift and homogeneous expansion of space, but with the temperature lowered (2.725 K now). The present result suggests that the observable CMB photon density, besides changing with the expansion of space, can also be affected by the dwindling supply associated with the event horizon. This raises the question of whether the blackbody spectrum of the CMB at recombination can remain so over time. Being able to explain the blackbody nature of the observed CMB is an important part of the success of the Big Bang model. The present results cast some doubt on that and suggest that the model may have an additional challenge to deal with.
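
For reference, the event horizon radius used above is the standard proper distance (the definition implemented by such ΛCDM calculators):

```latex
R_{\mathrm{event}}(t) = a(t)\int_{t}^{\infty}\frac{c\,dt'}{a(t')},
```

where a(t) is the scale factor; evaluating it at the recombination era, t = 0.379 My, gives the quoted Revent = 56.95 Mly.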

Keywords: blackbody of CMB, CMB radiation, dwindling supply of CMB, event horizon

Procedia PDF Downloads 110