Search results for: dynamic energy model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25224

3384 Economic and Environmental Impact of the Missouri Grazing Schools

Authors: C. A. Roberts, S. L. Mascaro, J. R. Gerrish, J. L. Horner

Abstract:

Management-intensive Grazing (MiG) is a practice that rotates livestock through paddocks in a way that best matches the nutrient requirements of the animal to the yield and quality of the pasture. In the USA, MiG has been taught to livestock producers throughout the state of Missouri in 2- and 3-day workshops called “Missouri Grazing Schools.” The economic impact of these schools was quantified using IMPLAN software. The model included hectares of adoption, animal performance, carrying capacity, and input costs. To date, MiG, as taught in the Missouri Grazing Schools, has been implemented on more than 70,000 hectares in Missouri. The economic impact of these schools is presently $125 million USD per year added to the state economy. This magnitude of impact is the result not only of widespread adoption but also of increased livestock carrying capacity; in Missouri, a capacity increase of 25 to 30% has been well documented. Additional impacts include improved forage quality and reduced feed and fertilizer costs. The environmental impact of MiG in the state of Missouri is currently being estimated. The environmental assessment takes into account the reduction in the application of commercial fertilizers; in MiG systems, nitrogen is supplied by N fixation from legumes, and much of the P and K is recycled naturally by well-distributed manure. It also estimates carbon sequestration and methane production; MiG can increase carbon sequestration and reduce methane production in comparison to default grazing practices and feedlot operations in the USA.

Keywords: agricultural education, forage quality, management-intensive grazing, nutrient cycling, stock density, sustainable agriculture

Procedia PDF Downloads 199
3383 Critical Factors Affecting the Implementation of Total Quality Management in the Construction Industry in U. A. E.

Authors: Firas Mohamad Al-Sabek

Abstract:

The purpose of this paper is to examine the most critical factors affecting the implementation of Total Quality Management (TQM) in the construction industry in the United Arab Emirates, as well as the project outcome most affected by implementing TQM. A framework is also proposed based on the literature. The method used is a quantitative study: a survey of 15 questions was distributed to a sample of 60 respondents in a construction company in Abu Dhabi, examining the most critical factor affecting the implementation of TQM and the project outcome most affected by it. The survey showed that management commitment is the most important factor in implementing TQM in a construction company, and that project cost is the outcome most affected by the implementation of TQM. Management commitment is essential for implementing TQM in any company; if management loses interest in quality, then everyone in the organization will do so, and the success of TQM will depend mostly on the top of the pyramid. Cost is reduced and money is saved when the project team implements TQM, while a project without quality measures will suffer commercial failure. Based on the literature, more factors can be examined and added to the model. In addition, more construction companies could be surveyed to obtain more accurate results, and the study could be conducted outside the United Arab Emirates for further enhancement.

Keywords: construction project, total quality management, management commitment, cost, theoretical framework

Procedia PDF Downloads 418
3382 Investigating the Pedestrian Willingness to Pay to Choose Appropriate Policies for Improving the Safety of Pedestrian Facilities

Authors: Babak Mirbaha, Mahmoud Saffarzadeh, Fatemeh Mohajeri

Abstract:

Road traffic accidents lead to high rates of death and injury, especially among vulnerable road users such as pedestrians. Improving the safety of pedestrian facilities is a major concern for policymakers because of the high number of pedestrian fatalities and the direct and indirect costs imposed on society. This study focuses on determining pedestrians' willingness to pay for increased safety while crossing the street. Three different scenarios are presented: crossing the street at a zebra crossing, crossing at a zebra crossing with a pedestrian traffic light installed, and crossing via a pedestrian bridge with an escalator. The research was conducted using the stated preferences method. The required data were collected from a questionnaire consisting of three parts: pedestrians' demographic characteristics, travel characteristics, and the scenarios. Four payment amounts are presented for each scenario, and a logit model is built for each proposed payment. The results show that sex, age, education, average household income, and individual salary have a significant effect on scenario choice. Among the policies presented in the questionnaire scenarios, crossing at a zebra crossing with a traffic light installed is the most frequently chosen, with a willingness to pay of 10,000 Rials, while crossing at a zebra crossing alone, with a willingness to pay of 100,000 Rials, has the least frequency. For all scenarios, as the payment amount increases, the willingness to pay decreases.
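
To make the modelling step concrete, the following is a minimal sketch of a binary logit of payment acceptance, written in Python with synthetic data and hypothetical variable names; it is not the authors' model or survey data, only an illustration of the technique.

```python
# Hedged sketch of a binary logit for willingness to pay; the columns,
# coefficients and data below are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "sex": rng.integers(0, 2, n),             # 0 = male, 1 = female
    "age": rng.integers(18, 70, n),
    "income": rng.normal(30, 10, n),          # household income, illustrative units
    "payment": rng.choice([10_000, 25_000, 50_000, 100_000], n),  # Rials
})
# Synthetic choice: acceptance probability falls as the payment rises
utility = 1.5 - 0.00003 * df["payment"] + 0.02 * df["income"] - 0.01 * df["age"]
df["accept"] = (rng.random(n) < 1 / (1 + np.exp(-utility))).astype(int)

X = sm.add_constant(df[["sex", "age", "income", "payment"]])
model = sm.Logit(df["accept"], X).fit(disp=False)
print(model.summary())  # a negative 'payment' coefficient mirrors the reported trend
```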

Keywords: pedestrians, willingness to pay, safety, immunization

Procedia PDF Downloads 153
3381 Information Retrieval from Internet Using Hand Gestures

Authors: Aniket S. Joshi, Aditya R. Mane, Arjun Tukaram

Abstract:

In the 21st century, the era of the e-world, people are continuously updated with daily information such as weather conditions, news, stock market updates, new projects, cricket scores, and other sports. When busy, they want this information with minimal use of the keyboard and mouse and minimal time. Today, to get such information, users have to repeat the same mouse and keyboard actions, which takes time and is inconvenient. In India, owing to rural backgrounds, many people are also not very familiar with using computers and the internet. Likewise, in small clinics, small offices, hotels, and airports, there should be a system that retrieves daily information with minimal keyboard and mouse actions. We plan to design an application-based project that can easily retrieve information with minimal keyboard and mouse use and make the task more convenient and easier. This is possible with an image-processing application that captures real-time hand gestures, matches them against the system, and retrieves the corresponding information. In this project, real-time hand gesture movements are used to select the required option, stored on the screen in the form of RSS feeds; once a gesture selects an option, the corresponding information is popped up for the user. Real-time hand gestures make the application handier and easier to use.
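
As an illustration of the HSV-segmentation and blob-detection pipeline the abstract describes, the sketch below uses OpenCV in Python; the skin-tone thresholds, file name, and option mapping are assumptions, not the authors' implementation.

```python
# Illustrative HSV segmentation + blob detection for a single frame;
# threshold values are hypothetical and must be tuned per camera/lighting.
import cv2
import numpy as np

frame = cv2.imread("frame.jpg")                  # one captured camera frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)     # HSV separates color from brightness

# Rough skin-tone range in HSV (assumed values)
lower, upper = np.array([0, 40, 60]), np.array([25, 180, 255])
mask = cv2.inRange(hsv, lower, upper)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# Treat the largest connected blob as the hand
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    hand = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(hand)
    # Map the hand's horizontal position to one of four on-screen RSS options
    option = min(3, int(4 * (x + w / 2) / frame.shape[1]))
    print("selected option:", option)
```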

Keywords: hand detection, hand tracking, hand gesture recognition, HSV color model, blob detection

Procedia PDF Downloads 284
3380 Protective Role of Autophagy Challenging the Stresses of Type 2 Diabetes and Dyslipidemia

Authors: Tanima Chatterjee, Maitree Bhattacharyya

Abstract:

The global challenge of type 2 diabetes mellitus is a major health concern in this millennium, and researchers are continuously exploring new targets to develop novel therapeutic strategies. Type 2 diabetes mellitus (T2DM) is often coupled with dyslipidemia, increasing the risk of cardiovascular (CVD) complications. Enhanced oxidative and nitrosative stresses appear to be the major risk factors underlying insulin resistance, dyslipidemia, β-cell dysfunction, and T2DM pathogenesis. Autophagy emerges as a promising defense mechanism against stress-mediated cell damage, regulating tissue homeostasis, cellular quality control, and energy production, and promoting cell survival. In this study, we have attempted to explore the pivotal role of autophagy in T2DM subjects with or without dyslipidemia in peripheral blood mononuclear cells (PBMCs) and insulin-resistant HepG2 cells, utilizing a flow cytometric platform, confocal microscopy, and molecular biology techniques such as western blotting, immunofluorescence, and real-time polymerase chain reaction. In T2DM with dyslipidemia, a higher population of autophagy-positive cells was detected compared to patients with T2DM only, which might result from higher stress. Autophagy was observed to be triggered both by oxidative and nitrosative stress, a novel finding of our research. LC3 puncta were observed in PBMCs and in the periphery of HepG2 cells under diabetic and diabetic-dyslipidemic conditions. Increased expression of ATG5, LC3B, and Beclin supports the autophagic pathway in both PBMCs and insulin-resistant HepG2 cells. Upon blocking autophagy with 3-methyladenine (3-MA), the apoptotic cell population increased significantly, as observed by caspase-3 cleavage and reduced expression of Bcl2. Autophagy was also shown to control the oxidative stress-mediated up-regulation of inflammatory markers such as IL-6 and TNF-α. To conclude, this study elucidates a protective role of autophagy in diabetes mellitus with dyslipidemia. These findings may have a significant impact on the development of new therapeutic strategies for diabetic-dyslipidemic subjects based on enhancing autophagic activity.

Keywords: autophagy, apoptosis, dyslipidemia, reactive oxygen species, reactive nitrogen species, Type 2 diabetes

Procedia PDF Downloads 128
3379 Colour Characteristics of Dried Cocoa Using Shallow Box Fermentation Technique

Authors: Khairul Bariah Sulaiman, Tajul Aris Yang

Abstract:

Fermentation is well known as an essential process for cocoa beans. Besides developing the precursors of cocoa flavour, it also induces colour changes in the beans. The fermentation process is reported to be influenced by the duration of pod storage and of fermentation. Therefore, this study was conducted to evaluate the colour of Malaysian cocoa beans and how pod storage and fermentation duration using the shallow box technique affect its characteristics. Two factors were studied, i.e., duration of cocoa pod storage (0, 2, 4, and 6 days) and duration of cocoa fermentation (0, 1, 2, 3, 4, and 5 days). The experiment was arranged as a 4 × 6 factorial with 24 treatments in a Completely Randomised Design (CRD). The produced beans were inspected for colour changes under artificial light during the cut test and divided into four colour groups, namely fully brown, purple brown, fully purple, and slaty. Cut tests indicated that cocoa beans dried directly without undergoing fermentation had the highest slaty percentage; however, applying pod storage before fermentation was found to decrease the slaty percentage. In contrast, the percentage of fully brown beans started to dominate after two days of fermentation, especially in the four- and six-day pod storage batches, whereas almost all batches had fully purple percentages below 20%. Interestingly, the percentages of purple brown beans were scattered across all batches without any specific trend. Meanwhile, statistical analysis using a General Linear Model showed that pod storage had a significant effect on the colour characteristics of the Malaysian dried beans compared to fermentation duration.
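
For readers who want to reproduce this kind of analysis, the following is a minimal sketch of a two-way factorial GLM in Python; the input file and column names are hypothetical, not the study's data.

```python
# Hedged sketch of the 4 x 6 factorial analysis: a two-way linear model
# with interaction; the CSV layout is assumed, not the authors' file.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per treatment replicate, with columns:
# storage (0/2/4/6 days), fermentation (0-5 days), brown (% fully brown beans)
df = pd.read_csv("cut_test_results.csv")

model = ols("brown ~ C(storage) * C(fermentation)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # tests both main effects and interaction
```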

Keywords: cocoa beans, colour, fermentation, shallow box

Procedia PDF Downloads 486
3378 The Impact of Climate Change on Sustainable Aquaculture Production

Authors: Peyman Mosberian-Tanha, Mona Rezaei

Abstract:

The aquaculture sector is the fastest-growing food sector, with an annual growth rate of about 10%. The sustainability of aquaculture production, however, has been debated mainly in relation to the feed ingredients used for farmed fish. The industry has been able to decrease its dependency on marine-based ingredients in line with policies for more sustainable production. As a result, plant-based ingredients have increasingly been incorporated in aquaculture feeds, especially in feeds for popular carnivorous species, the salmonids. The effect of these ingredients on salmonid health and performance has been widely studied. In most cases, plant-based diets are associated with varying degrees of health and performance issues across salmonids, partly depending on the inclusion levels of plant ingredients and the species in question. The aquaculture sector, however, faces a further challenge: environmental change associated with climate change. Data from trials in salmonids subjected to environmental challenges of various types show adverse physiological responses, partly in relation to stress. To date, only a limited number of studies report the interactive effects of adverse environmental conditions and dietary regimens on salmonids. These studies have shown that adverse environmental conditions exacerbate the detrimental effect of plant-based diets on digestive function and health in salmonids, indicating an additional challenge for the aquaculture sector to grow in a sustainable manner. The adverse environmental conditions most often studied in farmed fish are changes in certain water quality parameters, such as oxygen and/or temperature, that are typically altered in response to climate change and, more specifically, global warming. In a challenge study, we observed that in fish fed a plant-based diet, the ability to absorb dietary energy was further reduced when reared under a low oxygen level; in addition, gut health in these fish was severely impaired. Other studies also confirm the adverse effect of environmental challenges on gut health. These effects on the digestive function and gut health of salmonids may result in less resistance to diseases and weaker performance, with significant economic and ethical implications. Overall, various findings indicate the multidimensional negative effects of climate change, as a major environmental issue, on different sectors, including aquaculture production. Therefore, a comprehensive evaluation of different ways to cope with climate change is essential for planning more sustainable strategies in the aquaculture sector.

Keywords: aquaculture, climate change, sustainability, salmonids

Procedia PDF Downloads 183
3377 Variability of Hydrological Modeling of the Blue Nile

Authors: Abeer Samy, Oliver C. Saavedra Valeriano, Abdelazim Negm

Abstract:

The Blue Nile Basin is the most important tributary of the Nile River; Egypt and Sudan are almost entirely dependent on water originating from the Blue Nile. This multi-dependency creates conflicts among the three countries, Egypt, Sudan, and Ethiopia, making the management of these conflicts an international issue. A good assessment of the water resources of the Blue Nile is important to help manage such conflicts, and hydrological models are a good tool for such assessment. This paper presents a critical review of the nature and variability of the climate and hydrology of the Blue Nile Basin as a first step toward using hydrological modeling to assess the water resources of the Blue Nile. Several attempts have been made to develop basin-scale hydrological models of the Blue Nile. Lumped and semi-distributed models used averages of meteorological inputs and watershed characteristics in hydrological simulation to analyze runoff for flood control and water resource management. Distributed models include the temporal and spatial variability of catchment conditions and meteorological inputs to allow better representation of the hydrological process. The main challenge for all models used to assess the water resources of the basin is the shortage of data needed for model calibration and validation. It is recommended to use distributed models for their higher accuracy, to cope with the great variability and complexity of the Blue Nile Basin, and to collect sufficient data for more sophisticated and accurate hydrological modeling.

Keywords: Blue Nile Basin, climate change, hydrological modeling, watershed

Procedia PDF Downloads 362
3376 Reducing Uncertainty in Climate Projections over Uganda by Numerical Models Using Bias Correction

Authors: Isaac Mugume

Abstract:

Since the beginning of the 21st century, climate change has been an issue due to the reported rise in global temperature and changes in the frequency as well as severity of extreme weather and climatic events. The changing climate has been attributed to rising concentrations of greenhouse gases, as well as environmental changes such as ecosystem and land-use change. Climatic projections have been carried out under the auspices of the Intergovernmental Panel on Climate Change, where a number of models have been run to inform us about the likelihood of future climates. Since one of the major forcings of the changing climate is the emission of greenhouse gases, different scenarios have been proposed and future climates for different periods presented. The global climate models project different areas to experience different impacts. While regional modeling is being carried out for high-impact studies, bias correction is less documented; yet regional climate models suffer from bias, which introduces uncertainty. This is addressed in this study by bias correcting the regional models. This study uses the Weather Research and Forecasting model under different representative concentration pathways and corrects the products of these models using observed climatic data. This study notes that bias correction (e.g., running-mean bias correction, the best easy systematic estimator method, simple linear regression, nearest neighbor, and weighted mean) improves the skill of climatic projections and therefore reduces the uncertainty inherent in them.
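
As a concrete illustration of the simplest of these schemes, the sketch below applies a linear-regression correction to a synthetic model series in Python; it is not the study's WRF workflow, only the core idea.

```python
# Hedged sketch of linear-regression bias correction: fit obs = a*model + b
# over a historical period, then apply the mapping to the projection.
# All series below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
obs = 20 + 5 * rng.standard_normal(360)                    # observed monthly values
model_hist = 1.1 * obs + 2 + rng.standard_normal(360)      # biased historical run
model_future = 1.1 * (obs + 1.5) + 2 + rng.standard_normal(360)  # biased projection

a, b = np.polyfit(model_hist, obs, 1)      # calibrate on the historical period
corrected_future = a * model_future + b    # correct the projection

print("raw historical bias:", model_hist.mean() - obs.mean())
print("corrected historical bias:", (a * model_hist + b).mean() - obs.mean())
```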

Keywords: bias correction, climatic projections, numerical models, representative concentration pathways

Procedia PDF Downloads 116
3375 Expert Based System Design for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain higher importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue of complex knowledge-based systems, and many types of sustainability problems would benefit from models based on experts' knowledge. Cognitive maps have been used for analyzing and aiding decision making, and a cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause-and-effect relationships among the concepts to model the behavior of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into non-related and related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, or competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS which considers various factors. The authors' intention is to propose an expert-based system design approach for implementing expert decision support in the area of IWMSs and to introduce an appropriate methodology for the development and analysis of group FCM. A framework for such a methodology, consisting of the development and application phases, is presented.
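
A minimal sketch of how an FCM is iterated to a steady state is shown below in Python; the four-concept weight matrix is invented for illustration and is not the authors' expert model.

```python
# Toy fuzzy cognitive map: concepts are activated in [0, 1] and updated by
# x(t+1) = f(W x(t) + x(t)) with a sigmoid squashing function.
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# W[i, j] is the causal weight of concept j on concept i, in [-1, 1] (invented)
W = np.array([
    [0.0,  0.6, -0.3,  0.0],
    [0.4,  0.0,  0.0,  0.5],
    [0.0, -0.2,  0.0,  0.7],
    [0.3,  0.0,  0.4,  0.0],
])
state = np.array([0.5, 0.5, 0.5, 0.5])   # initial activation of each concept

for _ in range(50):                      # iterate toward a fixed point, if one exists
    new_state = sigmoid(W @ state + state)
    if np.allclose(new_state, state, atol=1e-6):
        break
    state = new_state
print("steady-state activations:", state.round(3))
```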

Keywords: factors, fuzzy cognitive map, group decision, integrated waste management system

Procedia PDF Downloads 275
3374 Multimodal Deep Learning for Human Activity Recognition

Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja

Abstract:

In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people's daily lives as it makes it possible to learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR; these approaches are classified into two categories based on the type of data used: vision-based approaches and sensor-based approaches. This research paper highlights the importance of multimodal data fusion, from skeleton data obtained from videos and data generated by embedded sensors, using deep neural networks to achieve HAR. We propose a deep multimodal fusion network based on a two-stream architecture. The two streams use a Convolutional Neural Network combined with a Bidirectional LSTM (CNN-BiLSTM) to process the skeleton data and the data generated by embedded sensors, and fusion at the feature level is considered. The proposed model was evaluated on the public OPPORTUNITY++ dataset and produced an accuracy of 96.77%.
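
A hedged sketch of such a two-stream CNN-BiLSTM with feature-level fusion is given below in Keras; the window length, channel counts, layer sizes, and class count are assumptions, not the paper's exact configuration.

```python
# Two-stream CNN-BiLSTM with feature-level fusion; all dimensions are
# illustrative placeholders, not the authors' architecture.
from tensorflow.keras import layers, Model

def stream(input_shape, name):
    inp = layers.Input(shape=input_shape, name=name)
    x = layers.Conv1D(64, 5, activation="relu", padding="same")(inp)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Bidirectional(layers.LSTM(64))(x)  # temporal summary per stream
    return inp, x

# e.g. 128-sample windows: 9 IMU channels, 60 skeleton-coordinate channels
imu_in, imu_feat = stream((128, 9), "imu")
skel_in, skel_feat = stream((128, 60), "skeleton")

fused = layers.concatenate([imu_feat, skel_feat])     # feature-level fusion
out = layers.Dense(17, activation="softmax")(fused)   # 17 hypothetical classes

model = Model([imu_in, skel_in], out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```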

Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness

Procedia PDF Downloads 98
3373 A Low-Cost Memristor Based on Hybrid Structures of Metal-Oxide Quantum Dots and Thin Films

Authors: Amir Shariffar, Haider Salman, Tanveer Siddique, Omar Manasreh

Abstract:

According to recent studies on metal-oxide memristors, researchers aim to improve the stability, endurance, and uniformity of resistive switching (RS) behavior in memristors. Specifically, the main challenge is to prevent abrupt ruptures in the memristor's filament during the RS process. To address this problem, we propose a low-cost hybrid structure of metal-oxide quantum dots (QDs) and thin films to control the formation of filaments in memristors. We aim to use metal-oxide quantum dots because of their unique electronic properties and quantum confinement, which may improve the resistive switching behavior. QDs have discrete energy spectra due to electron confinement in three-dimensional space. Because of Coulomb repulsion between electrons, only a few free electrons are contained in a quantum dot; this fact might guide the growth direction of the conducting filaments in the metal-oxide memristor. As a result, it is expected that QDs can improve the endurance and uniformity of RS behavior in memristors. Moreover, we use a hybrid structure of intrinsic n-type quantum dots and p-type thin films to introduce a potential barrier at the junction that can smooth the transition between high and low resistance states. A bottom-up approach is used for fabricating the proposed memristor with different types of metal-oxide QDs and thin films. We synthesize QDs including zinc oxide, molybdenum trioxide, and nickel oxide, combined with spin-coated thin films of titanium dioxide, copper oxide, and hafnium dioxide. Fluorine-doped tin oxide (FTO) coated glass is employed as the substrate for deposition and as the bottom electrode. Then, the active layer, composed of one type of quantum dots and the opposite type of thin film, is spin-coated onto the FTO. Lastly, circular gold electrodes are deposited through a shadow mask by electron-beam (e-beam) evaporation at room temperature. The fabricated devices are characterized using a probe station with a semiconductor parameter analyzer. The current-voltage (I-V) characteristics are analyzed for each device to determine the conduction mechanism. We evaluate the memristor's performance in terms of stability, endurance, and retention time to identify the optimal memristive structure. Finally, we assess the proposed hypothesis before proceeding to the optimization process for fabricating the memristor.

Keywords: memristor, quantum dot, resistive switching, thin film

Procedia PDF Downloads 118
3372 Preservice EFL Teachers in a Blended Professional Development Program: Learning to Teach Speech Acts

Authors: Mei-Hui Liu

Abstract:

This study examines the effectiveness of a blended professional development program on preservice EFL (English as a foreign language) teachers' learning to teach speech acts. With the advent of Information and Communication Technology, researchers and scholars underscore the significance of integrating online and face-to-face learning opportunities in the teacher education field. Yet a paucity of evidence has been documented to investigate the extent to which such a blended professional learning model may impact real classroom practice and student learning outcomes. This yearlong project involves various stakeholders, including 25 preservice teachers, 5 English professionals, and 45 secondary school students. The multiple data sources collected are surveys, interviews, reflection journals, online discussion messages, artifacts, and discourse completion tests. Relying on the theoretical lens of Community of Inquiry, the data analysis depicts the nature and process of preservice teachers' professional development in this blended learning community, which triggers and fosters both face-to-face and synchronous/asynchronous online interactions among preservice teachers and English professionals (i.e., university faculty and in-service teachers). Also included is the student learning outcome after preservice teachers put what they learned from the support community into instructional practice. Pedagogical implications and research suggestions are provided based on the research findings and limitations.

Keywords: blended professional development, preservice EFL teachers, speech act instruction, student learning outcome

Procedia PDF Downloads 222
3371 Apatite Flotation Using Fruits' Oil as Collector and Sorghum as Depressant

Authors: Elenice Maria Schons Silva, Andre Carlos Silva

Abstract:

The growing demand for raw materials has increased mining activity, and the mineral industry faces the challenge of processing more complex ores, with very small particles and low grades, together with constant pressure to reduce production costs and environmental impacts. Froth flotation deserves special attention among the concentration methods for mineral processing: besides its great selectivity for different minerals, flotation is a highly efficient method for processing fine particles. The process is based on the surficial physicochemical properties of the minerals, and separation is only possible with the aid of chemicals such as collectors, frothers, modifiers, and depressants. In order to use sustainable and eco-friendly reagents, oils extracted from three different vegetable species (pequi pulp, macauba nut and pulp, and Jatropha curcas) were studied and tested as apatite collectors. Since the oils are not soluble in water, an alkaline hydrolysis (saponification) was necessary before their contact with the minerals; the saponification was performed at room temperature. The tests with the new collectors were carried out at pH 9, and Flotigam 5806, a synthetic mix of fatty acids manufactured by Clariant and industrially adopted as an apatite collector, was used as the benchmark. In order to find a feasible replacement for cornstarch, the flour and starch of a graniferous variety of sorghum were tested as depressants. Apatite samples were used in the flotation tests and characterized by XRF (X-ray fluorescence), XRD (X-ray diffraction), and SEM/EDS (Scanning Electron Microscopy with Energy Dispersive Spectroscopy). Zeta potential measurements were performed in the pH range from 3.5 to 12.5. A commercial cornstarch was used as the depressant benchmark; four depressant dosages and pH values were tested, and a statistical test was used to verify the influence of pH, dosage, and starch type on mineral recoveries. For dosages equal to or higher than 7.5 mg/L, pequi oil recovered almost all apatite particles. On one hand, macauba pulp oil showed excellent results at all dosages, with more than 90% apatite recovery; on the other hand, with the nut oil, the highest recovery found was around 84%. Jatropha curcas oil was the second best oil tested, with more than 90% of the apatite particles recovered at a dosage of 7.5 mg/L. Regarding the depressants, the lowest apatite recovery with sorghum starch was found at a dosage of 1,200 g/t and pH 11, resulting in a recovery of 1.99%; under the same conditions, the apatite recovery was 1.40% for sorghum flour (approximately 30% lower). Compared with cornstarch under the same conditions, sorghum flour produced an apatite recovery 91% lower.

Keywords: collectors, depressants, flotation, mineral processing

Procedia PDF Downloads 147
3370 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is increasing accordingly. However, each government has its own procedures for publishing its data, which leads to a variety of data set formats, because there are no international standards specifying the formats of data sets in Open Data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time, and other governments have published Open Data sets relative to the environment as well (such as Andalucia or Bilbao). All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format and the analysis process can start in a computationally better way. So the tool presented in this work has two goals: 1. the integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open our software tool, as a second approach, we also developed an implementation in the R language as a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyze huge amounts of data with high performance, and there are R libraries for building graphic interfaces, such as Shiny. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an Official Real-Time Integrated Data Set about Environment Data in Spain to any developer, so that they can build their own applications.
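
A toy version of the per-source integration idea, one small adapter per government feed mapping onto a shared schema, is sketched below in Python with pandas; the file names, formats, and column names are hypothetical, not the project's actual feeds.

```python
# Illustrative integration: each adapter normalizes one source's format
# onto a common schema, after which all sources can be analyzed together.
import pandas as pd

COMMON = ["city", "timestamp", "indicator", "value"]

def load_madrid(path):
    df = pd.read_csv(path, sep=";")  # hypothetical semicolon-separated feed
    df = df.rename(columns={"FECHA": "timestamp", "MAGNITUD": "indicator",
                            "VALOR": "value"})
    df["city"] = "Madrid"
    return df[COMMON]

def load_bilbao(path):
    df = pd.read_json(path)          # hypothetical JSON feed
    df = df.rename(columns={"date": "timestamp", "measure": "indicator",
                            "reading": "value"})
    df["city"] = "Bilbao"
    return df[COMMON]

integrated = pd.concat([load_madrid("madrid.csv"), load_bilbao("bilbao.json")],
                       ignore_index=True)
print(integrated.groupby(["city", "indicator"])["value"].mean())
```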

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 312
3369 Educational Reforms in Algeria: Dilemmas of Globalization, Equity, and Decolonization

Authors: Fella Lahmar

Abstract:

This chapter investigates the educational reforms in Algeria, highlighting the challenges and complexities that arise in the context of globalization, equity, and decolonization. While Algeria’s education system historically had a socialist-economic model grounded in Islamic values, contemporary reforms reflect global influences and aspirations for cultural authenticity. The study employed a qualitative approach, utilizing semi-structured interviews with a diverse sample of 15 participants intimately involved in the Algerian education system. Analysis of the data reveals a discrepancy between the educational system’s pedagogical practices and students’ diverse learning needs, implying ramifications for educational equity and social justice. Furthermore, a critical tension was evident between global influences, local cultural authenticity, and the endeavor to decolonize education. In conclusion, the chapter advocates for reforms that prioritize the students’ holistic development and well-being while fostering intrinsic motivation and engagement. This entails re-evaluating curriculum frameworks, assessment strategies, and pedagogies in light of Algeria’s cultural and religious heritage. The chapter also calls for future research to explore methods for innovatively integrating cultural heritage into education in ways to cultivate learners who are both locally grounded and globally aware.

Keywords: impact of globalization on education, parental involvement in education, marketization of education, policy enactment and reform, curriculum overload, holistic approach, shadow education

Procedia PDF Downloads 92
3368 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty of this task is to accurately locate and detect pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added to extract pedestrian features. Then six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid that performs the pedestrian detection task. This method can better characterize pedestrians. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian data set training method is used, extracting pedestrian data from the VOC data set and training with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves 93.57% mAP (mean average precision) at 46.52 f/s (frames per second). In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. The method reduces the missed detection rate and false detection rate, improves the positioning accuracy, and meets the requirements of real-time detection of pedestrian objects.
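
For context on the reported metric, the snippet below computes the intersection-over-union (IoU) overlap on which mAP evaluation is based; it is a generic illustration in Python, not code from the paper.

```python
# IoU between two boxes given as (x1, y1, x2, y2) in pixels; a detection is
# typically counted as correct for mAP when IoU with a ground-truth box
# exceeds a threshold (commonly 0.5).
def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

print(iou((10, 10, 50, 80), (20, 15, 60, 90)))  # partial overlap -> ~0.51
```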

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 140
3367 Using Reading to Learn Pedagogy to Promote Chinese Written Vocabulary Acquisition: An Evaluative Study

Authors: Mengping Cheng, John Everatt, Alison Arrow, Amanda Denston

Abstract:

Based on the available evidence, Chinese heritage language learners have a basic level of Chinese language proficiency, with lower capability in literacy than in speaking. Low levels of literacy are likely related to the lack of reading activities in the current textbook-based pedagogy used in Chinese community schools. The present study aims to use Reading to Learn pedagogy, a top-down language learning model, and to test the effectiveness of Reading to Learn on Chinese heritage learners' written vocabulary acquisition. A quasi-experiment with a pre-test/post-test non-equivalent group design was conducted: the experimental group received Reading to Learn instruction and the control group had traditional textbook-based instruction. Participants were given Chinese character tasks (a recognize-and-read task and a listen-and-point task), vocabulary tasks (a receptive vocabulary task and a productive vocabulary task), and a sentence cloze test in pre-tests and post-tests. Data collection is in progress and results will be available shortly. If the results show more improvement in Chinese written vocabulary in the experimental group than in the control group, it will be recommended that Reading to Learn pedagogy be used to maintain and develop Chinese heritage language literacy.

Keywords: Chinese heritage language, experimental research, Reading to Learn pedagogy, vocabulary acquisition

Procedia PDF Downloads 151
3366 Enhancing Photocatalytic Hydrogen Production: Modification of TiO₂ by Coupling with Semiconductor Nanoparticles

Authors: Saud Hamdan Alshammari

Abstract:

Photocatalytic water splitting to produce hydrogen (H₂) has attracted significant attention as an environmentally friendly technology. This process, which produces hydrogen from water and sunlight, represents a renewable energy source. Titanium dioxide (TiO₂) plays a critical role in photocatalytic hydrogen production due to its chemical stability, availability, and low cost. Nevertheless, TiO₂'s wide band gap (3.2 eV) limits its visible light absorption and can reduce its photocatalytic effectiveness. Coupling TiO₂ with other semiconductors is a strategy that can enhance TiO₂ by narrowing its band gap and improving visible light absorption. This paper studies the modification of TiO₂ by coupling it with another semiconductor, CdS nanoparticles, using a reflux reactor and an autoclave reactor, which helps form a core-shell structure. Characterization techniques, including TEM and UV-Vis spectroscopy, confirmed successful coating of TiO₂ on the CdS core, reduction of the band gap from 3.28 eV to 3.1 eV, and enhanced light absorption in the visible region. These modifications are attributed to the heterojunction structure between TiO₂ and CdS. The essential goal of this study is to improve TiO₂ for use in photocatalytic water splitting to enhance hydrogen production. The core-shell TiO₂@CdS nanoparticles exhibited promising results due to band-gap narrowing and improved light absorption. Future work will involve adding Pt as a co-catalyst, which is known to increase surface reaction activity by enhancing proton adsorption. Evaluation of the TiO₂@CdS@Pt catalyst will include performance assessments and hydrogen productivity tests, considering factors such as effective shapes and material ratios. Moreover, the study could be extended by further modifications to the catalyst and additional performance evaluations. For instance, doping TiO₂ with metals such as nickel (Ni), iron (Fe), and cobalt (Co), or with non-metals such as nitrogen (N), carbon (C), and sulfur (S), could improve the catalyst by reducing the band gap, enhancing the separation of photogenerated electron-hole pairs, and increasing the surface area, respectively. Additionally, to further improve catalytic performance, examining different catalyst morphologies, such as nanorods, nanowires, and nanosheets, could be highly beneficial for hydrogen production. Optimizing the photoreactor design for efficient photon delivery and illumination will further enhance the photocatalytic process. These strategies collectively aim to overcome current challenges and improve the efficiency of hydrogen production via photocatalysis.
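
A quick worked check of what this band-gap narrowing means for light absorption, using the standard relation λ = hc/E (≈ 1239.84 eV·nm divided by the gap), is shown below.

```python
# Convert a band gap (eV) to its absorption-edge wavelength (nm) via E = hc/lambda.
H_C_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

def absorption_edge_nm(band_gap_ev):
    return H_C_EV_NM / band_gap_ev

for eg in (3.28, 3.1):
    print(f"Eg = {eg} eV -> absorption edge ~ {absorption_edge_nm(eg):.0f} nm")
# 3.28 eV -> ~378 nm (UV); 3.1 eV -> ~400 nm, at the edge of the visible range
```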

Keywords: hydrogen production, photocatalysis, water splitting, semiconductor, nanoparticles

Procedia PDF Downloads 15
3365 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation, and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability concerns of every region, and the features of such systems have great influence on all components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a certain IWMS is not in the scope of the present research. The complexity of these systems and the large number of variables require a complex approach to model the outcomes and future risks, one able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix, containing all the factors affecting the system in focus together with all their interconnections. The other is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method for developing such time series by content analysis.

Keywords: content analysis, factors, integrated waste management system, time series

Procedia PDF Downloads 323
3364 Simplifying Seismic Vulnerability Analysis for Existing Reinforced Concrete Buildings

Authors: Maryam Solgi, Behzad Shahmohammadi, Morteza Raissi Dehkordi

Abstract:

One of the main steps in the seismic retrofitting of buildings is to determine the vulnerability of structures, but current procedures for evaluating existing buildings are complicated and make no distinction between short, mid-rise, and tall buildings. This research utilizes a simplified method for assessing structures that is adequate for existing reinforced concrete buildings. To approach this aim, the Simple Lateral Mechanisms Analysis (SLaMA) procedure proposed by NZSEE (New Zealand Society for Earthquake Engineering) is carried out. In this study, three RC moment-resisting frame buildings are considered. First, these buildings are evaluated by the inelastic static (pushover) procedure based on acceptance criteria. Then, the Park-Ang damage index is determined for all members of each building by inelastic time history analysis. Next, the Simple Lateral Mechanisms Analysis procedure, a hand method, is carried out to define the capacity of the structures. Ultimately, the existing procedures are compared in terms of the peak ground acceleration causing failure (PGAfail). The results of this comparison show that the pushover procedure and the SLaMA method yield a greater value of PGAfail than the Park-Ang damage model.
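
For reference, the Park-Ang index combines peak deformation with hysteretic energy; a minimal sketch of its usual form, DI = δm/δu + β·Eh/(Fy·δu), with invented member values, is given below.

```python
# Hedged sketch of the Park-Ang damage index; all inputs are hypothetical.
def park_ang(d_max, d_ult, e_hyst, f_yield, beta=0.1):
    """d_max: peak deformation, d_ult: ultimate deformation capacity,
    e_hyst: cumulative hysteretic energy, f_yield: yield strength."""
    return d_max / d_ult + beta * e_hyst / (f_yield * d_ult)

# Example member: DI values below ~0.4 are often read as repairable damage,
# while DI >= 1.0 is commonly associated with collapse.
print(park_ang(d_max=0.05, d_ult=0.12, e_hyst=18.0, f_yield=250.0))  # ~0.48
```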

Keywords: peak ground acceleration caused to fail, reinforced concrete moment-frame buildings, seismic vulnerability analysis, simple lateral mechanisms analysis

Procedia PDF Downloads 91
3363 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems

Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos

Abstract:

Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is a mandatory requirement, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and the number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8, and Windows 10), Apple's Mac, and Linux, are assessed for their discovered vulnerabilities and the risk associated with each. Each discovered and reported vulnerability has an exploitability score assigned in the CVSS record of the National Vulnerability Database. In this study, the risk from vulnerabilities in each of the five operating systems is compared. The risk indexes used are developed based on the Markov model to evaluate the risk of each vulnerability. The statistical methodology and underlying mathematical approach are described. Initially, parametric procedures were conducted and measured; however, violations of some statistical assumptions were observed, and the need for non-parametric approaches was recognized. 6,838 recorded vulnerabilities were considered in the analysis. According to the risk associated with all the vulnerabilities considered, it was found that there is a statistically significant difference among the average risk levels of some operating systems, indicating that, according to our method and given its assumptions and limitations, some operating systems have been more risk-vulnerable than others. Relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
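
As an illustration of the non-parametric comparison described, the sketch below runs a Kruskal-Wallis test over synthetic per-OS risk scores in Python; the distributions are invented and do not reflect the study's CVSS data.

```python
# Non-parametric comparison of risk-score samples across OSs; the gamma
# distributions below are purely illustrative stand-ins for real risk indexes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
risk = {os_name: rng.gamma(shape, 1.0, 200)
        for os_name, shape in [("win7", 2.0), ("win8", 2.2), ("win10", 1.8),
                               ("mac", 1.6), ("linux", 1.5)]}

h, p = stats.kruskal(*risk.values())  # no normality assumption required
print(f"H = {h:.2f}, p = {p:.4f}")    # small p -> risk levels differ across OSs
```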

Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system

Procedia PDF Downloads 180
3362 Toxic Dyes Removal in Aqueous Solution Using Calcined and Uncalcined Anionic Clay Zn/Al+Fe

Authors: Bessaha Hassiba, Bouraada Mohamed

Abstract:

A layered double hydroxide with a Zn/(Al+Fe) molar ratio of 3:1 (ZAF-HT) was synthesized by the co-precipitation method, and its calcined product (CZAF) was obtained by heat treatment of ZAF-HT at 500°C. The calcined and uncalcined materials were used to remove the weak acid dyes indigo carmine (IC) and green bezanyl-F2B (F2B) from aqueous solution. The synthesized materials were characterized by XRD, SEM, FTIR, and TG/DTA analysis, confirming the formation of the pure layered structure of ZAF-HT, the destruction of the original structure after calcination, and the intercalation of the dye molecules. Moreover, the interlayer distance increases from 7.645 Å in ZAF-HT to 19.102 Å after dye sorption. The dose of the adsorbents was set at 0.5 g/l, while the initial concentrations were 250 and 750 mg/l for indigo carmine and green bezanyl-F2B, respectively. The sorption experiments were carried out at ambient temperature and without adjusting the initial solution pH (pHi = 6.10 for IC and pHi = 5.01 for F2B). The maximum adsorption capacities obtained by ZAF-HT and CZAF for both dyes followed the order: CZAF-F2B (1501.4 mg.g-1) > CZAF-IC (617.3 mg.g-1) > ZAF-HT-IC (41.4 mg.g-1) > ZAF-HT-F2B (28.9 mg.g-1). The removal of indigo carmine and green bezanyl-F2B by ZAF-HT was due to anion exchange and/or adsorption on the surface. With the calcined material (CZAF), the removal of the dyes was based on a particular property called the 'memory effect': CZAF recovers the pristine layered structure in the presence of anionic molecules such as acid dyes, which then occupy the interlayer space. The sorption process was spontaneous in nature and followed pseudo-second-order kinetics. The isotherms showed that the removal of IC and F2B by ZAF-HT and CZAF was consistent with the Langmuir model.
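
To illustrate the isotherm analysis, the following sketch fits the Langmuir model q = qmax·KL·Ce/(1 + KL·Ce) to invented equilibrium points with SciPy; the data are not the study's measurements.

```python
# Hedged sketch of a Langmuir isotherm fit; Ce/q points are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    return qmax * kl * ce / (1.0 + kl * ce)

ce = np.array([5, 20, 50, 100, 200, 400], dtype=float)        # mg/L at equilibrium
q = np.array([150, 420, 800, 1100, 1350, 1450], dtype=float)  # mg/g adsorbed

(qmax, kl), _ = curve_fit(langmuir, ce, q, p0=(1500.0, 0.01))
print(f"qmax = {qmax:.0f} mg/g, KL = {kl:.4f} L/mg")
```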

Keywords: acid dyes, adsorption, calcination, layered double hydroxides

Procedia PDF Downloads 218
3361 Impact of Integrated Signals for Doing Human Activity Recognition Using Deep Learning Models

Authors: Milagros Jaén-Vargas, Javier García Martínez, Karla Miriam Reyes Leiva, María Fernanda Trujillo-Guerrero, Francisco Fernandes, Sérgio Barroso Gonçalves, Miguel Tavares Silva, Daniel Simões Lopes, José Javier Serrano Olmedo

Abstract:

Human Activity Recognition (HAR) has a growing impact on the creation of new applications and is responsible for emerging new technologies. The use of wearable sensors is an important key to exploring the behavior of the human body when performing activities; such devices are less invasive and more comfortable for the person. In this study, a database that includes three activities is used. The activities were acquired from inertial measurement unit (IMU) sensors and motion capture (MOCAP) systems. The main objective is to compare the performance of four Deep Learning (DL) models: Deep Neural Network (DNN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and the hybrid Convolutional Neural Network-Long Short-Term Memory model (CNN-LSTM), when considering acceleration, velocity, and position, and to evaluate whether integrating the IMU acceleration to obtain velocity and position improves performance when these signals are used as input to the DL models. The results are also compared with the same type of data provided by the MOCAP system. Although the acceleration data are cleaned when integrating, the results show only a minimal increase in accuracy for the integrated signals.
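
A minimal sketch of the acceleration-to-velocity-to-position integration step is shown below, using cumulative trapezoidal integration on a synthetic signal in Python; the sampling rate and drift handling are assumptions, not the study's preprocessing.

```python
# Integrate one acceleration channel twice to obtain velocity and position.
import numpy as np
from scipy.integrate import cumulative_trapezoid

fs = 100.0                          # assumed IMU sampling rate, Hz
t = np.arange(0, 5, 1 / fs)
acc = np.sin(2 * np.pi * 0.5 * t)   # one synthetic acceleration channel

acc = acc - acc.mean()              # crude offset removal to limit drift
vel = cumulative_trapezoid(acc, t, initial=0.0)
pos = cumulative_trapezoid(vel, t, initial=0.0)
# acc, vel and pos (stacked) would then form alternative inputs to the DL models
print(vel[-1], pos[-1])
```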

Keywords: HAR, IMU, MOCAP, acceleration, velocity, position, feature maps

Procedia PDF Downloads 93
3360 Finite Element Study of Coke Shape Deep Beam to Column Moment Connection Subjected to Cyclic Loading

Authors: Robel Wondimu Alemayehu, Sihwa Jung, Manwoo Park, Young K. Ju

Abstract:

Following the aftermath of the 1994 Northridge earthquake, intensive research on beam-to-column connections was conducted, leading to the current design basis. The current design codes require the use of either a prequalified connection or a connection that passes the requirements of a large-scale cyclic qualification test prior to use in intermediate or special moment frames. The second alternative is expensive both in terms of money and time. On the other hand, the maximum beam depth in most of the prequalified connections is limited to 900 mm due to the reduced rotation capacity of deeper beams. However, for long-span beams the need to use deeper beams may arise. In this study, a beam-to-column connection detail suitable for deep beams is presented. The connection detail comprises a thicker, tapered beam flange adjacent to the beam-to-column connection. Within the thicker-tapered flange region, two reduced beam sections are provided with the objective of forming two plastic hinges within that region. In addition, the length, width, and thickness of the tapered-thicker flange region are proportioned in such a way that a third plastic hinge forms at the end of the region. As a result, the total rotation demand is distributed over three plastic zones, making the detail suitable for deeper beams, which have lower rotation capacity at a single plastic hinge. The effectiveness of this connection detail is studied through finite element analysis of a beam with a depth of 1200 mm, with comparison made to a welded unreinforced flange-welded web (WUF-W) moment connection and a reduced beam section moment connection. The results show that the rotation capacity of the WUF-W moment connection is increased from 2.0% to 2.2% by applying the proposed moment connection detail. Furthermore, the maximum moment capacity, energy dissipation capacity, and stiffness of the WUF-W moment connection are increased by up to 58%, 49%, and 32%, respectively. In contrast, applying the reduced beam section detail to the same WUF-W moment connection reduced the rotation capacity from 2.0% to 1.50% and reduced the maximum moment capacity and stiffness of the connection by 22% and 6%, respectively. The proposed connection develops three plastic hinge regions as intended and shows improved performance compared to both the WUF-W moment connection and the reduced beam section moment connection. Moreover, the achieved rotation capacity satisfies the minimum required for use in intermediate moment frames.

Keywords: connections, finite element analysis, seismic design, steel intermediate moment frame

Procedia PDF Downloads 162
3359 Sensitivity Based Robust Optimization Using 9 Level Orthogonal Array and Stepwise Regression

Authors: K. K. Lee, H. W. Han, H. L. Kang, T. A. Kim, S. H. Han

Abstract:

For the robust optimization of a manufactured product design, there are design objectives that must be achieved, such as minimization of the mean and standard deviation of the objective functions within the required sensitivity constraints. The authors utilized the sensitivity of the objective functions and constraints with respect to the effective design variables to reduce the computational burden associated with the evaluation of the probabilities. The individual mean and sensitivity values could be estimated easily by using 9-level orthogonal-array-based response surface models optimized by stepwise regression. The present study evaluates the proposed procedure on the robust optimization of rubber domes, which are commonly used for keyboard switching, using the 9-level orthogonal array and stepwise regression along with a desirability function. In addition, a new robust optimization process, I2GEO (Identify, Integrate, Generate, Explore, and Optimize), is proposed on the basis of the robust optimization of the rubber domes. The optimized results from the response surface models and the results estimated by finite element analysis were consistent within a small margin of error. The standard deviation of the objective function decreased by 54.17% with the suggested sensitivity-based robust optimization. (Business for Cooperative R&D between Industry, Academy, and Research Institute, funded by the Korea Small and Medium Business Administration in 2017, S2455569)
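
As a toy illustration of folding the mean and standard-deviation goals into a single desirability score, the sketch below uses invented limits and predictions in Python; it is not the authors' rubber-dome model.

```python
# Hedged sketch of a desirability function combining two response goals;
# all limits, targets, and predicted values are hypothetical.
import numpy as np

def desirability(y, low, high, target):
    """Triangular desirability: 1 at the target, 0 at the low/high limits."""
    y = np.asarray(y, dtype=float)
    d = np.where(y <= target, (y - low) / (target - low),
                 (high - y) / (high - target))
    return np.clip(d, 0.0, 1.0)

# Suppose the response surface predicts a click-force mean and std for a design
d_mean = desirability(2.1, low=1.5, high=3.0, target=2.0)    # mean force, N
d_std = desirability(0.08, low=0.0, high=0.3, target=0.05)   # std of force, N
overall = float(np.sqrt(d_mean * d_std))  # geometric mean of the two goals
print(overall)  # designs are ranked by this combined score
```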

Keywords: objective function, orthogonal array, response surface model, robust optimization, stepwise regression

Procedia PDF Downloads 285
3356 Evaluation and Selection of Drilling Technologies: An Application of Portfolio Analysis Matrix in South Azadegan Oilfield

Authors: M. Maleki Sadabad, A. Pointing, N. Marashi

Abstract:

Given the role and increasing importance of technology in national development, technology development has received systematic attention in recent decades. Organizations now face highly complicated and competitive conditions in foreign markets; therefore, evaluating and selecting technologies and formulating a technology strategy have become vital subjects for many of them. The study introduces criteria for evaluating technological capability and attractiveness, especially for strategic technologies, which guide technology evaluation and selection and, finally, the formulation of a suitable technology strategy for drilling in the South Azadegan oil field. The study first identifies the key challenges of the oil field through interviews with industry experts, in order to evaluate drilling technologies for South Azadegan, and then prioritises these challenges. Next, existing and new technologies were identified to address the challenges of the South Azadegan oil field. In order to explore the capability, availability, and attractiveness of each technology, a questionnaire based on the Julie indices was designed and distributed among industry experts. After determining each technology's capability, availability, and attractiveness scores, obtained as the average of the experts' ratings, the technology portfolio was constructed using Morin's model. The matrix includes four areas, each of which follows a particular strategy. Finally, by analysing this matrix, technology options are suggested for selection and investment.

Keywords: technology, technology identification, drilling technologies, technology capability

Procedia PDF Downloads 135
3357 A Finite Element Model for the Study of Buried Pipelines Affected by a Strike-Slip Fault

Authors: Reza Akbari, Jalal Montazeri Fashtali, Peyman Momeni Taromsari

Abstract:

Pipeline systems play an important role as a vital element in reducing or increasing the risk of earthquake damage and vulnerability. Pipelines are suitable, cheap, fast, and safe routes for transporting oil, gas, water, sewage, etc. These pipelines must pass through wide geographical areas; hence they will structurally face a range of environmental and underground factors related to earthquake forces. Therefore, structural engineering analysis and design for this type of line requires an understanding of the behavior of the relevant parameters; a lack of familiarity with them can cause irreparable damage and risks in design and execution, especially in the face of earthquakes. Today, buried pipelines play an important role in the human life cycle; thus, studying the vulnerability of pipeline systems is of particular importance. This study examines the behavior of buried pipelines affected by a strike-slip fault. The studied fault is perpendicular to the pipe axis and causes stress and deformation in the pipe by sliding horizontally. In this study, the pipe-soil interaction is accurately simulated, so that one can examine the large displacements and strains, the nonlinear material behavior, and the contact and friction conditions between soil and pipe. The results can be used for designing buried pipes and determining the amount of fault displacement that causes failure of the buried pipes.

Keywords: pipelines, earthquake, fault, soil-fault interaction

Procedia PDF Downloads 448
3356 Two Lessons Learnt in Defining Intersections and Interfaces in Numerical Modeling with Plaxis

Authors: Mahdi Sadeghian, Somaye Sadeghian, Reza Dinarvand

Abstract:

This paper discusses two issues encountered in using PLAXIS, both observed while applying PLAXIS to estimate excavation-induced displacement. Column Soil Mixing (CSM) was applied to stabilise the excavation. It was found that the estimated excavation-induced deformation at the top of the CSM blocks depends strongly on the material type assigned to the pavement adjacent to the CSM blocks: a cohesive pavement material results in an unrealistic connection between pavement and CSM, even when an interface element is defined. To find the most realistic approach, the interface was defined in three different ways: (1) no interface elements were applied; (2) a non-cohesive soil layer was defined between the pavement and the CSM block to represent the friction between these materials; (3) PLAXIS's built-in interface elements were used to define the boundary between the pavement and the CSM block. The results showed that option 2 gives the most realistic results. The second issue arose in modelling the contact line between the CSM block and an inclined layer underneath. The analysis results showed that the excavation-induced deformation depends strongly on how the PLAXIS user defines the contact area: if the contact area is defined as a point at which the CSM block intersects the underlying layer, the estimated lateral displacement of the CSM block is unrealistically lower than in a model where the contact area is defined as a line.

Keywords: PLAXIS, FEM, CSM, excavation-induced deformation

Procedia PDF Downloads 160
3355 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities

Authors: Nazli Hardy

Abstract:

A multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem that has a pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; thus IoT represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing, and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to change from the login-password tuple to an Identity and Access Management (IAM) model, and further to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.

Keywords: Internet of Things (IoT), authentication, protocols, survey

Procedia PDF Downloads 297