Search results for: massive multiple input multiple output (MIMO)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8501


521 Low-Income African-American Fathers' Gendered Relationships with Their Children: A Study Examining the Impact of Child Gender on Father-Child Interactions

Authors: M. Lim Haslip

Abstract:

This quantitative study explores the correlation between child gender and father-child interactions. The author analyzes data from videotaped interactions between African-American fathers and their toddler sons or daughters to explain how African-American fathers and toddlers interact with each other and whether these interactions differ by child gender. The purpose of this study is to investigate the research question: 'How, if at all, do fathers’ speech and gestures differ when interacting with their two-year-old sons versus daughters during free play?' The objectives of this study are to describe how child gender impacts African-American fathers’ verbal communication, to examine how fathers gesture and speak to their toddlers by gender, and to guide interventions for low-income African-American families and their children in early language development. This study involves a sample of 41 low-income African-American fathers and their 24-month-old toddlers. The videotape data are used to observe 10-minute father-child interactions during free play. This study uses the already transcribed and coded data provided by Dr. Meredith Rowe, whose study examined the impact of African-American fathers’ verbal input on their children’s language development. The Child Language Data Exchange System (CHILDES), created to study conversational interactions, was used for transcription and coding of the videotape data. The findings focus on the quantity of speech, diversity of speech, complexity of speech, and quantity of gesture, as reflected in vocabulary usage, the number of spoken words, the length of utterances, and the number of object points observed during father-toddler interactions in a free-play setting. This study will help intervention and prevention scientists understand early language development in the African-American population. It will contribute to knowledge of the role of African-American fathers’ interactions in their children’s language development. 
It will guide interventions for the early language development of African-American children.
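The four speech measures named above can be computed from a transcript in a few lines. As an illustrative sketch only (the sample utterances and the simplified measure definitions below are invented, not Dr. Rowe's CHILDES coding scheme):

```python
# Illustrative sketch: simplified versions of three father-speech measures.
# The sample utterances are invented; real analyses use CHILDES/CLAN transcripts.

def speech_measures(utterances):
    tokens = [w.lower() for u in utterances for w in u.split()]
    types = set(tokens)
    return {
        "quantity": len(tokens),                       # total word tokens
        "diversity": len(types),                       # distinct word types
        "complexity": len(tokens) / len(utterances),   # mean length of utterance (words)
    }

sample = ["look at the ball", "the ball", "roll the ball to daddy"]
m = speech_measures(sample)
print(m)  # → quantity 11, diversity 7
```

Gesture counts (e.g. object points) would come from a separate coded tier rather than from the word stream.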

Keywords: parental engagement, early language development, African-American families, quantity of speech, diversity of speech, complexity of speech, quantity of gesture

Procedia PDF Downloads 100
520 Household Solid Waste Generation per Capita and Management Behaviour in Mthatha City, South Africa

Authors: Vuyayo Tsheleza, Simbarashe Ndhleve, Christopher Mpundu Musampa

Abstract:

Mismanagement of waste is continuously emerging as a rising malpractice in most developing countries, especially in fast-growing cities. Household solid waste in Mthatha has been reported to be one of the problems facing the city and is overwhelming local authorities, as it is beyond the environmental and management capacity of the existing waste management system. This study estimates per-capita waste generation, the quantity of different waste types generated by inhabitants of formal and informal settlements in Mthatha, and waste management practices in the aforementioned socio-economic strata. A total of 206 households were selected for the study using stratified random sampling, categorized into formal and informal settlements. Data on household waste generation rate, composition, awareness, and household waste management behaviour and practices were gathered through mixed methods. The sampled households from formal and informal settlements, totalling 684 people, generated 1949 kg of waste per week. This translates to approximately 2.85 kg per capita per week. On average, the rate of solid waste generation per capita was 0.40 kg per day for a person living in an informal settlement and 0.56 kg per day for a person living in a formal settlement. Ranked in descending order, food waste accounted for the largest proportion of generated waste at approximately 23.7%, followed by disposable nappies at 15%, paper and cardboard at 13.34%, glass at 13.03%, metals at 11.99%, plastics at 11.58%, residue at 5.17%, and textiles at 3.93%, with leather and rubber at 2.28% the least generated waste type. Different waste management practices were reported in formal and informal settlements, with formal settlements proving to be more concerned about environmental management than their informal counterparts. 
Understanding attitudes and perceptions on waste management, waste types and the per-capita solid waste generation rate can help evolve appropriate waste management strategies based on the principles of reduction, re-use, recycling and environmentally sound disposal, and can also assist in projecting future waste generation rates. These results can be utilized as input when designing growing cities’ waste management plans.
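The per-capita figures above follow from simple arithmetic over the sampled totals. A minimal sketch, using the totals reported in the abstract:

```python
# Per-capita waste generation from sampled totals (figures from the abstract).
total_waste_kg_per_week = 1949
people = 684

per_capita_week = total_waste_kg_per_week / people
per_capita_day = per_capita_week / 7
print(f"{per_capita_week:.2f} kg/capita/week, {per_capita_day:.2f} kg/capita/day")
```

The daily figure is a pooled average across both settlement types; the abstract's 0.40 and 0.56 kg/day values come from the informal and formal strata separately.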

Keywords: awareness, characterisation, per capita, quantification

Procedia PDF Downloads 292
519 Vocational Projects for the Autistic and Developmentally Delayed That Are Sustainable and Eco-Friendly

Authors: Saima Haq

Abstract:

This paper presents the contribution of the Sunflowers Vocational Center, Karachi, Pakistan, in providing a platform for students with special needs to work with recycled materials and express themselves creatively. The concept was to create products that would generate enough income to sustain the program while keeping the students cognitively engaged through arts and crafts and tactile instruction, appropriate to their severe intellectual disabilities. Papier-mâché is an art form that is hands-on, repetitive, economical, and beneficial for the environment. The process of tearing paper into long strips, covering them with paste, and laying the strips atop the mold provides constant sensory input for our autistic students as well as the rest of our student population. Given society's marginalized view of persons with special needs, we have marketed the papier-mâché products on social media platforms and have set up booths at carnivals, festivities, and cause-oriented open markets. Our students in the vocational center have also made bins, baskets, and trays that are used in all classrooms. This has cut our costs on classroom materials considerably and has added a sense of accomplishment and furthered the teamwork skills of our Sunflowers. Another achievement is our long list of clients; orders have been placed by several persons for birthdays, parties, events, and the like. This exposure has raised awareness of the capabilities of persons with special needs and has started a conversation on the topic. An additional achievement is that we have made our teachers, our students, and their families conscious of the environment and incorporated the reuse of newspapers into classrooms. In situations where plastic items such as bins, dustbins, containers, baskets, and trays would otherwise have been bought, the papier-mâché products made by our students have been used instead. 
Due to the low cost of materials, this project is easily replicable and very easy to start. Piñatas are a very popular item for children’s parties everywhere and are gaining popularity through social media. This is also easily replicable in any environment and can have a great impact on the use of plastic in any work or home environment.

Keywords: vocational training, special needs, cognitive skills, teamwork

Procedia PDF Downloads 96
518 Practical Software for Optimum Bore Hole Cleaning Using Drilling Hydraulics Techniques

Authors: Abdulaziz F. Ettir, Ghait Bashir, Tarek S. Duzan

Abstract:

Proper well planning is vital to the success of any drilling program, as it prevents and overcomes drilling problems and minimizes operating costs. The hydraulic system plays an active role during drilling operations: a well-designed system accelerates the drilling effort and lowers the overall well cost, whereas an improperly designed hydraulic system can slow the drill rate, fail to clean the hole of cuttings, and cause kicks. In most cases, common sense and commercially available computer programs are the only elements required to design the hydraulic system. Drilling optimization is the logical process of analyzing the effects and interactions of drilling variables through applied drilling and hydraulic equations and mathematical modeling, to achieve maximum drilling efficiency at minimum drilling cost. In this paper, practical software is adopted to define drilling optimization models comprising four optimum keys, namely Opti-flow, Opti-clean, Opti-slip and Opti-nozzle, which help achieve high drilling efficiency at lower cost. The data used in this research are from vertical and horizontal wells recently drilled in Waha Oil Company fields. The input data are: formation type, geopressures, hole geometry, bottom-hole assembly and mud rheology. Upon data analysis, results from all wells show that the proposed program provides higher accuracy than the company's existing approach in terms of hole-cleaning efficiency and cost breakdown, taking the actual data as the reference base for all wells. Finally, it is recommended to use the established optimization software at the drilling-design stage to obtain correct drilling parameters that provide high drilling efficiency, good borehole cleaning and all other hydraulic parameters, which helps to minimize hole problems and control drilling operation costs.
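The hole-cleaning logic behind a key such as Opti-clean can be illustrated with textbook annular-hydraulics relations. A hedged sketch (these are standard field approximations, not the paper's proprietary models, and the well dimensions, flow rate and slip velocity are invented):

```python
# Hedged sketch of basic hole-cleaning hydraulics (standard field relations,
# not the paper's Opti-* models). US oilfield units: gpm, inches, ft/min.

def annular_velocity(flow_gpm, hole_d_in, pipe_d_in):
    """Average annular velocity in ft/min: v = 24.5 * Q / (Dh^2 - Dp^2)."""
    return 24.5 * flow_gpm / (hole_d_in**2 - pipe_d_in**2)

def transport_ratio(annular_v, slip_v):
    """Cuttings transport ratio; > 0.5 is often taken as adequate cleaning."""
    return 1.0 - slip_v / annular_v

v = annular_velocity(flow_gpm=600, hole_d_in=8.5, pipe_d_in=5.0)
tr = transport_ratio(v, slip_v=60.0)  # assumed cuttings slip velocity, ft/min
print(f"annular velocity = {v:.1f} ft/min, transport ratio = {tr:.2f}")
```

In practice the slip velocity itself depends on mud rheology and cuttings size, which is where models like Opti-slip would come in.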

Keywords: optimum keys, opti-flow, opti-clean, opti-slip, opti-nozzle

Procedia PDF Downloads 316
516 Biosynthesized Selenium Nanoparticles to Rescue Coccidiosis-Mediated Oxidative Stress, Apoptosis and Inflammation in the Jejunum of Mice

Authors: Esam Mohammed Al-shaebi

Abstract:

One of the most crucial approaches for treating human diseases, particularly parasitic infections, is nanomedicine. Coccidiosis is one of the most significant protozoan diseases affecting farm and domestic animals. While amprolium is one of the traditional anticoccidial medications, the advent of drug-resistant strains of Eimeria necessitates the development of novel treatments. The goal of the current investigation was to determine whether selenium nanoparticles biosynthesized (Bio-SeNPs) using Azadirachta indica leaf extract might treat jejunal tissue damage in mice with Eimeria papillata infection. Five groups of seven mice each were used, as follows. Group 1: non-infected, non-treated (negative control). Group 2: non-infected, treated with Bio-SeNPs (0.5 mg/kg of body weight). Groups 3-5 were orally inoculated with 1×10³ sporulated oocysts of E. papillata. Group 3: infected, non-treated (positive control). Group 4: infected and treated with Bio-SeNPs (0.5 mg/kg). Group 5: infected and treated with amprolium. Groups 4 and 5 received daily oral administration (for 5 days) of Bio-SeNPs and the anticoccidial medication, respectively, after infection. Bio-SeNPs caused a considerable reduction in oocyst output in mouse feces (97.21%). This was accompanied by a significant reduction in the number of developmental parasitic stages in the jejunal tissues. Reduced glutathione (GSH), glutathione peroxidase (GPx), and superoxide dismutase (SOD) levels were dramatically reduced by the Eimeria parasite, whereas nitric oxide (NO) and malondialdehyde (MDA) levels were markedly elevated. The number of goblet cells and MUC2 gene expression were also assessed, and both were considerably downregulated by infection. Infection, however, markedly increased the expression of inflammatory cytokines (IL-6 and TNF-α) and the apoptotic genes (Caspase-3 and BCL2). 
Administration of Bio-SeNPs to infected mice drastically lowered the loss of body weight, the oxidative stress, and the inflammatory and apoptotic indicators in the jejunal tissue. Our research thus showed the involvement of Bio-SeNPs in protecting mice with E. papillata infections against jejunal damage.

Keywords: coccidiosis, nanoparticles, azadirachta indica, oxidative stress

Procedia PDF Downloads 86
516 Performance Evaluation of Routing Protocols in Vehicular Ad Hoc Networks

Authors: Salman Naseer, Usman Zafar, Iqra Zafar

Abstract:

This study explores the implications of Vehicular Ad hoc Networks (VANETs), a domain of Mobile Ad hoc Networks (MANETs), in rural and urban scenarios. VANETs provide wireless communication between vehicles and between vehicles and roadside units. The Federal Communications Commission of the United States of America has allocated 75 MHz of spectrum in the 5.9 GHz band for dedicated short-range communications (DSRC), specifically designed to support road-safety and entertainment/information applications. Several vehicular projects, viz. California PATH, the Car 2 Car Communication Consortium, ETSI, and the IEEE 1609 working group, have already been conducted to improve overall road safety and traffic management. After a critical literature review, routing protocols were selected and their performance considered in urban and rural scenarios. Several routing protocols for VANETs were applied in the current research, and their evaluation was conducted through simulation using the performance metrics of throughput and packet drop. Excel and Google charting tools were used to plot graphs of the simulation results in order to compare the selected routing protocols with each other. In addition, the sum of the output from each scenario was computed to clearly present the divergence in results. The findings of the current study show that DSR gives better performance, with lower packet drop and higher throughput, than AODV and DSDV in congested urban areas and in rural environments. On the other hand, in low-density areas, AODV gives better results than DSR. The worth of the current study may be judged by the fact that the information exchanged between vehicles is useful for comfort, safety, and entertainment. 
Furthermore, the communication system's performance depends on how routing is done in the network and, in particular, on the routing protocols implemented in it. The results presented above lead to policy implications and develop our understanding of the broader spectrum of VANETs.
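The two metrics used for the comparison can be computed directly from simulation trace counts. A minimal sketch (the packet counts below are invented placeholders, not output from the study's simulations):

```python
# Hedged sketch: throughput and packet-drop ratio from simulation trace counts.
# The counts below are invented placeholders, not results from the study.

def packet_metrics(sent, received, bytes_received, sim_time_s):
    drop_ratio = (sent - received) / sent              # fraction of packets lost
    throughput_kbps = bytes_received * 8 / sim_time_s / 1000
    return drop_ratio, throughput_kbps

drop, throughput = packet_metrics(sent=1000, received=940,
                                  bytes_received=940 * 512, sim_time_s=200)
print(f"drop ratio = {drop:.2%}, throughput = {throughput:.1f} kbps")
```

Protocols are then ranked per scenario (urban/rural, high/low density) by these two numbers, which is how the DSR-versus-AODV conclusions above are reached.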

Keywords: AODV, DSDV, DSR, ad hoc network

Procedia PDF Downloads 277
515 An Automatic Large Classroom Attendance Conceptual Model Using Face Counting

Authors: Sirajdin Olagoke Adeshina, Haidi Ibrahim, Akeem Salawu

Abstract:

Large lecture theatres cannot be covered by a single camera, because of their size, shape, and seating arrangements, but rather require a multicamera setup, even though a smaller classroom can be captured with a single camera. Therefore, the design and implementation of a multicamera setup for a large lecture hall were considered. Researchers have emphasized the impact of class attendance on the academic performance of students. However, the traditional method of taking attendance falls short, especially for large lecture theatres, because of the student population, the time required, the sophistication and exhaustiveness involved, and its susceptibility to manipulation. An automated large-classroom attendance system is, therefore, imperative. The common approach in such systems is face detection and recognition, where known student faces are captured and stored for recognition purposes. This approach requires constant face-database updates due to continual changes in facial features. Alternatively, face counting can be performed by cropping the localized faces in the video or image into a folder and then counting them. This research aims to develop a face-localization-based approach to detect student faces in classroom images captured using a multicamera setup. A selected Haar-like-feature cascade face detector, trained with an asymmetric goal to minimize the False Rejection Rate (FRR) relative to the False Acceptance Rate (FAR), was applied on a Raspberry Pi 4B. A relationship between the two factors (FRR and FAR) was established using a constant (λ) as a trade-off between them for automatic adjustment during training. An evaluation of the proposed approach and the conventional AdaBoost on classroom datasets shows an 8% improvement in TPR (a result of the low FRR) and a 7% reduction in FRR. The average speed of the proposed approach was also improved, with an execution time of 1.19 s per image compared to 2.38 s for the improved AdaBoost. 
Consequently, the proposed approach achieved 97% TPR with an overhead constraint time of 22.9 s, compared to 46.7 s for the improved AdaBoost, when evaluated on images obtained from a large lecture hall (DK5) at USM.
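The asymmetric FRR/FAR objective with a trade-off constant λ can be illustrated with a simple threshold search over detector scores. A hedged sketch (the scores and λ value are invented, and this stand-alone post-hoc search is only an analogue of the cascade-training criterion described above):

```python
# Hedged sketch: choosing a decision threshold that minimizes FRR + lambda*FAR.
# Scores and lambda are invented; in the paper the trade-off is applied during
# Haar-cascade training, not as a post-hoc threshold search.

def error_rates(face_scores, nonface_scores, thr):
    frr = sum(s < thr for s in face_scores) / len(face_scores)         # rejected faces
    far = sum(s >= thr for s in nonface_scores) / len(nonface_scores)  # accepted non-faces
    return frr, far

def best_threshold(face_scores, nonface_scores, lam):
    candidates = sorted(set(face_scores + nonface_scores))
    def cost(thr):
        frr, far = error_rates(face_scores, nonface_scores, thr)
        return frr + lam * far
    return min(candidates, key=cost)

faces = [0.9, 0.8, 0.75, 0.6, 0.4]     # detector scores on true faces (invented)
nonfaces = [0.5, 0.3, 0.2, 0.1]        # scores on non-face patches (invented)
thr = best_threshold(faces, nonfaces, lam=0.5)  # lam < 1 penalizes FRR more than FAR
```

With λ < 1, the search tolerates some false acceptances in exchange for rejecting fewer real faces, which is the asymmetry that raises TPR in the evaluation above.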

Keywords: automatic attendance, face detection, haar-like cascade, manual attendance

Procedia PDF Downloads 65
514 Leveraging Information for Building Supply Chain Competitiveness

Authors: Deepika Joshi

Abstract:

Operations in the automotive industry rely greatly on information shared between Supply Chain (SC) partners; this leads to efficient and effective management of SC activity. The automotive sector in India is growing at 14.2 percent per annum and has huge economic importance. We find that no study has been carried out on the role of information sharing in the SC management of Indian automotive manufacturers. Considering this research gap, the present study sets out to establish the significance of information sharing in Indian auto-component supply chain activity. Empirical research was conducted among large-scale auto-component manufacturers in India. Twenty-four Supply Chain Performance Indicators (SCPIs) were collected from the existing literature. These elements belong to eight diverse but internally related areas of SC management, viz., demand management, cost, technology, delivery, quality, flexibility, buyer-supplier relationship, and operational factors. A pair-wise comparison and an open-ended questionnaire were designed using these twenty-four SCPIs. The questionnaire was then administered to managerial-level employees of twenty-five auto-component manufacturing firms. The Analytic Network Process (ANP) technique was used to analyze the responses to the pair-wise questionnaire. Finally, twenty-five priority indexes were developed, one for each respondent, and averaged to generate an industry-specific priority index. The open-ended questions captured strategies related to information sharing between buyers and suppliers and their influence on supply chain performance. Results show that the impact of information sharing on certain performance indicators is relatively greater than on others. For example, flexibility-, delivery-, demand- and cost-related elements are massively impacted by information sharing. Technology is relatively less influenced by information sharing, but it immensely influences the quality of the information shared. 
Responses obtained from managers reveal that timely and accurate information sharing lowers costs and increases the flexibility and on-time delivery of auto parts, thereby enhancing the competitiveness of the Indian automotive industry. Any flaw in the dissemination of information can disturb the cycle time of both parties and thus increase the opportunity cost. Due to suppliers' involvement in decisions related to the design of auto parts, quality conformance is found to improve, leading to a reduction in rejection rates. Similarly, mutual commitment to sharing the right information at the right time between all levels of the SC enhances trust. SC partners share information to perform comprehensive quality planning and to ingrain total quality management. This study contributes to the operations management literature, which faces a scarcity of empirical examination of this subject. It views information sharing as a building block that firms can promote and evolve to leverage the operational capability of all SC members. It will provide insights for Indian managers and researchers, as every market is unique and suppliers and buyers are driven by local laws, industry status and future vision. While the major emphasis in this paper is on SC operations between domestic partners, placing more focus on international SCs could bring distinguished results.
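The pair-wise comparison stage of ANP reduces, at its core, to extracting a priority vector from a reciprocal comparison matrix. A hedged sketch using the principal-eigenvector method that ANP shares with AHP (the 3×3 matrix, its judgment values, and the indicator names are invented, not the study's data):

```python
import numpy as np

# Hedged sketch: priority vector from a pairwise comparison matrix via the
# principal eigenvector (the AHP/ANP building block). Matrix values invented.

def priority_vector(A, iters=100):
    """Power iteration for the principal eigenvector, normalized to sum to 1."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

# Example judgments: delivery 3x as important as cost, 5x as important as technology.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = priority_vector(A)
print(np.round(w, 3))  # priorities for [delivery, cost, technology]
```

In the full ANP, such local priority vectors are assembled into a supermatrix to capture the interdependencies between the eight SCPI clusters; averaging respondents' vectors then yields the industry-specific index described above.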

Keywords: Indian auto component industry, information sharing, operations management, supply chain performance indicators

Procedia PDF Downloads 545
513 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

Early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although the standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is the delay in reporting results: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed via a 50% relative-efficiency HPGe detector inside 8.5 cm of lead shielding. The spectrometer output signal is then analyzed using DSP electronics and the Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters, so that the device can run autonomously for several months, depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, through which the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is natural background subtraction. As detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide. 
The prototype device proved able to detect atmospheric contamination at the level of mBq/m³ for an 8 h sampling interval.
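The detection-limit trade-off described above (a fixed counting time against a decaying radon-progeny background) can be illustrated with the standard Currie formula. A hedged sketch (the count numbers, efficiencies, and the single effective half-life are invented simplifications, not the HAMRAD analysis):

```python
import math

# Hedged sketch: Currie minimum detectable activity (MDA) for a filter count,
# with background dominated by decaying radon progeny. All numbers invented.

def mda_bq(background_counts, efficiency, emission_prob, live_time_s):
    """Currie detection limit L_D = 2.71 + 4.65*sqrt(B), converted to activity."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * emission_prob * live_time_s)

# Background from short-lived radon daughters decays before counting starts:
b0, half_life_s = 40000.0, 45 * 60        # initial counts, assumed effective half-life
for delay_h in (0, 2, 6):
    b = b0 * 0.5 ** (delay_h * 3600 / half_life_s)
    print(delay_h, "h decay →", mda_bq(b, efficiency=0.05,
                                       emission_prob=0.85, live_time_s=8 * 3600))
```

The MDA falls sharply as the radon-daughter background decays away, which is why an optimal delayed acquisition (or partial spectral sum) exists for each radionuclide of interest.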

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 143
512 Study and Simulation of a Severe Dust Storm over West and South West of Iran

Authors: Saeed Farhadypour, Majid Azadi, Habibolla Sayyari, Mahmood Mosavi, Shahram Irani, Aliakbar Bidokhti, Omid Alizadeh Choobari, Ziba Hamidi

Abstract:

In recent decades, the frequency of dust events has increased significantly in the west and south-west of Iran. First, a survey of dust events during the period 1990-2013 is conducted using historical dust data collected at six weather stations scattered over the west and south-west of Iran. After statistical analysis of the observational data, one of the most severe dust storm events in the region, which occurred from 3rd to 6th July 2009, is selected and analyzed. The WRF-Chem model is used to simulate the PM10 load and its transport into the affected areas. The initial and lateral boundary conditions for the model were obtained from GFS data with 0.5°×0.5° spatial resolution. In the simulation, two aerosol schemes (GOCART and MADE/SORGAM) with three options (chem_opt = 106, 300 and 303) were evaluated. The statistical analysis of the historical data showed that the south-west of Iran has a high frequency of dust events: Bushehr station has the highest frequency among the stations and Urmia station the lowest. Over the period 1990 to 2013, the years 2009 and 1998, with 3221 and 100 events respectively, had the highest and lowest numbers of dust events; by month, June and July had the highest frequency of dust events and December the lowest. Furthermore, the model results showed that the MADE/SORGAM scheme predicted the values and trends of PM10 better than the other schemes and showed the best performance in comparison with the observations. Finally, the PM10 distributions and surface wind maps obtained from the numerical modeling showed the formation of dust plumes in Iraq and Syria and their transport to the west and south-west of Iran. In addition, comparing the MODIS satellite image acquired on 4th July 2009 with the model output at the same time demonstrated the good ability of WRF-Chem to simulate the spatial distribution of dust.
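Scheme comparisons like the one above typically come down to a few standard skill scores computed against station observations. A hedged sketch (the two PM10 series below are invented placeholders, not data from the study):

```python
import math

# Hedged sketch: bias, RMSE and correlation for model-vs-observed PM10.
# The two series below are invented placeholders, not the study's data.

def skill_scores(obs, mod):
    n = len(obs)
    bias = sum(m - o for o, m in zip(obs, mod)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for o, m in zip(obs, mod)) / n)
    mo, mm = sum(obs) / n, sum(mod) / n
    cov = sum((o - mo) * (m - mm) for o, m in zip(obs, mod))
    corr = cov / math.sqrt(sum((o - mo) ** 2 for o in obs) *
                           sum((m - mm) ** 2 for m in mod))
    return bias, rmse, corr

obs = [120.0, 300.0, 850.0, 400.0, 150.0]   # hourly PM10, µg/m³ (invented)
mod = [100.0, 350.0, 780.0, 450.0, 180.0]
print(skill_scores(obs, mod))
```

Ranking the chem_opt configurations by such scores at each station is one straightforward way to substantiate the conclusion that MADE/SORGAM tracks the observed PM10 best.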

Keywords: dust storm, MADE/SORGAM scheme, PM10, WRF-Chem

Procedia PDF Downloads 267
511 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”

Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen

Abstract:

Exoplanet atmospheric parameter retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet’s atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions, where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods therefore compromises model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on the speed and accuracy of previous models. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low-power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real- or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. With the edge-computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.

Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval

Procedia PDF Downloads 164
510 Transport Hubs as Loci of Multi-Layer Ecosystems of Innovation: Case Study of Airports

Authors: Carolyn Hatch, Laurent Simon

Abstract:

Urban mobility and the transportation industry are undergoing a transformation, shifting from an auto production-consumption model that has dominated since the early 20th century towards new forms of personal and shared multi-modality [1]. This is shaped by key forces such as climate change, which has induced a shift in production and consumption patterns and efforts to decarbonize and improve transport services through, for instance, the integration of vehicle automation, electrification and mobility sharing [2]. Advanced innovation practices and platforms for experimentation and validation of new mobility products and services that are increasingly complex and multi-stakeholder-oriented are shaping this new world of mobility. Transportation hubs – such as airports - are emblematic of these disruptive forces playing out in the mobility industry. Airports are emerging as the core of innovation ecosystems on and around contemporary mobility issues, and increasingly recognized as complex public/private nodes operating in many societal dimensions [3,4]. These include urban development, sustainability transitions, digital experimentation, customer experience, infrastructure development and data exploitation (for instance, airports generate massive and often untapped data flows, with significant potential for use, commercialization and social benefit). Yet airport innovation practices have not been well documented in the innovation literature. This paper addresses this gap by proposing a model of airport innovation that aims to equip airport stakeholders to respond to these new and complex innovation needs in practice. 
The methodology involves: 1 – a literature review bringing together key research and theory on airport innovation management, open innovation and innovation ecosystems in order to evaluate airport practices through an innovation lens; 2 – an international benchmarking of leading airports and their innovation practices, including such examples as Aéroports de Paris, Schiphol in Amsterdam, Changi in Singapore, and others; and 3 – semi-structured interviews with airport managers on key aspects of organizational practice, facilitated through a close partnership with the Airports Council International (ACI), a major stakeholder in this research project. Preliminary results find that the most successful airports are those that have shifted to a multi-stakeholder, platform-ecosystem model of innovation. The recent entrance of new actors into airports (Google, Amazon, Accor, Vinci, Airbnb and others) has forced the opening of organizational boundaries to share and exchange knowledge with a broader set of ecosystem players. This has also led to new forms of governance and intermediation by airport actors to connect complex, highly distributed knowledge, along with new kinds of inter-organizational collaboration, co-creation and collective ideation processes. Leading airports in the case study have demonstrated a unique capacity to bring traditionally siloed activities to “think together”, “explore together” and “act together”, to share data, contribute expertise and pioneer new governance approaches and collaborative practices. In so doing, they have successfully integrated these many disruptive change pathways and steered their implementation and coordination towards innovative mobility outcomes, with positive societal, environmental and economic impacts. This research has implications for: 1 – innovation theory, 2 – urban and transport policy, and 3 – organizational practice, within the mobility industry and across the economy.

Keywords: airport management, ecosystem, innovation, mobility, platform, transport hubs

Procedia PDF Downloads 176
509 An Efficient Hardware/Software Workflow for Multi-Cores Simulink Applications

Authors: Asma Rebaya, Kaouther Gasmi, Imen Amari, Salem Hasnaoui

Abstract:

In recent years, applications such as telecommunications, signal processing, and digital communication with advanced features (multi-antenna, equalization, etc.) have witnessed a rapid evolution, accompanied by an increase in user requirements in terms of latency and computational power. To satisfy these requirements, the use of hardware/software systems is a common solution, where the hardware is composed of multiple cores and the software is represented by a model of computation, for instance a synchronous data flow (SDF) graph. Moreover, most embedded-system designers use Simulink for modeling. The issue is how to simplify the generation of C code, for a multi-core platform, from an application modeled in Simulink. To overcome this problem, we propose a workflow that automatically transforms the Simulink model into an SDF graph and provides an efficient schedule that optimizes the number of cores and minimizes latency. This workflow starts from a Simulink application and a hardware architecture described in the IP-XACT language. Based on the synchronous and hierarchical behavior of both models, the Simulink block diagram is automatically transformed into an SDF graph. Once this process is successfully achieved, the scheduler calculates the optimal number of cores needed by minimizing the maximum density of the whole application. Then, a core is chosen to execute a specific graph task in a specific order and, subsequently, compatible C code is generated. To realize this proposal, we extend Preesm, a rapid prototyping tool, to take the Simulink model as input and to support the optimal schedule. Afterward, we compared our results to those of this tool, using a simple illustrative application. The comparison shows that our results strictly dominate the Preesm results in terms of number of cores and latency: if Preesm needs m processors and latency L, our workflow needs at most m processors and a latency L' < L.
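A core step in any Simulink-to-SDF flow is solving the SDF balance equations for the graph's repetition vector. A hedged sketch (the three-actor graph and its rates are invented, and this is the textbook computation rather than the paper's tool chain):

```python
from fractions import Fraction
from math import gcd

# Hedged sketch: repetition vector of an SDF graph from its balance equations
# q[src] * produced == q[dst] * consumed for every edge. Graph is invented.

def repetition_vector(actors, edges):
    """edges: (src, dst, produced, consumed). Returns the smallest integer q."""
    q = {actors[0]: Fraction(1)}
    changed = True
    while changed:                     # propagate ratios over a connected graph
        changed = False
        for src, dst, p, c in edges:
            if src in q and dst not in q:
                q[dst] = q[src] * p / c
                changed = True
            elif dst in q and src not in q:
                q[src] = q[dst] * c / p
                changed = True
    lcm_den = 1
    for f in q.values():
        lcm_den = lcm_den * f.denominator // gcd(lcm_den, f.denominator)
    return {a: int(f * lcm_den) for a, f in q.items()}

# A -(2,3)-> B -(1,2)-> C : A produces 2 tokens per firing, B consumes 3, etc.
q = repetition_vector(["A", "B", "C"],
                      [("A", "B", 2, 3), ("B", "C", 1, 2)])
print(q)  # consistent graph: A fires 3, B fires 2, C fires 1 per iteration
```

The repetition vector fixes how many times each actor fires per graph iteration, which is the input the scheduler needs before allocating firings to cores.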

Keywords: hardware/software system, latency, modeling, multi-cores platform, scheduler, SDF graph, Simulink model, workflow

Procedia PDF Downloads 260
508 Sustainable Integrated Waste Management System

Authors: Lidia Lombardi

Abstract:

Waste management in Europe and North America is evolving towards sustainable materials management, intended as a systemic approach to using and reusing materials more productively over their entire life cycles. Various waste management strategies are prioritized and ranked from the most to the least environmentally preferred, placing emphasis on reducing, reusing, and recycling as keys to sustainable materials management. However, non-recyclable materials must also be appropriately addressed, and waste-to-energy (WtE) offers a solution to manage them, especially when a WtE plant is integrated within a complex system of waste and wastewater treatment plants and potential users of the output flows. To evaluate the environmental effects of such system integration, Life Cycle Assessment (LCA) is a helpful and powerful tool. LCA has been widely applied to the waste management sector since the late 1990s, producing a large number of theoretical studies and real-world applications in support of waste management planning, and it still has a fundamental role in supporting decisions during the development of waste management systems. Thus, LCA was applied to evaluate the environmental performance of a Municipal Solid Waste (MSW) management system, with improved separate material collection and recycling and an integrated network of treatment plants including WtE, anaerobic digestion (AD), and a wastewater treatment plant (WWTP), for a reference study area. The proposed system was compared to the actual situation, characterized by poor recycling, extensive landfilling, and the absence of WtE. The LCA results showed that increased recycling significantly improves the environmental performance, but there is still room for improvement through the introduction of energy recovery (especially by WtE) and through its use within the system, for instance, by feeding heat to the AD, to sludge recovery processes, and in support of water reuse practices.
WtE offers a solution to manage non-recyclable MSW and allows saving important resources (such as landfill volumes and non-renewable energy), reducing the contribution to global warming, and providing an essential contribution to fulfilling the goals of truly sustainable waste management.

Keywords: anaerobic digestion, life cycle assessment, waste-to-energy, municipal solid waste

Procedia PDF Downloads 53
507 An Ecofriendly Approach for the Management of Aedes aegypti L (Diptera: Culicidae) by Ocimum sanctum

Authors: Mohd Shazad, Kamal Kumar Gupta

Abstract:

Aedes aegypti (Diptera: Culicidae), commonly known as the yellow fever mosquito, is the vector of dengue fever, yellow fever, chikungunya, and Zika virus. In the absence of any effective vaccine against these diseases, controlling the mosquito population is the only promising means of preventing them. Currently used chemical insecticides cause environmental contamination, high mammalian toxicity, hazards to non-target organisms, insecticide resistance, and vector resurgence. The present research work aimed to explore the potential of the phytochemicals present in Ocimum sanctum for the management of mosquito populations. The leaves of Ocimum were extracted with ethanol by the cold extraction method. Fourth-instar larvae of Aedes aegypti (0-24 h old) were treated with the extract at concentrations of 50 ppm, 100 ppm, 200 ppm, and 400 ppm for 24 h. Survival, growth, and development of the treated larvae were evaluated, and the adults that emerged from the treated larvae were used for reproductive fitness studies. Our results indicate 77.2% mortality in the larvae exposed to 400 ppm. At lower doses, although there was no significant reduction in survival after 24 h, survival decreased during subsequent days of observation. In the control experiments, no mortality was observed. It was also observed that the larvae that survived the treatment showed severe growth and developmental abnormalities, including a significant increase in larval duration: in the control, fourth instars moulted into pupae after 3 days, while larvae treated with the 400 ppm extract moulted after 4.6 days. Larva-pupa intermediates and pupa-adult intermediates were observed in many cases. The adults that emerged from the treated larvae showed impaired mating and oviposition behaviour; the females exhibited a longer preoviposition period, a reduced oviposition rate, and decreased egg output. GC-MS analysis of the ethanol extract revealed the presence of JH mimics and intermediates of the JH biosynthetic pathway.
The potential of Ocimum sanctum in an integrated vector management programme for Aedes aegypti is discussed.
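A standard way to report the treatment mortality above is Abbott's correction, which adjusts observed mortality for any mortality in the control. The sketch below reuses the abstract's 77.2% figure at 400 ppm; since control mortality was zero here the correction is the identity, and the non-zero control value shown is purely a hypothetical illustration.

```python
# Abbott's correction for control mortality in insecticide bioassays:
# corrected% = (observed% - control%) / (100 - control%) * 100.

def abbott_corrected_mortality(observed_pct, control_pct):
    """Mortality attributable to the treatment, as a percentage."""
    return (observed_pct - control_pct) / (100.0 - control_pct) * 100.0

print(abbott_corrected_mortality(77.2, 0.0))   # 0% control mortality -> unchanged
print(abbott_corrected_mortality(77.2, 10.0))  # hypothetical 10% control mortality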

Keywords: Aedes aegypti, Ocimum sanctum, oviposition, survival

Procedia PDF Downloads 178
506 Empirical Modeling and Optimization of Laser Welding of AISI 304 Stainless Steel

Authors: Nikhil Kumar, Asish Bandyopadhyay

Abstract:

Laser welding is a capable technology for forming automobile, microelectronics, marine, and aerospace parts, among others. In the present work, a mathematical and statistical approach is adopted to study the laser welding of AISI 304 stainless steel. A robot-controlled 500 W pulsed Nd:YAG laser source with a 1064 nm wavelength has been used for welding, and butt joints were made. The effects of the welding parameters, namely laser power, scanning speed, and pulse width, on the seam width and depth of penetration have been investigated using empirical models developed by response surface methodology (RSM). Weld quality is directly correlated with the weld geometry. Twenty sets of experiments have been conducted as per the central composite design (CCD) matrix. A second-order mathematical model has been developed for predicting the desired responses. The results of ANOVA indicate that laser power has the most significant effect on the responses. Microstructural analysis as well as hardness testing of selected weld specimens has been carried out to understand the metallurgical and mechanical behaviour of the weld. The average micro-hardness of the weld is observed to be higher than that of the base metal; this higher hardness is the result of grain refinement and δ-ferrite formation in the weld structure. The results suggest that lower line energy generally produces a finer grain structure and better mechanical properties than high line energy. The combined effects of the input parameters on the responses have been analyzed with the help of the developed 3-D response surfaces and contour plots. Finally, multi-objective optimization has been conducted to produce a weld joint with complete penetration, minimum seam width, and an acceptable welding profile. Confirmatory tests have been conducted at the optimum parametric conditions to validate the applied optimization technique.
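The second-order RSM model described above has the form y = b0 + Σ bi·xi + Σ bii·xi² + Σ bij·xi·xj, fitted by least squares. A minimal sketch follows; the three factor names mirror the abstract, but the 20 runs and coefficients are synthetic stand-ins, not the paper's CCD data.

```python
import numpy as np

# Fit a full second-order response surface in three coded factors
# (power p, speed s, pulse width w) by ordinary least squares.

def design_matrix(X):
    p, s, w = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), p, s, w,      # linear terms
                            p * p, s * s, w * w,           # quadratic terms
                            p * s, p * w, s * w])          # interaction terms

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 3))   # 20 coded runs, as in a CCD
true_b = np.array([1.0, 0.5, -0.3, 0.2, 0.1, 0.0, 0.05, -0.2, 0.0, 0.1])
y = design_matrix(X) @ true_b          # noiseless synthetic response
b_hat, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
```

With noiseless data and more runs than coefficients, the fit recovers the generating coefficients exactly, which is the sense in which the fitted polynomial "predicts the desired responses".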

Keywords: ANOVA, laser welding, modeling and optimization, response surface methodology

Procedia PDF Downloads 291
505 Cost-Effective and Optimal Control Analysis for Mitigation Strategy to Chocolate Spot Disease of Faba Bean

Authors: Haileyesus Tessema Alemneh, Abiyu Enyew Molla, Oluwole Daniel Makinde

Abstract:

Introduction: Faba bean is one of the most important plants grown worldwide for humans and animals. Despite its diverse significance, several biotic and abiotic factors have limited faba bean output. Many faba bean pathogens have been reported so far, of which the most important yield-limiting disease is chocolate spot disease (Botrytis fabae). The dynamics of disease transmission and the decision-making processes for intervention programs for disease control are now better understood through the use of mathematical modeling, and many mathematical modeling researchers are currently interested in plant disease modeling. Objective: In this paper, a deterministic mathematical model for chocolate spot disease (CSD) on the faba bean plant with an optimal control model was developed and analyzed to examine the best strategy for controlling CSD. Methodology: Three control interventions, prevention (u1), quarantine (u2), and chemical control (u3), are employed to establish the optimal control model. The optimality system, the characterization of the controls, the adjoint variables, and the Hamiltonian are all generated by employing Pontryagin's maximum principle. A cost-effective approach is chosen from a set of possible integrated strategies using the incremental cost-effectiveness ratio (ICER). The forward-backward sweep iterative approach is used to run the numerical simulations. Results: The Hamiltonian, the optimality system, the characterization of the controls, and the adjoint variables were established. The numerical results demonstrate that each integrated strategy can reduce the disease within the specified period. However, due to limited resources, an integrated strategy of prevention and uprooting was found to be the most cost-effective strategy to combat CSD.
Conclusion: Therefore, attention should be given to this integrated, cost-effective, and environmentally friendly strategy by stakeholders and policymakers to control CSD, and the integrated intervention should be disseminated to farmers in order to fight the spread of CSD in the faba bean population and produce the expected yield from the field.
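The ICER used above to rank strategies is the ratio of the incremental cost to the incremental effect between two strategies. A minimal sketch, where the strategy names echo the abstract's controls but the cost and effect figures are hypothetical:

```python
# ICER = (cost_A - cost_B) / (effect_A - effect_B): the extra cost per
# extra unit of health effect when moving from strategy B to strategy A.

def icer(cost_a, effect_a, cost_b, effect_b):
    return (cost_a - cost_b) / (effect_a - effect_b)

# strategies listed in order of increasing effect (e.g. infections averted);
# all numbers below are hypothetical illustrations
strategies = [("prevention only", 40.0, 900.0),
              ("prevention + quarantine", 70.0, 1200.0)]
(_, c0, e0), (_, c1, e1) = strategies
print(icer(c1, e1, c0, e0))   # extra cost per extra unit of effect
```

In practice strategies are sorted by effect, ICERs are computed pairwise, and dominated strategies (higher cost, lower effect) are eliminated before the comparison.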

Keywords: CSD, optimal control theory, Pontryagin’s maximum principle, numerical simulation, cost-effectiveness analysis

Procedia PDF Downloads 77
504 Constraints on Source Rock Organic Matter Biodegradation in the Biogenic Gas Fields in the Sanhu Depression, Qaidam Basin, Northwestern China: A Study of Compound Concentration and Concentration Ratio Changes Using GC-MS Data

Authors: Mengsha Yin

Abstract:

Extractable organic matter (EOM) from thirty-six biogenic gas source rocks from the Sanhu Depression in the Qaidam Basin, northwestern China, was obtained via Soxhlet extraction. SARA (saturates, aromatics, resins, and asphaltenes) separation was conducted on twenty-nine of them for bulk composition analysis. The saturated and aromatic fractions of all the extracts were analyzed by Gas Chromatography-Mass Spectrometry (GC-MS) to investigate the compound compositions. More abundant n-alkanes, naphthalene, phenanthrene, dibenzothiophene, and their alkylated products occur in samples at shallower depths. From 2000 m downward, the concentrations of these compounds increase sharply, and the concentration ratios of more- over less-biodegradation-susceptible compounds coincidently decrease dramatically. The ∑iC15-16, 18-20/∑nC15-16, 18-20 and hopanoids/∑n-alkanes concentration ratios, as well as the mono- and tri-aromatic sterane concentrations and concentration ratios, frequently fluctuate with depth rather than trending with it, reflecting effects of organic input and paleoenvironments rather than biodegradation. The saturated and aromatic compound distributions on the saturate and aromatic total ion chromatogram (TIC) traces of the samples display different degrees of biodegradation. The dramatic and simultaneous variations in compound concentrations and their ratios at 2000 m, and their changes with depth beneath it, jointly justify the crucial control of burial depth on the scale of organic matter biodegradation in source rocks and prompt the proposition that 2000 m is the bottom depth boundary for active microbial activity in this study. The study helps to better constrain the depth conditions under which effective source rocks occur in the Sanhu biogenic gas fields and calls for additional attention to source rock pore size estimation during biogenic gas source rock appraisals.
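The screening logic above can be sketched as a per-sample susceptibility ratio combined with the ~2000 m cut-off below which active microbial alteration is inferred to stop. The function names and sample values are hypothetical; only the 2000 m boundary comes from the study.

```python
# Minimal sketch of a biodegradation screen: iso/n-alkane concentration
# ratio per sample, plus the inferred active-biodegradation depth cut-off.

def susceptibility_ratio(iso_alkanes, n_alkanes):
    """A high iso/n ratio suggests preferential microbial removal of n-alkanes."""
    return iso_alkanes / n_alkanes

def is_in_biodegradation_zone(depth_m, cutoff_m=2000.0):
    """Depths shallower than the cut-off lie within the active microbial zone."""
    return depth_m < cutoff_m

# hypothetical (depth_m, sum iC15-20, sum nC15-20) samples
for depth, iso, normal in [(1500, 8.0, 4.0), (2500, 1.5, 12.0)]:
    print(depth, round(susceptibility_ratio(iso, normal), 2),
          is_in_biodegradation_zone(depth))
```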

Keywords: pore space, Sanhu depression, saturated and aromatic hydrocarbon compound concentration, source rock organic matter biodegradation, total ion chromatogram

Procedia PDF Downloads 150
503 Performance and Specific Emissions of an SI Engine Using Anhydrous Ethanol–Gasoline Blends in the City of Bogota

Authors: Alexander García Mariaca, Rodrigo Morillo Castaño, Juan Rolón Ríos

Abstract:

The government of Colombia has promoted the use of biofuels over the last 20 years through laws and resolutions that regulate their use, with the objective of improving atmospheric air quality and promoting the Colombian agricultural industry. However, despite the use of blends of biofuels with fossil fuels, the air quality in large cities has not improved. This deterioration is mainly caused by mobile sources that work with spark-ignition internal combustion engines (SI-ICE) operating on a mixture, by volume, of 90% gasoline and 10% ethanol, called E10, which in the case of Bogota represents 84% of the fleet. Another problem is that Colombia has big cities located above 2200 masl, and there are no accurate studies on the impact that the E10 mixture could have on the emissions and performance of SI-ICE. This study aims to establish the optimal blend of gasoline and ethanol at which an SI engine operates most efficiently in urban centres located at 2600 masl. The tests were carried out on a four-stroke, single-cylinder, naturally aspirated SI engine with a carburettor for the fuel supply, using blends of gasoline and anhydrous ethanol in the ratios E10, E15, E20, E40, E60, E85, and E100. These tests were conducted in the city of Bogota, which is located at 2600 masl, with the engine operating at 3600 rpm and at 25, 50, 75, and 100% of load. The results show that performance variables such as engine brake torque, brake power, and brake thermal efficiency decrease, while brake-specific fuel consumption increases, as the percentage of ethanol in the mixture rises. On the other hand, the specific emissions of CO2 and NOx increase, while the specific emissions of CO and HC decrease, compared to those produced by gasoline.
From the tests, it is concluded that the SI-ICE worked most efficiently with the E40 mixture, which yielded an increase in brake power of 8.81% and a reduction in brake-specific fuel consumption of 2.5%, coupled with reductions in the specific emissions of CO2, HC, and CO of 9.72, 52.88, and 76.66%, respectively, compared to the results obtained with the E10 blend. This behaviour occurs because the E40 mixture provides the appropriate amount of oxygen for the combustion process, which leads to better utilization of the available energy, thus generating a power output comparable to the E10 mixture while producing lower CO and HC emissions than the other test blends. Nevertheless, the emission of NOx increased by 106.25%.
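Brake-specific fuel consumption, the metric compared across blends above, is fuel mass flow divided by brake power. The sketch below shows the formula and the percent-change comparison against the E10 baseline; the flow and power values are hypothetical, and only the 8.81% figure comes from the abstract.

```python
# BSFC = fuel mass flow / brake power (g/kWh), plus the percent-change
# comparison used to contrast E40 against the E10 baseline.

def bsfc_g_per_kwh(fuel_flow_g_per_h, brake_power_kw):
    return fuel_flow_g_per_h / brake_power_kw

def percent_change(new, base):
    return (new - base) / base * 100.0

print(bsfc_g_per_kwh(750.0, 3.0))                 # hypothetical: 750 g/h at 3 kW
base_power = 3.0                                   # hypothetical E10 brake power, kW
e40_power = base_power * 1.0881                    # +8.81% for E40, per the abstract
print(round(percent_change(e40_power, base_power), 2))
```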

Keywords: emissions, ethanol, gasoline, engine, performance

Procedia PDF Downloads 320
502 Focus Group Study Exploring Researchers Perspective on Open Science Policy

Authors: E. T. Svahn

Abstract:

Knowledge about the factors that influence the exchange between research and society is of the utmost importance for developing collaboration between different actors, especially in future science policy development and the creation of support structures for researchers. Among these factors are how researchers view the surrounding open science policy environment and what conditions and attitudes they bring to interacting with it. This paper examines Finnish researchers' attitudes towards open science policies in 2020. Open science is an integrated part of researchers' daily lives and supports not only the effectiveness of research outputs but also the quality of research. In the ideal situation, open science policy is seen as a supporting structure that enables the exchange between research and society, but in other situations, it can end up being red tape that generates obstacles and hinders the possibility of doing science efficiently. The data for this study were collected through focus group interviews. This qualitative research method was selected because it aims at understanding the phenomenon under study. In addition, focus group interviews produce diverse and rich material that would not be available with other research methods. Focus group interviews have well-established applications in social science, especially in understanding the perspectives and experiences of research subjects. In this study, focus groups were used to study the mindset and actions of researchers. Each group's size was between 4 and 10 people, and the aim was to bring out different perspectives on the subject. The interviewer enabled the presentation of different perceptions and opinions, and the focus group interviews were recorded and transcribed. The material was analysed using the grounded theory method. The results are presented as thematic areas, a theoretical model, and direct quotations. Attitudes towards open science policy can vary greatly depending on the research area.
This study shows that the open science policy demands in medicine, technology, and the natural sciences vary somewhat compared to the social sciences, educational sciences, and the humanities. The variation in attitudes between research areas can thus be largely explained by the fact that research outputs and ethical codes vary significantly between certain subjects. This study aims to increase understanding of the extent to which open science policies should be tailored to different disciplines and research areas.

Keywords: focus group interview, grounded theory, open science policy, science policy

Procedia PDF Downloads 150
501 Simon Says: What Should I Study?

Authors: Fonteyne Lot

Abstract:

SIMON (Study capacities and Interest Monitor) is a freely accessible online self-assessment tool that allows secondary education pupils to evaluate their interests and capacities in order to choose a post-secondary major that maximally suits their potential. The tool consists of two broad domains that correspond to two general questions pupils ask: 'What study fields interest me?' and 'Am I capable of succeeding in this field of study?'. The first question is addressed by a RIASEC-type interest inventory that links personal interests to post-secondary majors. Pupils are provided with a personal profile and an overview of majors with their degree of congruence. The output is dynamic: respondents can manipulate their scores and compare their results to the profiles of all fields of study. In that way, they are stimulated to explore the broad range of majors. To answer whether pupils are capable of succeeding in a preferred major, a battery of tests is provided. This battery comprises a range of factors that are predictive of academic success. Traditional predictors such as (educational) background and cognitive variables (mathematical and verbal skills) are included. Moreover, non-cognitive predictors of academic success (such as motivation, test anxiety, academic self-efficacy, and study skills) are assessed. These non-cognitive factors are generally not included in admission decisions, although research shows they are incrementally predictive of success and are less discriminating. These tests inform pupils of potential causes of success and failure. More importantly, pupils receive their personal chances of success per major. These differential probabilities are validated through underlying research on the academic success of students. For example, this research has shown that we can identify 22% of the failing students in psychology and educational sciences; in this group, our prediction is 95% accurate.
SIMON leads more students to a suitable major, which in turn improves student success and retention. Apart from these benefits, the instrument grants insight into risk factors of academic failure. It also supports and fosters the development of evidence-based remedial interventions and therefore enables a more efficient use of resources.
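The two figures quoted above map onto standard confusion-matrix terms: "identify 22% of the failing students" corresponds to recall, and "95% accurate" among those flagged corresponds to precision. A sketch with hypothetical counts chosen to reproduce those rates:

```python
# Recall = TP / (TP + FN): share of truly failing students the tool flags.
# Precision = TP / (TP + FP): share of flagged students who truly fail.

def precision(tp, fp):
    return tp / (tp + fp)

def recall(tp, fn):
    return tp / (tp + fn)

# hypothetical counts: 100 failing students, 22 correctly flagged, 1 false alarm
tp, fp, fn = 22, 1, 78
print(round(recall(tp, fn), 2))      # matches the 22% identification rate
print(round(precision(tp, fp), 3))   # 22/23, close to the quoted 95% accuracy
```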

Keywords: academic success, online self-assessment, student retention, vocational choice

Procedia PDF Downloads 397
500 Non-intrusive Hand Control of Drone Using an Inexpensive and Streamlined Convolutional Neural Network Approach

Authors: Evan Lowhorn, Rocio Alba-Flores

Abstract:

The purpose of this work is to develop a method for classifying hand signals and using the output in a drone control algorithm. To achieve this, methods based on Convolutional Neural Networks (CNNs) were applied. CNNs are a subset of deep learning models that allow grid-like inputs to be processed and passed through a neural network to be trained for classification. This type of neural network allows for classification via imaging, which is less intrusive than previous methods using biosensors, such as EMG sensors. Classification CNNs operate purely on the pixel values in an image; therefore, they can be used without additional exteroceptive sensors. A development bench was constructed using a desktop computer connected to a high-definition webcam mounted on a scissor arm. This allowed the camera to be pointed downward at the desk to provide a constant solid background for the dataset and a clear detection area for the user. A MATLAB script was created to automate dataset image capture at the development bench and save the images to the desktop, which allowed the user to create a dataset of 12,000 images within three hours. These images were evenly distributed among seven classes: forward, backward, left, right, idle, land, and, since the drone has a popular flip function, an additional flip class. To simplify control, the corresponding hand signals chosen were the numerical hand signs for one through five for the movements, a fist for land, and the universal 'ok' sign for the flip command. Transfer learning with PyTorch (Python) was performed using a pre-trained 18-layer residual network (ResNet-18), retraining the network for the custom classification task. An algorithm was created to interpret the classifications and send encoded messages to a Ryze Tello drone over its 2.4 GHz Wi-Fi connection. The drone's movements were performed in half-meter distance increments at a constant speed.
When combined with the drone control algorithm, the classification performed as desired, with negligible latency compared to the delay in the drone's movement commands.
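The last stage described above, interpreting a predicted class and encoding a drone command, can be sketched as a simple mapping from labels to Tello SDK command strings. The class list mirrors the abstract; the exact strings (and that the authors used this encoding) are assumptions based on the public Tello SDK, where commands like "forward 50" or "flip f" are sent as UDP text packets.

```python
# Hypothetical sketch: map a classifier label to a Tello SDK command string.
STEP_CM = 50   # half-meter increments, as stated in the abstract

COMMANDS = {
    "forward": f"forward {STEP_CM}",
    "backward": f"back {STEP_CM}",
    "left": f"left {STEP_CM}",
    "right": f"right {STEP_CM}",
    "land": "land",
    "flip": "flip f",   # 'f' = forward flip in the Tello SDK
    "idle": None,       # idle class -> no packet sent
}

def encode(label):
    """Return the command string for a predicted class, or None for idle."""
    return COMMANDS[label]

print(encode("forward"))
```

In the real system this string would be sent over the drone's 2.4 GHz Wi-Fi link with a UDP socket; only the mapping itself is shown here.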

Keywords: classification, computer vision, convolutional neural networks, drone control

Procedia PDF Downloads 203
499 Effect of Sulphur Concentration on Microbial Population and Performance of a Methane Biofilter

Authors: Sonya Barzgar, J. Patrick, A. Hettiaratchi

Abstract:

Methane (CH4) is reputed to be the second largest contributor to the greenhouse effect, with a global warming potential (GWP) of 34 relative to carbon dioxide (CO2) over a 100-year horizon, so there is a growing interest in reducing emissions of this gas. Methane biofiltration (MBF) is a cost-effective technology for reducing low-volume point-source emissions of methane. In this technique, microbial oxidation of methane is carried out by methane-oxidizing bacteria (methanotrophs), which use methane as a carbon and energy source. MBF uses a granular medium, such as soil or compost, to support the growth of the methanotrophic bacteria responsible for converting methane to carbon dioxide (CO2) and water (H2O). Even though the biofiltration technique has been shown to be an efficient, practical, and viable technology, the design and operational parameters, as well as the relevant microbial processes, have not been investigated in depth. In particular, limited research has been done on the effects of sulphur on methane bio-oxidation. Since bacteria require a variety of nutrients for growth, it is important, in order to improve the performance of methane biofiltration, to establish the quantities of nutrients to be provided to the biofilter so that the process can be sustained. The study described in this paper was conducted with the aim of determining the influence of sulphur on methane elimination in a biofilter. A set of experimental measurements was carried out to explore how the conversion of elemental sulphur affects methane oxidation in terms of methanotroph growth and system pH. Batch experiments with different concentrations of sulphur were performed while keeping the other parameters, i.e., moisture content, methane concentration, oxygen level, and compost, at their optimum levels.
The study revealed the tolerable limit of sulphur without any interference with methane oxidation, as well as the particular sulphur concentration leading to the greatest methane elimination capacity. Due to sulphur oxidation, the pH varies in a transient way, which affects microbial growth behavior. Methanotrophs are incapable of growth at pH values below 5.0 and thus are apparently unable to oxidize methane under such conditions. Herein, the pH for optimal growth of the methanotrophic bacteria is obtained. Finally, the methane concentration over time in the presence of sulphur is also presented for laboratory-scale biofilters.
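The "methane elimination capacity" mentioned above is conventionally computed from the gas flow rate, the inlet/outlet concentrations, and the filter bed volume. A minimal sketch of these standard biofilter metrics; the numerical inputs are hypothetical, not measurements from this study.

```python
# Standard biofilter performance metrics:
#   elimination capacity EC = Q * (C_in - C_out) / V   [g CH4 m^-3 h^-1]
#   removal efficiency  RE = (C_in - C_out) / C_in * 100   [%]

def elimination_capacity(q_m3_per_h, c_in_g_per_m3, c_out_g_per_m3, v_m3):
    """Mass of methane removed per unit of filter-bed volume per hour."""
    return q_m3_per_h * (c_in_g_per_m3 - c_out_g_per_m3) / v_m3

def removal_efficiency(c_in_g_per_m3, c_out_g_per_m3):
    return (c_in_g_per_m3 - c_out_g_per_m3) / c_in_g_per_m3 * 100.0

# hypothetical run: 0.5 m3/h of gas, 30 -> 6 g/m3 across a 0.1 m3 bed
print(elimination_capacity(0.5, 30.0, 6.0, 0.1))
print(removal_efficiency(30.0, 6.0))
```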

Keywords: global warming, methane biofiltration (MBF), methane oxidation, methanotrophs, pH, sulphur

Procedia PDF Downloads 230
498 A Study on ZnO Nanoparticles Properties: An Integration of Rietveld Method and First-Principles Calculation

Authors: Kausar Harun, Ahmad Azmin Mohamad

Abstract:

Zinc oxide (ZnO) has been extensively used in optoelectronic devices, with recent interest as a photoanode material in dye-sensitized solar cells. Numerous methods have been employed to synthesize ZnO experimentally, while others model it theoretically. Both approaches provide information on ZnO properties, but theoretical calculation has proved to be more accurate and time-efficient. Thus, integration of these two methods is essential to closely resemble the properties of the synthesized ZnO. In this study, experimentally grown ZnO nanoparticles were prepared by the sol-gel storage method with zinc acetate dihydrate as the precursor and methanol as the solvent. A 1 M sodium hydroxide (NaOH) solution was used as a stabilizer. The optimum time to produce ZnO nanoparticles was recorded as 12 hours. Phase and structural analysis showed that single-phase ZnO was produced with the wurtzite hexagonal structure. Further quantitative analysis was done via the Rietveld refinement method to obtain structural and crystallite parameters such as the lattice dimensions, space group, and atomic coordinates. The lattice dimensions were a = b = 3.2498 Å and c = 5.2068 Å, which were later used as the main input for the first-principles calculations. By applying density functional theory (DFT) as embedded in the CASTEP computer code, the structure of the synthesized ZnO was built and optimized using several exchange-correlation functionals. The generalized gradient approximation functional with Perdew-Burke-Ernzerhof and Hubbard U corrections (GGA-PBE+U) yielded the structure with the lowest energy and lattice deviations. In this study, emphasis was also given to the modification of the valence electron energy levels to overcome the underestimation in DFT calculations. The Zn and O valence energies were fixed at Ud = 8.3 eV and Up = 7.3 eV, respectively. The electronic and optical properties of the synthesized ZnO were then calculated based on the GGA-PBE+U functional within the ultrasoft pseudopotential method.
In conclusion, the incorporation of Rietveld analysis into first-principles calculation is valid, as the resulting properties were comparable with those reported in the literature. The time taken to evaluate certain properties via physical testing can thus be eliminated, as the simulation can be done through computational methods.
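As a quick cross-check on the refined lattice parameters quoted above (a = b = 3.2498 Å, c = 5.2068 Å), the hexagonal cell volume is V = (√3/2)·a²·c, and the c/a ratio can be compared with the ideal wurtzite value √(8/3) ≈ 1.633. This sketch simply evaluates those formulas; no claim beyond the abstract's lattice constants is made.

```python
import math

a, c = 3.2498, 5.2068                     # Å, from the Rietveld refinement
volume = math.sqrt(3) / 2 * a ** 2 * c    # hexagonal cell volume, Å^3
c_over_a = c / a                          # ideal wurtzite: sqrt(8/3) ~ 1.633
print(round(volume, 2), round(c_over_a, 4))
```

The slightly compressed c/a relative to the ideal value is typical of real wurtzite ZnO.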

Keywords: density functional theory, first-principles, Rietveld-refinement, ZnO nanoparticles

Procedia PDF Downloads 304
497 The Golden Bridge for Better Farmers Life

Authors: Giga Rahmah An-Nafisah, Lailatus Syifa Kamilah

Abstract:

Agriculture in Indonesia has improved in recent years. Since the election of the new president, whose work program prioritizes food self-sufficiency, many measures have been planned carefully, all to maximize agricultural production for the future. But looking from another side, something is missing: improvement of the livelihoods of the farmers themselves. It is useless to fix the entire agricultural processing system to maximize output if the heroes of agriculture do not move toward a better life. The broker, or middleman, system for agricultural produce is the real problem facing farmers' welfare. However large the harvest, if farmers sell it to middlemen at very low prices, there will be no progress in their welfare. This broker system should not persist in the current agricultural situation: while farming conditions are a cause for concern, middlemen still reap as much profit as possible, no matter how hard farmers struggle to manage their farms, and farmers now also face unavoidable competition from imports. This phenomenon is in plain sight of everyone who looks. Why does it continue? Because the farmers who fall victim to it can do nothing to change the system; often the middlemen are the only buyers for their produce, so the broker system is, so to speak, the only bridge in the economic life of the farmers. The problem is that we should strive for the welfare of our food heroes. A golden bridge that could save them is the government, which, with its powers, can stop this broker system more easily than any other party. The government should be the bridge connecting farmers with consumers, that is, with the people themselves.
The broker system should be replaced by a system that buys agricultural produce from farmers at the highest prices and sells agricultural products to consumers, the people themselves, at the lowest prices. What, then, of the fate of the middlemen? The broker system is, indirectly, like corruption: an activity that harms its victims without being noticed, while the perpetrator continues to enrich himself and the victims' lives remain miserable. The government could transfer the role of the middlemen into this new bridge, employing them as distributors of agricultural products, but under a new policy made by the government to keep improving the welfare of farmers. This idea alone will not have much immediate effect on farmers' welfare, but at the very least it can rally many people to the farmers' cause before the government through daily conversation, spreading as quickly as celebrity gossip reaches many people.

Keywords: broker system, farmers live, government, agricultural economics

Procedia PDF Downloads 287
496 PWM Harmonic Injection and Frequency-Modulated Triangular Carrier to Improve the Lives of the Transformers

Authors: Mario J. Meco-Gutierrez, Francisco Perez-Hidalgo, Juan R. Heredia-Larrubia, Antonio Ruiz-Gonzalez, Francisco Vargas-Merino

Abstract:

More and more applications connect power inverters to transformers, for example, facilities connecting renewable generation to the power grid. It is well known that the output signal of a power inverter is not a pure sine wave. Its harmonic content produces negative effects, one of which is the heating of electrical machines, which in turn shortens their service life. The reduction in transformer life can be calculated with the Arrhenius or Montsinger equation; analyzing this expression, any sustained decrease of 6-7 ºC in transformer temperature doubles its life expectancy. Methodologies: This work presents a pulse width modulation (PWM) technique with harmonic injection and a triangular frequency carrier modulated in frequency. This technique is used to improve the quality of the output voltage signal of PWM-controlled power inverters. The proposed technique increases the fundamental term and significantly reduces low-order harmonics with the same number of commutations per period as sinusoidal PWM control. To achieve this, the modulating wave is compared to a triangular carrier whose frequency varies over the period of the modulator. It is therefore advantageous for the modulating signal to carry a large amount of sinusoidal “information” in the areas of greater sampling. A triangular signal whose frequency varies over the modulator’s period is used as a carrier, obtaining more samples in the area with the greatest slope. A power inverter controlled by the proposed PWM technique is connected to a transformer. Results: In order to verify the derived thermal parameters under different operating conditions, a further ambient and loading scenario, sampled from the same power transformer, is used for verification. Temperatures of different parts of the transformer are reported for each PWM control technique analyzed.
The temperature is assessed under each PWM control technique, and the life of the transformer is then calculated for each. Conclusion: This paper analyzes the transformer heating produced by the proposed technique and compares it with other forms of PWM control. It can be seen that the reduction in harmonic content produces less transformer heating and, therefore, an increase in the life of the transformer.
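The Montsinger rule cited in the abstract can be sketched in a few lines; this is a minimal illustration assuming the commonly quoted 6 ºC doubling step, not the study's own thermal model:

```python
def life_multiplier(delta_t_c: float, doubling_step_c: float = 6.0) -> float:
    """Montsinger rule: each sustained temperature drop of roughly 6 degC
    doubles a transformer's life expectancy (and each rise halves it)."""
    return 2.0 ** (delta_t_c / doubling_step_c)

# A 6 degC cooler hot-spot roughly doubles life expectancy:
print(life_multiplier(6.0))    # 2.0
# A 12 degC hotter hot-spot cuts it to about a quarter:
print(life_multiplier(-12.0))  # 0.25
```

This is why even a modest reduction in harmonic heating translates into a large gain in expected transformer life.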

Keywords: heating, power-inverter, PWM, transformer

Procedia PDF Downloads 406
495 Evaluation of Public Library Adult Programs: Use of Servqual and Nippa Assessment Standards

Authors: Anna Ching-Yu Wong

Abstract:

This study aims to identify the quality and effectiveness of the adult programs provided by a public library using the ServQUAL method and the National Library Public Programs Assessment guidelines (NIPPA, June 2019). ServQUAL covers several variables, namely: tangibles, reliability, responsiveness, assurance, and empathy. The NIPPA guidelines focus on program characteristics, particularly on outcomes: the level of satisfaction of program participants. The population reached was adults who participated in library adult programs at a small-town public library in Kansas. This study was designed as quantitative evaluative research that analyzed the quality and effectiveness of the library adult programs by analyzing the role of each factor based on ServQUAL and the NIPPA library program assessment guidelines. Data were collected from November 2019 to January 2020 using a questionnaire with a Likert scale, and were analyzed in a descriptive quantitative manner. This research provides information about the quality and effectiveness of existing programs, which can be used as input when developing strategies for future adult programs. Overall, the ServQUAL measurement indicates very good quality, but each variable still has areas that need improvement and emphasis: the tangibles variable in the temperature and space of the meeting room; the reliability variable in the timely delivery of the programs; the responsiveness variable in the presenters' ability to convey trust and confidence to participants; the assurance variable in the knowledge and skills of program presenters; and the empathy variable in the presenters' willingness to provide extra assistance. The result of the program outcomes measurement based on the NIPPA guidelines is very positive.
Over 96% of participants indicated that the programs were informative and fun, that they learned new knowledge and new skills, and that they would recommend the programs to their friends and families. They believed that, together, the library and participants build stronger and healthier communities.
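The descriptive analysis of Likert-scale responses by ServQUAL dimension can be sketched as follows; the response values here are hypothetical, since the study's raw data are not given in the abstract:

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses grouped by ServQUAL dimension;
# the real survey items and data are not reported in the abstract.
responses = {
    "tangibles":      [4, 5, 3, 4, 4],
    "reliability":    [5, 4, 4, 5, 5],
    "responsiveness": [3, 4, 4, 3, 4],
    "assurance":      [4, 4, 5, 4, 4],
    "empathy":        [5, 5, 4, 4, 5],
}

# Descriptive statistics per dimension, as in a descriptive
# quantitative analysis of questionnaire data.
for dimension, scores in responses.items():
    print(f"{dimension:>14}: mean={mean(scores):.2f} sd={stdev(scores):.2f}")
```

Comparing the per-dimension means against a target score is one simple way to flag which variables still need improvement.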

Keywords: ServQual model, ServQual in public libraries, library program assessment, NIPPA library programs assessment

Procedia PDF Downloads 92
494 The Physical and Physiological Profile of Professional Muay Thai Boxers

Authors: Lucy Horrobin, Rebecca Fores

Abstract:

Background: Muay Thai is an increasingly popular combat sport worldwide. Further academic research into the sport will contribute to its professional development. This research sought to produce normative data on the physical and physiological characteristics of professional Muay Thai boxers, as currently no such data exist, with the ultimate aim of informing appropriate training programs and facilitating coaching. Methods: N = 9 professional, adult, male Muay Thai boxers were assessed for the following anthropometric, physical and physiological characteristics, using validated methods of assessment: body fat, hamstring flexibility, maximal dynamic upper body strength, lower limb peak power, upper body muscular endurance and aerobic capacity. Raw scores were analysed for mean, range and SD and, where applicable, were expressed relative to body mass (BM). Results: The results showed characteristics similar to those found in other combat sports. Low percentages of body fat (mean ± SD: 8.54 ± 1.16) allow for optimal power-to-weight ratios. A highly developed aerobic capacity (61.56 ± 5.13 ml·kg⁻¹·min⁻¹) facilitates recovery and power maintenance throughout bouts. Lower limb peak power output values of 12.60 ± 2.09 W/kg indicate that Muay Thai boxers are among the most powerful of combat sport athletes. However, maximal dynamic upper body strength scores of 1.14 ± 0.18 kg/kg were only in the 60th percentile of normative data for the general population, and muscular endurance scores (31.55 ± 11.95) and flexibility scores (19.55 ± 11.89 cm) showed wide standard deviations. These results might suggest that these characteristics are either of little importance in Muay Thai or under-developed, perhaps due to deficient training programs. Implications: This research provides the first normative data on the physical and physiological characteristics of Muay Thai boxers.
The findings of this study will aid trainers and coaches in designing effective evidence-based training programs. Furthermore, it provides a foundation for further research on physiology in Muay Thai. Areas of further study could include determining the physiological demands of a full-rules bout and the effects of evidence-based training programs on performance.
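The body-mass normalisation used for the strength and power scores is simple division; the boxer values below are illustrative assumptions, not data from the study:

```python
def relative_score(raw: float, body_mass_kg: float) -> float:
    """Express a raw test score relative to body mass (e.g. W/kg or kg/kg)."""
    return raw / body_mass_kg

# Hypothetical 70 kg boxer: 880 W lower-limb peak power and an
# 80 kg maximal upper-body lift (values illustrative only).
print(round(relative_score(880, 70), 2))  # ~12.57 W/kg, near the reported mean
print(round(relative_score(80, 70), 2))   # ~1.14 kg/kg
```

Reporting scores relative to BM is what allows comparison across athletes of different sizes and against published combat-sport norms.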

Keywords: fitness testing, Muay Thai, physiology, strength and conditioning

Procedia PDF Downloads 213
493 Delineation of Oil-Polluted Sites in Ibeno LGA, Nigeria

Authors: Ime R. Udotong, Ofonime U. M. John, Justina I. R. Udotong

Abstract:

Ibeno, Nigeria hosts the operational base of Mobil Producing Nigeria Unlimited (MPNU), a subsidiary of ExxonMobil and currently the highest oil and condensate producer in Nigeria. Besides MPNU, other multinational oil companies, such as Shell Petroleum Development Company Ltd, Elf Petroleum Nigeria Ltd and Nigerian Agip Energy, a subsidiary of ENI E&P, operate onshore, on the continental shelf and in the deep offshore of the Atlantic Ocean in Ibeno, Nigeria. This study was designed to survey the oil-impacted sites in Ibeno, Nigeria. A combination of electrical resistivity (ER), ground-penetrating radar (GPR) and physico-chemical as well as microbiological characterization of soil and water samples from the area was carried out. The results revealed that this environment has been contaminated by past crude oil spills, as observed from significant concentrations of total hydrocarbon content (THC), BTEX and heavy metals in the environment. High resistivity values and GPR profiles, clearly showing the distribution, thickness and lateral extent of hydrocarbon contamination as represented by the radargram reflector tones, also corroborate a significant previous oil input. Contamination was of varying degrees, ranging from slight to high, indicating substantial attenuation of the crude oil contamination over time. Hydrocarbon pollution of the study area was confirmed by the results of the soil and water physico-chemical and microbiological analyses. The levels of THC observed in this study are indicative of high levels of crude oil contamination.
Moreover, the relatively lower resistivities of locations outside the impacted areas compared to those within them, the 3-D Cartesian images of the oil contaminant plume (depicted by red, light brown and magenta for high, low and very low oil-impacted areas, respectively), and the high counts of hydrocarbonoclastic microorganisms, in excess of 1%, confirm significant recent pollution of the study area.

Keywords: oil-polluted sites, physico-chemical analyses, microbiological characterization, geotechnical investigations, total hydrocarbon content

Procedia PDF Downloads 386
492 Implementation of Quality Function Development to Incorporate Customer’s Value in the Conceptual Design Stage of a Construction Projects

Authors: Ayedh Alqahtani

Abstract:

Many construction firms in Saudi Arabia dedicated to building projects agree that the most important factor in the real estate market is the value that they can give to their customers. These firms understand the value to their clients in different ways. Value can be defined as the size of the building project in relation to its cost, the design quality of the materials used in finish work, or other features of the building's rooms, such as the bathroom. Value can also be understood as something suitable for the money the client is investing in the new property. A quality tool is required to support companies in reaching a solution for the building project and in understanding and managing the customer's needs. The Quality Function Development (QFD) method can play this role, since the main difference between QFD and other conventional quality management tools is that QFD is a valuable and very flexible design tool that takes the voice of the customer (VOC) into account. Currently, organizations and agencies are seeking suitable models that deal better with uncertainty and that are flexible and easy to use. The primary aim of this research project is to incorporate the customer's requirements into the conceptual design of construction projects. Towards this goal, QFD is selected due to its capability to integrate the design requirements needed to meet the customer's needs. To develop the QFD, this research focuses on the contribution of the different (significantly weighted) input factors that represent the main variables influencing QFD, and on the subsequent analysis of the techniques used to measure them. First, this research will review the literature to determine the current practice of QFD in construction projects. Then, the researcher will review the literature to identify the current customers of residential projects and gather information on customers' requirements for the design of residential buildings.
After that, a qualitative survey will be conducted to rank customers' needs and gather the views of stakeholder practitioners on how these needs can affect their satisfaction. Moreover, a qualitative focus group with the members of the design team will be conducted to determine the level of improvement and the technical details for the design of residential buildings. Finally, the QFD will be developed to establish the degree of significance of the design solutions.
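The weighted-scoring step at the heart of QFD can be sketched as a small House-of-Quality matrix; the needs, weights, requirements and relationship strengths below are illustrative assumptions, not results from this study:

```python
# Minimal House-of-Quality scoring sketch. Customer needs are weighted
# (here 1-5) and linked to design requirements with the conventional
# 9 = strong, 3 = moderate, 1 = weak relationship strengths.
customer_needs = {"value for money": 5, "finish quality": 4, "room layout": 3}
design_reqs = ["material grade", "floor plan", "cost control"]

relationships = {  # (need, requirement) -> strength; missing pairs count as 0
    ("value for money", "cost control"): 9,
    ("value for money", "material grade"): 3,
    ("finish quality", "material grade"): 9,
    ("finish quality", "floor plan"): 1,
    ("room layout", "floor plan"): 9,
}

# Technical importance of each design requirement: sum over needs of
# (need weight) x (relationship strength).
importance = {
    req: sum(w * relationships.get((need, req), 0)
             for need, w in customer_needs.items())
    for req in design_reqs
}
for req, score in sorted(importance.items(), key=lambda kv: -kv[1]):
    print(req, score)
```

Ranking the requirements by this score is what lets the design team see which technical decisions matter most to the customer.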

Keywords: quality function development, construction projects, Saudi Arabia, quality tools

Procedia PDF Downloads 117