Search results for: technical efficiency
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8055

1815 The Moment of the Optimal Average Length of the Multivariate Exponentially Weighted Moving Average Control Chart for Equally Correlated Variables

Authors: Edokpa Idemudia Waziri, Salisu S. Umar

Abstract:

The Hotelling's T² statistic is well known for detecting a shift in the mean vector of a multivariate normal distribution, and control charts based on T² have been widely used in statistical process control for monitoring multivariate processes. Although it is a powerful tool, the T² statistic is deficient when the shift to be detected in the mean vector of a multivariate process is small and consistent. The Multivariate Exponentially Weighted Moving Average (MEWMA) control chart is one of the control statistics used to overcome this drawback of the Hotelling's T² statistic. In this paper, the run-length distribution of the MEWMA control chart, when the quality characteristics exhibit substantial cross correlation and the process is in control or out of control, was derived using the Markov chain algorithm. The probability functions and the moments of the run-length distribution, including the Average Run Length (ARL), were also obtained, and they were consistent with existing results for the in-control and out-of-control situations. Through simulation, the procedure identified a class of ARLs for the MEWMA control chart when the process is in control and out of control. From our study, it was observed that the MEWMA scheme is quite adequate for detecting a small shift and is a good way to improve the quality of goods and services in a multivariate situation. It was also observed that as the in-control average run length ARL₀ or the number of variables (p) increases, the optimal ARLopt increases asymptotically, and as the magnitude of the shift σ increases, the optimal ARLopt decreases. Finally, an example from the literature is used to illustrate the method and demonstrate its efficiency.
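
A minimal sketch of the Markov-chain approximation used to obtain run-length properties, shown here for a univariate EWMA chart with standard normal observations as a simplified stand-in for the MEWMA case; the smoothing parameter, control limit and shift below are illustrative values, not the paper's settings.

```python
# Sketch: Brook-Evans style Markov-chain approximation of the EWMA average run length (ARL).
# Univariate illustration of the approach the abstract applies to the MEWMA chart.
import numpy as np
from scipy.stats import norm

def ewma_arl(lambda_=0.1, L=2.7, shift=0.0, n_states=101):
    # Steady-state control limit of the EWMA chart for N(0,1) observations
    h = L * np.sqrt(lambda_ / (2.0 - lambda_))
    w = 2.0 * h / n_states                       # width of each discretized state
    mids = -h + (np.arange(n_states) + 0.5) * w  # state midpoints inside the in-control region

    # Transition probabilities between transient (in-control) states:
    # z_t = (1 - lambda) * m_i + lambda * X, with X ~ N(shift, 1)
    P = np.empty((n_states, n_states))
    for i, m in enumerate(mids):
        upper = (mids + w / 2 - (1 - lambda_) * m) / lambda_
        lower = (mids - w / 2 - (1 - lambda_) * m) / lambda_
        P[i, :] = norm.cdf(upper - shift) - norm.cdf(lower - shift)

    # Solve (I - P) ARL = 1 and read off the zero-state ARL (chart started at the centre)
    arl = np.linalg.solve(np.eye(n_states) - P, np.ones(n_states))
    return arl[n_states // 2]

print(ewma_arl(shift=0.0))  # in-control ARL0
print(ewma_arl(shift=1.0))  # out-of-control ARL for a 1-sigma shift
```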

Keywords: average run length, markov chain, multivariate exponentially weighted moving average, optimal smoothing parameter

Procedia PDF Downloads 390
1814 Using Environmental Life Cycle Assessment to Design Sustainable Packaging

Authors: Timothy Francis Grant

Abstract:

There are conflicting purposes at play in the design of sustainable packaging, including material reduction, recycling compatibility, use of secondary content and the performance of the package in protecting and delivering the product. Life Cycle Assessment (LCA) is able to evaluate these different strategies against environmental metrics such as climate change, land and water use and marine litter pollution. However, LCA has traditionally been too time consuming and expensive to be used effectively in the packaging design process. To make LCA practical for packaging technologists and designers, a simplified tool accessible to non-environmental specialists is needed. The Packaging Quick Evaluation Tool (PIQET) is a web-based solution for undertaking LCA of new and existing packaging designs, considering the global supply chain and impacts from cradle to grave. PIQET is based on a pre-calculated LCA database covering the materials and processes involved in the packaging life cycle, including both virgin materials and recycled content, the conversion of materials into packaging, and the transportation of packaging to the product filling location. In addition, PIQET assesses the impacts once the package is filled, looking at storage, transport and product loss through the supply chain. When applied to consumer packaging, lightweight packages which are not recyclable can have lower impacts than more recyclable packages of higher mass. It is also apparent that, for many products, the impacts of package failure and product loss matter more environmentally than packaging material efficiency.
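
A minimal sketch of the cradle-to-grave accounting idea behind a tool of this kind; the stage factors, the product-loss term and all numbers below are hypothetical placeholders, not PIQET data.

```python
# Sketch: simplified cradle-to-grave climate impact score for a package design.
# Impact factors (kg CO2-eq per kg) and the product-loss term are hypothetical
# placeholders for the pre-calculated LCA database a tool like PIQET would supply.
def package_climate_impact(mass_kg, material_factor, conversion_factor,
                           transport_factor, product_loss_rate, product_factor_per_unit):
    packaging = mass_kg * (material_factor + conversion_factor + transport_factor)
    # Impact of product wasted because the package failed to protect it
    lost_product = product_loss_rate * product_factor_per_unit
    return packaging + lost_product

# Light, non-recyclable film vs. heavier, recyclable carton (illustrative numbers only)
film = package_climate_impact(0.005, 2.5, 0.8, 0.2, 0.02, 1.5)
carton = package_climate_impact(0.030, 1.0, 0.5, 0.2, 0.05, 1.5)
print(f"film: {film:.3f} kg CO2-eq, carton: {carton:.3f} kg CO2-eq per unit product")
```

Even in this toy comparison, the product-loss term can dominate the material term, which is the point the abstract makes about product failure versus packaging material efficiency.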

Keywords: Climate change, Life Cycle Assessment, Marine litter, Packaging sustainability

Procedia PDF Downloads 100
1813 Thermal Method for Testing Small Chemisorbent Samples on the Base of Potassium Superoxide

Authors: Pavel V. Balabanov, Daria A. Liubimova, Aleksandr P. Savenkov

Abstract:

The increase in technogenic and natural accidents accompanied by air pollution, for example by combustion products, makes respiratory protection necessary. This work is devoted to the development of a calorimetric method and a device which allow the kinetics of carbon dioxide sorption by chemisorbents based on potassium superoxide to be investigated quickly, in order to assess the protective properties of closed-circuit respiratory protective apparatus. The features of the traditional approach for determining the sorption properties of a thin chemisorbent layer are described, as well as methods and devices which can be used to study sorption kinetics. In contrast to the traditional approach, the authors developed an approach based on measuring the power of internal heat sources in the chemisorbent layer. These heat sources arise from the exothermic reaction of carbon dioxide sorption. This approach eliminates the need for chemical analysis of samples and can significantly reduce the time and material expenses of chemisorbent testing. The error in determining the volume fraction of adsorbed carbon dioxide by the developed method does not exceed 12%. Taking the efficiency of the method into account, we consider it a good alternative to traditional methods of chemical analysis for assessing the quality of protective sorbents.
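
A minimal sketch of how a measured internal heat-source power could be converted into the amount of CO₂ sorbed, assuming the sorption enthalpy is known; the power trace, enthalpy value and sampling are illustrative and not the authors' calibration.

```python
# Sketch: estimate chemisorbed CO2 from the measured power of internal heat sources.
# The power trace is synthetic and the sorption enthalpy is an assumed value.
import numpy as np

dt = 1.0                                     # sampling interval, s
t = np.arange(0, 600, dt)                    # time, s
power = 0.8 * np.exp(-t / 150.0)             # measured heat power, W (synthetic trace)

delta_h = 100e3                              # assumed sorption enthalpy, J per mol CO2
heat = power.sum() * dt                      # total heat released, J (uniform sampling)
moles_co2 = heat / delta_h                   # moles of CO2 chemisorbed
volume_co2_l = moles_co2 * 22.4              # approximate volume at STP, litres

print(f"heat = {heat:.1f} J, CO2 sorbed = {moles_co2*1000:.2f} mmol = {volume_co2_l:.3f} L")
```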

Keywords: carbon dioxide chemisorption, exothermic reaction, internal heat sources, respiratory protective apparatus

Procedia PDF Downloads 380
1812 The Effect of Sulfur and Calcium on the Formation of Dioxin in a Bubbling Fluidized Bed Incinerator

Authors: Chien-Song Chyang, Wei-Chih Wang

Abstract:

For the incineration process, the inhibition of dioxin formation is an important issue. Many investigations indicate that adding sulfur compounds to the combustion process can effectively inhibit dioxin formation; in this process, the sulfur-to-chlorine ratio plays an important role in the reduction efficiency. Ca-based sorbents are also commonly used for acid gas removal, which in turn indirectly inhibits dioxin formation. Although sulfur and calcium can both reduce dioxin formation, some confusion remains about their combined effect. This study aims to understand and clarify the relationship between dioxin formation and the simultaneous addition of sulfur and calcium. Experimental data obtained in a pilot-scale fluidized bed combustion system at various operating conditions are analyzed comprehensively, with the focus on the dioxins in the fly ash. The experimental data showed that the PCDD/F concentration in the fly ash collected from the baghouse increased slightly with the simultaneous addition of sulfur and calcium. This work describes the CO concentration with the addition of sulfur and calcium at freeboard temperatures from 800°C to 900°C, which is raised by the fuel complexity. A positive correlation exists between the dioxin concentration, the CO concentration and the carbon content of the fly ash. At the same sulfur-to-chlorine ratio, the toxic equivalent quantity (TEQ) can be reduced by increasing the actual concentrations of sulfur and calcium. The homologue profiles showed that P₅CDD and P₅CDF were the two major contributors to the toxicity of the dioxins. 2,3,7,8-TCDD and 2,3,7,8-TCDF were reduced by the addition of pyrite and hydrated lime. The experimental results also showed that the trend of the PCDD/F concentration in the fly ash differed with the sulfur-to-chlorine ratio when sulfur was added at 800°C.

Keywords: reduction of dioxin emissions, sulfur-to-chlorine ratio, de-chlorination, Ca-based sorbent

Procedia PDF Downloads 119
1811 Temporal and Spatial Distribution Prediction of Patinopecten yessoensis Larvae in Northern China Yellow Sea

Authors: RuiJin Zhang, HengJiang Cai, JinSong Gui

Abstract:

It takes Patinopecten yessoensis larvae more than 20 days from spawning to settlement. Due to natural environmental factors such as currents, the larvae are transported over distances of hundreds of kilometers, leading to high instability in their spatial and temporal distribution and great difficulty in natural spat collection. Predicting the distribution is therefore of great significance for improving the operating efficiency of collection. A hydrodynamic model of the northern China Yellow Sea was established from the equations of motion of physical oceanography and verified against tidal harmonic constants and measured current velocities in Dalian Bay. According to the passive drift characteristics of the larvae, the hydrodynamic model was combined with a particle tracking model to build a spatial and temporal distribution prediction model, and the distribution of the larvae under the influence of flow and wind was simulated. The model results lead to the following conclusions: ocean currents have the greatest impact on the passive drift path and diffusion of Patinopecten yessoensis larvae; the impact of wind is also important, changing the direction and speed of the drift. The larvae were generated in the sea along Zhangzi Island and Guanglu-Dachangshan Island, but after two months, under the impact of wind and currents, they appeared west of Dalian and south of Lvshun, and even in Bohai Bay. The model results are consistent with the qualitative analyses in the relevant literature, and this conclusion explains where the larvae come from in terms of numerical simulation.
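
A minimal sketch of the passive-drift (Lagrangian particle tracking) step that such a prediction model couples to the hydrodynamic model; the current and wind velocities, wind-drag factor and diffusion coefficient below are placeholders, not the model's calibrated values.

```python
# Sketch: passive drift of larvae advected by current plus a wind-drag term,
# with a random-walk diffusion step. Velocities and coefficients are placeholders
# for the output of the hydrodynamic model.
import numpy as np

rng = np.random.default_rng(0)

def track(positions, current_uv, wind_uv, dt=3600.0, wind_drag=0.03, diff=5.0, steps=24 * 60):
    """positions: (N, 2) array of x, y in metres; velocities in m/s; dt in seconds."""
    pos = positions.copy()
    for _ in range(steps):
        advection = (current_uv + wind_drag * wind_uv) * dt
        diffusion = rng.normal(0.0, np.sqrt(2.0 * diff * dt), size=pos.shape)
        pos += advection + diffusion
    return pos

larvae = np.zeros((1000, 2))                 # released at the origin (spawning ground)
current = np.array([0.05, 0.02])             # mean current, m/s (illustrative)
wind = np.array([3.0, -1.0])                 # wind velocity, m/s (illustrative)
final = track(larvae, current, wind)         # 60 days of hourly steps
print("mean drift after 60 days: %.1f km, %.1f km" % tuple(final.mean(axis=0) / 1000.0))
```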

Keywords: numerical simulation, Patinopecten yessoensis larvae, predicting model, spatial and temporal distribution

Procedia PDF Downloads 266
1810 Dynamic-cognition of Strategic Mineral Commodities; An Empirical Assessment

Authors: Carlos Tapia Cortez, Serkan Saydam, Jeff Coulton, Claude Sammut

Abstract:

Strategic mineral commodities (SMC), both energy commodities and metals, have long been fundamental for human beings. There is a strong, long-run relationship between the mineral resources industry and society's evolution, with the provision of primary raw materials becoming one of the most significant drivers of economic growth. Due to the relevance of mineral resources for the entire economy and society, understanding SMC market behaviour in order to simulate price fluctuations has become crucial for governments and firms. As with any human activity, SMC price fluctuations are affected by economic, geopolitical, environmental, technological and psychological issues, in which cognition plays a major role. Cognition is defined as the capacity to store information in memory and to process it and make decisions for problem solving or human adaptation. Thus, it has a significant role in systems that exhibit dynamic equilibrium through time, such as economic growth. Cognition allows not only an understanding of past behaviours and trends in SMC markets but also supports expectations of future demand/supply levels and prices, although speculation is unavoidable. Technological development may also be viewed as a cognitive system. Since the Industrial Revolution, technological developments have had a significant influence on SMC production costs and prices, likewise allowing co-integration between commodities and market locations. This suggests a close relationship between structural breaks, technology and price evolution. SMC price forecasting has commonly been addressed with econometric and Gaussian-probabilistic models. Econometric models may incorporate the relationships between variables; however, they are static, which leads to an incomplete picture of price evolution through time. Gaussian-probabilistic models may evolve through time; however, price fluctuations are addressed by assuming random behaviour and normal distributions, which seem far from the real behaviour of both markets and prices. Random fluctuation ignores the evolution of market events and the technical and temporal relationships between variables, giving the illusion of controlled future events. The normal distribution underestimates price fluctuations by using restricted ranges, confining decision making to a pre-established space. A proper understanding of SMC price dynamics, taking into account the historical-cognitive relationship between economic, technological and psychological factors over time, is fundamental when attempting to simulate prices. The aim of this paper is to discuss the SMC market cognition hypothesis and empirically demonstrate its dynamic-cognitive capacity. Three of the largest and most traded SMCs, oil, copper and gold, are assessed to examine economic, technological and psychological cognition, respectively.

Keywords: commodity price simulation, commodity price uncertainties, dynamic-cognition, dynamic systems

Procedia PDF Downloads 433
1809 Effect of Organizational Competitive Climate on Organizational Prosocial Behavior: Workplace Envy as a Mediator

Authors: Armaghan Eslami, Nasrin Arshadi

Abstract:

Scarce resources are an inseparable part of organizational life. The fact that only a small number of employees can obtain resources such as promotions, raises, and recognition creates competition among employees, which in turn creates a competitive climate. As in any other competition, a small number win the reward and a great number lose, and one possible emotional reaction to this loss is a negative emotion such as malicious envy. In this case, the envious person may try to harm the envied person by reducing prosocial behavior. Prosocial behavior is behavior aimed at benefiting others; its main purpose is to maintain and increase the well-being and welfare of others. Therefore, one of the easiest ways to harm the envied one is to suppress prosocial behavior. Prosocial behavior has positive and important implications for organizational efficiency. Our results supported our model and suggested that a competitive climate has a significant effect on increasing workplace envy, while envy in turn has a significant negative impact on prosocial behavior. Our results also indicated that envy mediates the relationship between competitive climate and prosocial behavior. An organizational competitive climate can lead employees to respond to envy with negative emotions and hostile, damaging behavior toward the envied person. Competition can lead employees to look for proof of their self-worth; furthermore, they measure their self-worth, value and respect by the superiority they gain in competitions. As a result, losing in competitions can harm employees' self-definition, and they try to protect themselves by devaluing the envied other and being 'less friendly' to them. Some employees may find it inappropriate to engage in harming behavior, but they may believe there is nothing wrong with withholding prosocial behavior.

Keywords: competitive climate, mediator, prosocial behavior, workplace envy

Procedia PDF Downloads 336
1808 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit

Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana

Abstract:

Internet of Things (IoT) devices and edge computing have become some of the most significant recent innovations and are widely discussed for their potential to improve and disrupt traditional business and industry alike. New challenges, such as the COVID-19 pandemic, have posed dangers to the workforce and to business processes. Alongside the drastically changed business landscape left in the aftermath of the global pandemic, there loom the threats of a global energy crisis, global warming and increasingly heated global politics that risk becoming a new Cold War. Emerging technologies such as edge computing and purpose-designed visual processing units therefore present great opportunities for business. This paper reviews the literature on how the Internet of Things and this disruptive wave affect current business and how businesses need to adapt to changes in the market and the world, and presents benchmark tests of consumer-market IoT devices equipped with edge computing hardware, showing how they can increase efficiency and reduce the risks posed by current and looming crises. Throughout the paper, we explain the technologies that underpin this shift and why they will change traditional practice, through brief introductions to cloud computing, edge computing and the Internet of Things, and discuss how they will lead into the future.

Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification

Procedia PDF Downloads 125
1807 Statistical Feature Extraction Method for Wood Species Recognition System

Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof

Abstract:

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid the mislabeling of timber, which results in loss of income for the timber industry. The system focuses on analyzing the statistical pore properties of wood images. This paper proposes a fuzzy-based feature extractor which mimics experts' knowledge of wood texture to extract the properties of pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. The total number of statistical features extracted from each wood image is 38. A backpropagation neural network is then used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database composed of 5,200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts' interpretation of wood texture, which allows human involvement in analyzing the wood texture. Experimental results show the efficiency of the proposed method.
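
A minimal sketch of the classification stage described above (38 statistical features fed to a backpropagation neural network), with scikit-learn's MLPClassifier standing in for the authors' network; the feature vectors and labels here are random placeholders, so the reported accuracy is near chance and is only meant to show the pipeline shape.

```python
# Sketch: backpropagation neural network classifying wood species from the
# 38 statistical pore features. Random data stand in for the real feature vectors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

n_species, n_features = 52, 38
rng = np.random.default_rng(0)
X = rng.normal(size=(5200, n_features))            # placeholder for extracted pore features
y = rng.integers(0, n_species, size=5200)          # placeholder species labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))  # near 1/52 on random data
```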

Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images

Procedia PDF Downloads 398
1806 In vivo Estimation of Mutation Rate of the Aleutian Mink Disease Virus

Authors: P.P. Rupasinghe, A.H. Farid

Abstract:

The Aleutian mink disease virus (AMDV, Carnivore amdoparvovirus 1) causes persistent infection, plasmacytosis, and the formation and deposition of immune complexes in various organs in adult mink, leading to glomerulonephritis, arteritis and sometimes death. The disease has no cure nor an effective vaccine, and the identification and culling of mink positive for anti-AMDV antibodies have not been successful in controlling the infection in many countries. The failure to eradicate the virus from infected farms may be caused by keeping false-negative individuals on the farm, or by virus transmission from wild animals or neighboring farms. The identification of sources of infection, which can be performed by comparing viral sequences, is important to the success of viral eradication programs. High mutation rates could cause inaccuracies when viral sequences are used to trace an infection back to its origin. There is no published information on the mutation rate of AMDV, either in vivo or in vitro. In vivo estimation is the most accurate method, but it is difficult to perform because of inherent technical complexities, namely infecting live animals, the unknown number of viral generations (i.e., infection cycles), the removal of deleterious mutations over time, and genetic drift. The objective of this study was to determine the mutation rate of AMDV, on which no information was available. A homogenate was prepared from the spleen of one naturally infected American mink (Neovison vison) from Nova Scotia, Canada (parental template). The near full-length genome of this isolate (91.6%, 4,143 bp) was bidirectionally sequenced. A group of black mink was inoculated with this homogenate (descendant mink). Spleen samples were collected from 10 descendant mink at 16 weeks post-inoculation (wpi) and from another 10 mink at 176 wpi, and their near full-length genomes were bidirectionally sequenced. Sequences of these mink were compared with each other and with the sequence of the parental template. The number of nucleotide substitutions at 176 wpi was 3.1 times greater than that at 16 wpi (113 vs 36), whereas the estimated mutation rate at 176 wpi was 3.1 times lower than that at 16 wpi (9.13×10⁻⁴ vs 2.85×10⁻³ substitutions/site/year), showing a decreasing trend in the mutation rate per unit of time. Although there is no report of an in vivo estimate of the mutation rate of DNA viruses in animals using the same method as in the current study, these estimates are at the higher range of values reported for DNA viruses determined by various techniques. These high estimates are plausible given the wide range of diversity and pathogenicity of AMDV isolates. The results suggest that increases in the number of nucleotide substitutions over time and the subsequent divergence make it difficult to accurately trace AMDV isolates back to their origin when several years have elapsed between the two samplings.
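
A minimal worked sketch of the substitutions-per-site-per-year calculation. It assumes the reported substitution counts are averaged over the 10 descendant mink; this averaging is an assumption made here for illustration only, so the numbers come out close to, but not exactly equal to, the reported estimates.

```python
# Sketch: mutation rate as substitutions per site per year.
# The division by the number of animals is an illustrative assumption,
# not a statement of the authors' exact estimation procedure.
genome_sites = 4143                 # sequenced near full-length genome, bp

def rate(total_substitutions, n_animals, weeks):
    per_animal = total_substitutions / n_animals
    years = weeks / 52.0
    return per_animal / genome_sites / years

print(f"16 wpi : {rate(36, 10, 16):.2e} substitutions/site/year")
print(f"176 wpi: {rate(113, 10, 176):.2e} substitutions/site/year")
```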

Keywords: Aleutian mink disease virus, American mink, mutation rate, nucleotide substitution

Procedia PDF Downloads 97
1805 Development of Distance Training Packages for Teacher on Education Management for Learners with Special Needs

Authors: Jareeluk Ratanaphan

Abstract:

The purposes of this research were: 1) to survey teachers' needs for knowledge about special education management for students with special needs; 2) to develop distance training packages for teachers on special education management for students with special needs; 3) to study the effects of using the packages on trainees' achievement; and 4) to study trainees' opinions of the distance training packages. The design of the experiment was research and development. The survey sample comprised 86 teachers, and 22 teachers participated in the study of the packages' effects on achievement and opinion. The research instruments comprised: 1) training packages on special education management for students with special needs, 2) an achievement test, and 3) a questionnaire. Mean, percentage, standard deviation, t-test and content analysis were used for data analysis. The findings of the research were as follows: 1) Teachers needed knowledge about teaching learners with learning disabilities, mental retardation, autism, and physical and health impairments, and about research in special education. 2) The package comprised a document on special education management for students with special needs and a manual for the distance training packages. The document consisted of the name of the packages, an explanation for the educator, the content structure, concepts, objectives, content and activities. The manual consisted of an explanation of the document, objectives, instructions for using the package, a training schedule, and evaluation. The efficiency of the packages was established at 79.50/81.35. 3) The trainees' average posttest achievement scores were higher than their pretest scores. 4) The trainees' opinion of the package was at the highest level.

Keywords: distance training package, teacher, learner with special needs

Procedia PDF Downloads 462
1804 Lime Based Products as a Maintainable Option for Repair And Restoration of Historic Buildings in India

Authors: Adedayo Jeremiah Adeyekun, Samuel Oluwagbemiga Ishola

Abstract:

This research aims to study the use of traditional building materials for the repair and refurbishment of historic buildings in India and to provide an authentic treatment of historic buildings that takes the new standards of the rehabilitation process into consideration. Lime can prove to be an effective alternative to modern impervious materials due to its compatibility with traditional building methods and materials. For example, its elastoplastic properties allow it to accommodate movement due to settlement or moisture and temperature changes without cracking. The use of lime also enhances workability, water retention and bond characteristics. Lime is considered a natural, traditional material, but it is also sustainable and energy-efficient, with production that can be powered by biomass and emissions up to 25% lower than cementitious materials. However, there is a lack of comprehensive data on the impact of lime-based materials on the energy efficiency and thermal properties of traditional buildings and structures. Although lime mortars, renders and plasters were largely superseded by cement-based products in the first half of the 20th century, lime has a long and proven track record dating back to ancient times; it was used by the Egyptians in 4000 BC in the construction of the pyramids. This does not mean that lime is an outdated technology, nor that it is difficult to use as a material. In fact, lime has a growing place in modern construction, with increasing numbers of designers choosing lime-based products because of their special properties. To carry out this research, historic buildings will be surveyed and information will be drawn from textbooks and journals related to architectural restoration.

Keywords: lime, materials, historic, buildings, sustainability

Procedia PDF Downloads 141
1803 Modeling of Large Elasto-Plastic Deformations by the Coupled FE-EFGM

Authors: Azher Jameel, Ghulam Ashraf Harmain

Abstract:

In recent years, enriched techniques such as the extended finite element method (XFEM), the element free Galerkin method (EFGM), and the coupled finite element-element free Galerkin method (FE-EFGM) have found wide application in modeling different types of discontinuities produced by cracks, contact surfaces, and bi-material interfaces. The extended finite element method faces severe mesh distortion issues when modeling large deformation problems. The element free Galerkin method does not have mesh distortion issues, but it is computationally more demanding than the finite element method. The coupled FE-EFGM proves to be an efficient numerical tool for modeling large deformation problems, as it exploits the advantages of both FEM and EFGM. The present paper employs the coupled FE-EFGM to model large elastoplastic deformations in bi-material engineering components. The large deformation occurring in the domain has been modeled using the total Lagrangian approach. The nonlinear elastoplastic behavior of the material has been represented by the Ramberg-Osgood model. Elastic predictor-plastic corrector algorithms are used for the evaluation of stresses during large deformation. Finally, several numerical problems are solved by the coupled FE-EFGM to illustrate its applicability, efficiency and accuracy in modeling large elastoplastic deformations in bi-material samples. The results obtained by the proposed technique are compared with those obtained by XFEM and EFGM, and a remarkable agreement was observed between the results of the three techniques.
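
A minimal one-dimensional sketch of the Ramberg-Osgood relation and the elastic predictor / iterative corrector idea named above: the elastic trial stress E·ε is corrected by Newton iterations until the Ramberg-Osgood strain-stress relation is satisfied. The material constants E, sigma0, alpha and n are illustrative; the paper itself works in the multi-axial, total-Lagrangian setting.

```python
# Sketch: 1-D Ramberg-Osgood stress evaluation via an elastic predictor followed by
# an iterative (Newton) corrector. Constants are illustrative, not the paper's data.
def ramberg_osgood_strain(sigma, E=200e9, sigma0=250e6, alpha=0.5, n=5.0):
    sign = 1.0 if sigma >= 0 else -1.0
    return sigma / E + sign * alpha * (sigma0 / E) * (abs(sigma) / sigma0) ** n

def stress_from_strain(eps, E=200e9, sigma0=250e6, alpha=0.5, n=5.0, tol=1e-10):
    sigma = E * eps                          # elastic predictor (trial stress)
    for _ in range(50):                      # Newton corrector on the strain residual
        f = ramberg_osgood_strain(sigma, E, sigma0, alpha, n) - eps
        dfds = 1.0 / E + alpha * n / E * (abs(sigma) / sigma0) ** (n - 1)
        sigma -= f / dfds
        if abs(f) < tol:
            break
    return sigma

print(stress_from_strain(0.004) / 1e6, "MPa")  # stress at 0.4% total strain
```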

Keywords: XFEM, EFGM, coupled FE-EFGM, level sets, large deformation

Procedia PDF Downloads 416
1802 Fractional, Component and Morphological Composition of Ambient Air Dust in the Areas of Mining Industry

Authors: S.V. Kleyn, S.Yu. Zagorodnov, А.А. Kokoulina

Abstract:

Technogenic emissions of mining and processing complexes are characterized by a high content of chemical components and solid dust particles. However, each industrial enterprise and its surrounding area have features that require refinement and parameterization. Numerous studies have shown the negative impact of fine dust PM10 and PM2.5 on health, as well as the possibility of dust particles absorbing toxic components, including heavy metals. The target of the study was the quantitative assessment of the fractional and particle size composition of ambient air dust in the area impacted by a primary magnesium production complex, together with a description of the morphological features of the dust particles. Study methods: To identify the dust emission sources, an analysis of the production process was carried out. The particle size composition of the emissions was measured using a Microtrac S3500 laser particle analyzer (covering a particle size range of 20 nm to 2,000 µm). Particle morphology and composition were established by scanning electron microscopy at high resolution (magnification of 5 to 300,000 times) with an S3400N 'HITACHI' X-ray fluorescence device. The chemical composition was identified by X-ray analysis of the samples using an XRD-700 'Shimadzu' X-ray diffractometer. The dust pollution level was determined using model calculations of emission dispersion in the atmosphere, and the calculations were verified by instrumental studies. Results: The dust emissions of the different technological processes are heterogeneous and their fractional structure is complicated. The percentage of particle sizes up to and including 2.5 micrometres ranged from 0.00 to 56.70%; particle sizes up to and including 10 microns, from 0.00 to 85.60%; and particle sizes greater than 10 microns, from 14.40% to 100.00%. During microscopy, nanoscale particles were detected. The studied dust particles have round, irregular, cubic and integral shapes. The composition of the dust includes magnesium, sodium, potassium, calcium, iron and chlorine. On the basis of the obtained results, model calculations of dust emission dispersion were performed and the distribution areas of fine dust PM10 and PM2.5 were established. It was found that the fine fractions PM10 and PM2.5 are dispersed over large distances and beyond the boundary of the enterprise's industrial site. The population living near the enterprise is exposed to the risk of diseases associated with dust exposure. The data were transferred to the operating company to support decisions on risk minimization measures. Exposure and health risk indicators are used to provide targeted medical and preventive care to citizens living in the area of the facility's negative impact.
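
A minimal sketch of how the PM2.5 and PM10 shares reported above can be read from a cumulative particle-size distribution; the size bins and cumulative percentages below are synthetic examples, not the Microtrac output.

```python
# Sketch: mass fractions below the PM2.5 and PM10 cut-offs from a cumulative
# particle-size distribution. Bin edges and percentages are synthetic examples.
import numpy as np

sizes_um = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 20.0, 50.0, 100.0])   # bin upper edges, microns
cum_mass_pct = np.array([3.0, 8.0, 21.0, 40.0, 67.0, 85.0, 97.0, 100.0])

def fraction_below(cut_um):
    # linear interpolation of the cumulative curve at the cut-off diameter
    return float(np.interp(cut_um, sizes_um, cum_mass_pct))

print(f"PM2.5 share: {fraction_below(2.5):.1f} %")
print(f"PM10  share: {fraction_below(10.0):.1f} %")
```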

Keywords: dust emissions, exposure assessment, PM10, PM2.5

Procedia PDF Downloads 236
1801 Development of Biosurfactant-Based Adjuvant for Enhancing Biocontrol Efficiency

Authors: Kanyarat Sikhao, Nichakorn Khondee

Abstract:

Adjuvants are commonly mixed with agricultural spray solutions during foliar application to improve the performance of microbial-based biological control, including better spreading, absorption, and penetration on the plant leaf. This research aims to replace the chemical surfactants in adjuvants with biosurfactants in order to reduce negative impacts on antagonistic microorganisms and crops. The biosurfactant was produced from Brevibacterium casei NK8 and used as a cell-free broth solution containing a biosurfactant concentration of 3.7 g/L. Studies of microemulsion formation and phase behavior were applied to obtain a suitable composition of the biosurfactant-based adjuvant, consisting of cell-free broth (70-80%), coconut oil-based fatty alcohol C12-14 (3) ethoxylate (1-7%), and sodium chloride (8-30%). The suitable formula, achieving a Winsor Type III (bicontinuous) microemulsion, was 80% cell-free broth, 7% fatty alcohol C12-14 (3) ethoxylate, and 8% sodium chloride. This formula reduced the contact angle of water on parafilm from 70 to 31 degrees. The biosurfactant-based adjuvant was non-phytotoxic to seeds of Oryza sativa and Brassica rapa subsp. pekinensis (germination index equal to or above 80%), whereas sodium dodecyl sulfate and Tween 80 showed phytotoxic effects on these plant seeds. The survival of Bacillus subtilis in the biosurfactant-based adjuvant was higher than in sodium dodecyl sulfate and Tween 80. The mixture of biosurfactant and plant-based surfactant could be considered a viable, safer, and acceptable alternative to chemical adjuvants for sustainable organic farming.

Keywords: biosurfactant, microemulsion, bio-adjuvant, antagonistic microorganisms

Procedia PDF Downloads 114
1800 A Design of Elliptic Curve Cryptography Processor based on SM2 over GF(p)

Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang

Abstract:

Data encryption is the foundation of today's communication, and improving the speed of data encryption and decryption is a long-standing problem that scholars work on. In this paper, we propose an elliptic curve crypto processor architecture based on the SM2 prime field. In the hardware implementation, we optimized the algorithms at different stages of the structure. For finite-field modular operations, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipelined structure in the implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we add mutual pre-operations to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation takes 0.275 ms. The performance for handling scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
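
A minimal software sketch of the scalar multiplication such a processor accelerates, using affine short-Weierstrass arithmetic over GF(p) with double-and-add and Fermat-inverse division. The toy curve constants below are illustrative, not the SM2 parameters, and none of the hardware optimizations described above (Karatsuba-Ofman multiplication, pipelining, Jacobian coordinates, parallel point scheduling) are modelled.

```python
# Sketch: double-and-add scalar multiplication on a short-Weierstrass curve over GF(p),
# in affine coordinates with modular inversion. A small toy curve stands in for SM2.
P_MOD, A, B = 97, 2, 3            # toy curve y^2 = x^3 + 2x + 3 over GF(97)

def inv_mod(x, p=P_MOD):
    return pow(x, p - 2, p)       # Fermat inverse (p prime)

def point_add(P, Q, p=P_MOD, a=A):
    if P is None: return Q        # None represents the point at infinity
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None               # P + (-P) = infinity (also covers doubling a 2-torsion point)
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv_mod(2 * y1, p) % p   # tangent slope (doubling)
    else:
        lam = (y2 - y1) * inv_mod(x2 - x1, p) % p          # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def scalar_mult(k, P):
    R = None                      # accumulator starts at infinity
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

G = (3, 6)                        # 6^2 = 36 and 3^3 + 2*3 + 3 = 36 (mod 97), so G is on the curve
print(scalar_mult(20, G))
```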

Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication

Procedia PDF Downloads 61
1799 Revitalization of Industrial Brownfields in Historical Districts

Authors: Adel Menchawy, Noha Labib

Abstract:

Many cities have quarters that confer on them a sense of identity and place through their cultural history. These quarters are often a vital part of a city's charm and appeal, and their functional and visual qualities are important to the city's image and identity. Brownfield sites represent an important part of our built landscape. They provide tangible and intangible links to our past and have great potential to play significant roles in the future of our cities, towns and rural environments. Brownfield sites are places that previously held industrial factories, or areas where waste may have been kept or which were exposed to various hazards. Their redevelopment therefore revitalizes and strengthens towns and communities, as it supports economic growth, builds community pride and protects public health and the environment. Three case studies are discussed in this paper. The first is the city of Sterling, which was entirely redeveloped and revitalized and became a city with identity after being derelict; the second is Castlefield, which was a place no one was eager to visit but has now become a tourist area; and the third is Cleveland, which adopted a strategy that transformed it from a polluted, derelict place into a mixed-use development city. Brownfield revitalization offers a great opportunity to transform a city from being derelict, useless and contaminated into a place that tourists would love to visit. It can also boost the local economy, raise social conditions, improve energy efficiency, reduce natural resource consumption, clean the air, water and land, and take advantage of existing buildings and sites by turning them to adaptive reuse after remediation.

Keywords: Brownfield Revitalization, Sustainable Brownfield, Historical conservation, Adaptive reuse

Procedia PDF Downloads 235
1798 Scale up of Isoniazid Preventive Therapy: A Quality Management Approach in Nairobi County, Kenya

Authors: E. Omanya, E. Mueni, G. Makau, M. Kariuki

Abstract:

HIV infection is the strongest risk factor for developing TB. Isoniazid preventive therapy (IPT) for people living with HIV (PLHIV) not only reduces the individual patient's risk of developing active TB but also mitigates cross-infection. In Kenya, six months of IPT was recommended through the National TB, Leprosy and Lung Disease Program to treat latent TB. In spite of this recommendation by the national government, uptake of IPT among PLHIV remained low in Kenya by the end of 2015. The USAID/Kenya and East Africa Afya Jijini project, which supports 42 TB/HIV health facilities in Nairobi County, began addressing low IPT uptake through Quality Improvement (QI) teams set up at the facility level. Quality is characterized by WHO as one of the four main connectors between health system building blocks and health system outputs. Afya Jijini implements the Kenya Quality Model for Health, under which QI teams are formed at the county, sub-county and facility levels. The teams review facility performance to identify gaps in service delivery and use QI tools to monitor and improve performance. Afya Jijini supported the formation of these teams in 42 facilities and built the teams' capacity to review data and use QI principles to identify and address performance gaps. When the QI teams began working on improving IPT uptake among PLHIV, uptake was at 31.8%. The teams first conducted a root cause analysis using cause-and-effect diagrams, which help the teams brainstorm and identify barriers to IPT uptake among PLHIV at the facility level. This is a participatory process in which program staff provide technical support to the QI teams in problem identification and problem solving. The gaps identified were inadequate knowledge and skills on the use of IPT among health care workers, lack of awareness of IPT among patients, inadequate monitoring and evaluation tools, and poor quantification and forecasting of IPT commodities. In response, Afya Jijini trained over 300 health care workers on the administration of IPT, supported patient education, supported the quantification and forecasting of IPT commodities, and provided IPT data collection tools to help facilities monitor their performance. The facility QI teams held monthly meetings to monitor progress on IPT implementation and took corrective action when necessary. IPT uptake improved from 31.8% to 61.2% during the second year of the Afya Jijini project and to 80.1% during the project's third year. The use of QI teams and root cause analysis to identify and address service delivery gaps, together with targeted program interventions and continual performance reviews, can be successful in increasing the uptake of TB-related services at health facilities.

Keywords: isoniazid, quality, health care workers, people living with HIV

Procedia PDF Downloads 75
1797 Conceptual Design of Gravity Anchor Focusing on Anchor Towing and Lowering

Authors: Vinay Kumar Vanjakula, Frank Adam, Nils Goseberg

Abstract:

Wind power is one of the leading renewable energy generation methods. Due to the abundant, higher wind speeds far from shore, the construction of offshore wind turbines began in recent decades. However, the installation of bottom-fixed (monopile) offshore wind turbines in deep water is often associated with technical and financial challenges. To overcome such challenges, the concept of floating wind turbines has been developed, building on experience from the oil and gas industry. This research work develops a universal heavyweight gravity anchor (UGA) for floating Tension Leg Platform (TLP) sub-structures. It is funded by the German Federal Ministry of Education and Research within a three-year (2019-2022) research program called "Offshore Wind Solutions Plus (OWSplus) - Floating Offshore Wind Solutions Mecklenburg-Vorpommern," carried out by a group of German institutions (universities, laboratories, and consulting companies). This part of the project focuses on the numerical modeling of the gravity anchor, which involves analyzing and solving fluid flow problems. In contrast to gravity-based torpedo anchors, these UGAs will be towed and lowered by controlled machines (tug boats) at lower speeds. This kind of UGA installation is new to the offshore wind industry, particularly for TLPs, and very few research works have been carried out in recent years. Conventional methods for transporting an anchor require a large crane vessel, which involves greater cost. The conceptual UGA consists of ballasting chambers that exploit buoyancy: for towing, the chambers are filled with just enough water that the anchor can still float. After reaching the installation site, the chambers are ballasted with water for lowering. At the end of its service life, the UGA can be unballasted (for retrieval or replacement), causing it to rise to the sea surface by itself; the buoyancy chambers therefore allow a UGA to be used without heavy machinery. However, while the UGA is being lowered towards or raised away from the seabed, it experiences harsh marine conditions due to the interaction of waves and currents. This leads to drifting of the anchor from the desired installation position and to damage of the lowering machinery. To overcome these problems, a numerical model is built to investigate the influence of different outer contours and other flow-governing shapes that can be fitted to the UGA to mitigate turbulence and drifting. The presentation will highlight the role of the Computational Fluid Dynamics (CFD) numerical model built in OpenFOAM, which is open-source software.
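
A minimal sketch of the buoyancy check behind the ballasting-chamber concept described above: the anchor floats while the weight of displaced water exceeds its total weight, and adding ballast water beyond that margin makes it sink for lowering. All masses and volumes are illustrative placeholders, not project data.

```python
# Sketch: buoyancy/ballast check for a gravity anchor with ballasting chambers.
# Illustrative numbers only; real values come from the anchor design.
RHO_SEA = 1025.0                   # sea water density, kg/m^3

def ballast_to_sink(structure_mass_kg, displaced_volume_m3, chamber_volume_m3):
    buoyancy_mass = RHO_SEA * displaced_volume_m3        # maximum supportable mass, kg
    reserve = buoyancy_mass - structure_mass_kg          # spare buoyancy while towing
    needed = max(reserve, 0.0)                           # ballast water mass to start sinking
    if needed > RHO_SEA * chamber_volume_m3:
        raise ValueError("chambers too small to sink the anchor")
    return reserve, needed

reserve, ballast = ballast_to_sink(structure_mass_kg=400e3,
                                   displaced_volume_m3=500.0,
                                   chamber_volume_m3=250.0)
print(f"towing reserve buoyancy: {reserve/1e3:.0f} t, ballast water to sink: {ballast/1e3:.0f} t")
```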

Keywords: anchor lowering, towing, waves, currents, computational fluid dynamics

Procedia PDF Downloads 143
1796 Data Analytics in Energy Management

Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair

Abstract:

With increasing energy costs and their impact on business, sustainability has evolved from a social expectation into an economic imperative. Finding methods to reduce cost has therefore become a critical directive for industry leaders, and effective energy management is the only way to cut these costs. However, energy management has been a challenge because it requires a change in old habits and in legacy systems followed for decades. Today, exorbitant volumes of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. It must be noted that, for quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets but is also instrumental in the transformation from old approaches to energy management to new ones, which in turn assists effective decision making for implementation. Organizations need an established corporate strategy for reducing operational costs through visibility and optimization of energy usage, and energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is extensively used today in scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, and improving customer and device data insights. The paper also highlights how analytics helps transform insights obtained from energy data into sustainable solutions, using data from an array of segments such as the retail, transportation, and water sectors.

Keywords: energy analytics, energy management, operational data, business intelligence, optimization

Procedia PDF Downloads 334
1795 Korean Smart Cities: Strategic Foci, Characteristics and Effects

Authors: Sang Ho Lee, Yountaik Leem

Abstract:

This paper reviews Korean smart city cases through an analysis framework of strategic foci, characteristics and effects. Firstly, national strategies, including the c (cyber), e (electronic), u (ubiquitous) and s (smart) Korea strategies, are considered from a strategic angle. Secondly, the characteristics of smart cities in Korea are examined through examples such as Seoul, Busan, Songdo and Sejong, using STIM (Service, Technology, Infrastructure and Management) analysis. Finally, the effects of smart cities on socio-economies are investigated from an industrial perspective using the input-output model and structural path analysis. The Korean smart city strategies revealed different kinds of strategic foci. The c-Korea strategy focused on building information and communications networks and on user IT literacy. The e-Korea strategy encouraged e-government and e-business through the use of high-speed information and communications networks. The u-Korea strategy promoted ubiquitous services as well as integrated information and communication operations centers. The s-Korea strategy is propelling the fourth industrial platform. Smart cities in Korea show their own features and trends, such as eco-intelligence, high-efficiency and low-cost oriented IoT, the citizen-sensed city, and the big data city. Smart city development has created new production chains, fostering ICTs (Information and Communication Technologies) and knowledge as intermediate inputs to industries.
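
A minimal sketch of the Leontief input-output calculation that underlies this kind of effect analysis, x = (I - A)⁻¹ f; the 3-sector technical-coefficient matrix and final-demand shock below are hypothetical numbers, not the paper's data.

```python
# Sketch: Leontief input-output model x = (I - A)^-1 f used for effect analysis.
# A is a hypothetical 3-sector technical-coefficient matrix; f is a final-demand
# change (e.g. new smart-city investment). All numbers are illustrative.
import numpy as np

A = np.array([[0.10, 0.05, 0.08],
              [0.15, 0.20, 0.10],
              [0.05, 0.10, 0.15]])
f = np.array([100.0, 50.0, 30.0])            # final-demand change by sector

leontief_inverse = np.linalg.inv(np.eye(3) - A)
x = leontief_inverse @ f                     # total (direct + indirect) output required
print("output multipliers:", leontief_inverse.sum(axis=0).round(3))
print("total output by sector:", x.round(1))
```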

Keywords: Korean smart cities, Korean smart city strategies, STIM, smart service, infrastructure, technologies, management, effect of smart city

Procedia PDF Downloads 346
1794 Bluetooth Communication Protocol Study for Multi-Sensor Applications

Authors: Joao Garretto, R. J. Yarwood, Vamsi Borra, Frank Li

Abstract:

Bluetooth Low Energy (BLE) has emerged as one of the main wireless communication technologies used in low-power electronics, such as wearables, beacons, and Internet of Things (IoT) devices. BLE's energy efficiency, interoperability with smart mobile devices, and Over the Air (OTA) capabilities are essential features for ultralow-power devices, which are usually designed with size and cost constraints. Most current research on the power analysis of BLE devices focuses on the theoretical aspects of the advertising and scanning cycles, with most results presented in the form of mathematical models and computer software simulations. Such modeling and simulation are important for understanding the technology, but hardware measurement is essential for understanding how BLE devices behave in real operation. In addition, recent literature focuses mostly on the BLE technology itself, leaving possible applications and their analysis out of scope. In this paper, a coin-cell battery-powered BLE data acquisition device, with a 4-in-1 sensor and one accelerometer, is proposed and evaluated with respect to its power consumption. The device is first evaluated in advertising mode with the sensors turned off completely, followed by a power analysis with each sensor individually turned on and transmitting data, and concluding with a power consumption evaluation with both sensors on and broadcasting their data to a mobile phone. The results presented in this paper are real-time measurements of the electrical current consumption of the BLE device, in which the observed energy levels are matched to BLE behavior and sensor activity.
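
A minimal sketch of turning a measured current trace into an average current and a coin-cell lifetime estimate; the trace below is synthetic (a low sleep current with short advertising bursts), and the burst shape, sleep current and battery capacity are illustrative, not the paper's measurements.

```python
# Sketch: average current and battery-life estimate from a sampled current trace.
# The synthetic trace mimics a BLE advertising cycle; all values are illustrative.
import numpy as np

t = np.arange(0, 10.0, 1e-4)                   # 10 s sampled at 10 kHz
i_ma = np.full_like(t, 0.003)                  # ~3 uA sleep current, in mA
burst = (t % 1.0) < 0.003                      # 3 ms advertising event every second
i_ma[burst] = 8.0                              # ~8 mA during radio activity

avg_ma = i_ma.mean()                           # uniform sampling: mean current, mA
battery_mah = 220.0                            # nominal CR2032 coin-cell capacity
lifetime_days = battery_mah / avg_ma / 24.0

print(f"average current: {avg_ma*1000:.1f} uA, estimated lifetime: {lifetime_days:.0f} days")
```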

Keywords: bluetooth low energy, power analysis, BLE advertising cycle, wireless sensor node

Procedia PDF Downloads 64
1793 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale, and fully encrypting large volumes of messages results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are then processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to determine the importance level value within each XML document. The classified content is processed using element-wise encryption for the selected parts with 'High', 'Medium' or 'Low' importance level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm that reduces computational overhead, in which the substitute-byte and shift-row steps remain as in the original AES, while the mix-column operation is replaced by a 128-bit permutation operation followed by the add-round-key operation. An implementation was conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. The results of our implementation showed a clear improvement in processing time when encrypting XML documents.
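
A minimal sketch of the element-wise selection and encryption step: Python's standard ElementTree handles the XML, and Fernet (an AES-based authenticated scheme from the `cryptography` package) stands in for the paper's modified AES. The sample document and the importance map are placeholders for the output of the classification stage.

```python
# Sketch: element-wise encryption of only the important parts of an XML transaction.
# Fernet (AES-based) is a stand-in for the paper's modified AES; the importance map
# is a placeholder for what the classification algorithms would produce.
import xml.etree.ElementTree as ET
from cryptography.fernet import Fernet

xml_doc = """<transaction>
  <account>123456</account>
  <amount>2500.00</amount>
  <branch>Main Street</branch>
</transaction>"""

importance = {"account": "High", "amount": "High", "branch": "Low"}  # placeholder classifier output
key = Fernet.generate_key()
cipher = Fernet(key)

root = ET.fromstring(xml_doc)
for elem in root:
    if importance.get(elem.tag) in ("High", "Medium"):   # encrypt only the selected elements
        elem.text = cipher.encrypt(elem.text.encode()).decode()
        elem.set("enc", "true")

print(ET.tostring(root, encoding="unicode"))
```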

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 377
1792 The Determination of Stress Experienced by Nursing Undergraduate Students during Their Education

Authors: Gülden Küçükakça, Şefika Dilek Güven, Rahşan Kolutek, Seçil Taylan

Abstract:

Objective: Nursing students face stress factors affecting their academic performance and quality of life from the first moments of their educational life. Stress causes physical, psychosocial, and behavioral health problems in students and might damage the formation of professional identity by decreasing the efficiency of education. In addition to determining the stress experienced by nursing students during their education, this study aimed to help review theoretical and clinical education settings so as to bring nursing students' stress to a positive level, and to raise educators' awareness of their own professional behaviors. Methods: The study was conducted with 315 students studying at the nursing department of Semra and Vefa Küçük Health High School, Nevşehir Hacı Bektaş Veli University, in the 2015-2016 academic year who agreed to participate in the study. A 'Personal Information Form' prepared by the researchers based on the literature review and the 'Nursing Education Stress Scale (NESS)' were used in this study. Data were assessed with analysis of variance and correlation analysis. Results: The mean NESS Scale score of the nursing students was 66.46±16.08 points. Conclusions: The stress level experienced by nursing undergraduate students during their education was found to be high. In accordance with this result, it is recommended to determine the sources of stress experienced by nursing undergraduate students during their education and to develop approaches to eliminate these stress sources.

Keywords: stress, nursing education, nursing student, nursing education stress

Procedia PDF Downloads 440
1791 A Study of Basic and Reactive Dyes Removal from Synthetic and Industrial Wastewater by Electrocoagulation Process

Authors: Almaz Negash, Dessie Tibebe, Marye Mulugeta, Yezbie Kassa

Abstract:

Large-scale textile industries use large amounts of toxic chemicals, which are very hazardous to human health and environmental sustainability. In this study, the removal of various dyes from textile industry effluents using the electrocoagulation process was investigated. The studied dyes were Reactive Red 120 (RR-120), Basic Blue 3 (BB-3), and Basic Red 46 (BR-46), which were found in samples collected from the effluents of three major textile factories in the Amhara region, Ethiopia. For maximum removal, BB-3 required acidic conditions (pH 3), RR-120 basic conditions (pH 11), and BR-46 neutral conditions (pH 7). BB-3 required a longer treatment time (80 min) than BR-46 and RR-120, which required 30 and 40 min, respectively. The best removal efficiencies of 99.5%, 93.5%, and 96.3% were achieved for BR-46, BB-3, and RR-120, respectively, from synthetic wastewater containing 10 mg L⁻¹ of each dye at an applied potential of 10 V. The method was applied to real textile wastewaters, and 73.0 to 99.5% removal of the dyes was achieved, indicating that electrocoagulation can be used as a simple and reliable method for the treatment of real wastewater from textile industries and as a potentially viable and inexpensive tool for the treatment of textile dyes. Analysis of the electrochemically generated sludge by X-ray diffraction, scanning electron microscopy, and Fourier transform infrared spectroscopy revealed the expected crystalline aluminum phases, bayerite (Al(OH)3) and diaspore (AlO(OH)), in the sludge, together with an amorphous phase in the floc. Textile industry owners should be aware of the impact of effluent discharge on the ecosystem and should use the investigated electrocoagulation method for effluent treatment before discharge into the environment.
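
A minimal sketch of the removal-efficiency calculation behind the percentages reported above; the residual concentrations are illustrative values chosen to reproduce two of the reported efficiencies.

```python
# Sketch: dye removal efficiency from initial and residual concentrations,
# e.g. obtained from UV-Vis absorbance calibration. Concentrations are illustrative.
def removal_efficiency(c0_mg_l, c_final_mg_l):
    return (c0_mg_l - c_final_mg_l) / c0_mg_l * 100.0

print(f"BR-46: {removal_efficiency(10.0, 0.05):.1f} %")   # ~99.5 % removal
print(f"BB-3 : {removal_efficiency(10.0, 0.65):.1f} %")   # ~93.5 % removal
```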

Keywords: electrocoagulation, aluminum electrodes, Basic Blue 3, Basic Red 46, Reactive Red 120, textile industry, wastewater

Procedia PDF Downloads 11
1790 Generation Mechanism of Opto-Acoustic Wave from in vivo Imaging Agent

Authors: Hiroyuki Aoki

Abstract:

The optoacoustic effect is the conversion of energy from light to sound. In recent years, this effect has been utilized in imaging agents to visualize tumor sites in a living body. An optoacoustic imaging agent absorbs light and emits a sound signal. Because the sound wave can propagate through a living organism with little energy loss, the optoacoustic imaging method enables molecular imaging deep inside the body. To improve the imaging quality of the optoacoustic method, higher signal intensity is desired; however, it has been difficult to enhance the signal intensity of optoacoustic imaging agents because the fundamental mechanism of signal generation is unclear. This study addresses, by experimental and theoretical approaches, the mechanism by which the sound signal is generated from an optoacoustic imaging agent following light absorption. The optoacoustic signal efficiencies of nanoparticles consisting of metal and of polymer were compared, and the polymer particle was found to be better. The heat generation and transfer processes for metal and polymer optoacoustic agents were examined theoretically. It was found that heat generated in the metal particle is rapidly transferred to the water medium, whereas the heat in the polymer particle is confined within it. The heat confined in the small particle induces a large volume expansion, resulting in a large optoacoustic signal for the polymeric particle agent. Thus, we show that heat confinement is a crucial factor in designing highly efficient optoacoustic imaging agents.
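
A minimal sketch of the heat-confinement argument: comparing the characteristic heat-dissipation time R²/α of a metal and a polymer nanoparticle with an assumed excitation pulse duration. The thermal diffusivities and pulse length are rounded order-of-magnitude values used only for illustration.

```python
# Sketch: thermal confinement comparison for metal vs. polymer nanoparticles.
# Heat is "confined" when the dissipation time R^2/alpha exceeds the excitation pulse.
ALPHA = {"gold": 1.2e-4, "polymer": 1.0e-7}    # thermal diffusivity, m^2/s (approximate)

def dissipation_time_ns(radius_nm, material):
    r = radius_nm * 1e-9
    return r * r / ALPHA[material] * 1e9

pulse_ns = 5.0                                  # assumed nanosecond excitation pulse
for material in ALPHA:
    tau = dissipation_time_ns(50.0, material)   # 50 nm radius particle (illustrative)
    confined = "confined" if tau > pulse_ns else "dissipates into water"
    print(f"{material:7s}: tau = {tau:8.3f} ns -> heat {confined}")
```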

Keywords: nano-particle, opto-acoustic effect, in vivo imaging, molecular imaging

Procedia PDF Downloads 103
1789 Experimental Study of Particle Deposition on Leading Edge of Turbine Blade

Authors: Yang Xiao-Jun, Yu Tian-Hao, Hu Ying-Qi

Abstract:

Foreign objects ingested during aircraft engine operation, impurities in the aircraft fuel, and products of incomplete combustion can produce deposits on the surface of the turbine blades. These deposits reduce not only the turbine's operating efficiency but also the life of the turbine blades. In this work, deposition on the leading edge of a turbine blade was simulated in a small open wind tunnel, and the effect of film cooling on particulate deposition was investigated. The adhesion mechanism of molten pollutants reaching the turbine surface was simulated by matching the Stokes number, TSP (a dimensionless number characterizing particle phase transition) and Biot number of the test facility to those of the real engine. The thickness distribution and growth trend of the deposits were observed with a high-power microscope and an infrared camera under different main-flow temperatures, particulate solidification temperatures, and blowing ratios. The experimental results for leading-edge particulate deposition demonstrate that the deposit thickness increases with time until a quasi-stable thickness is reached, and show a striking effect of the blowing ratio on the deposition. Under different blowing ratios there is a large difference in the thickness distribution of the deposit, and the deposition is minimal at a specific blowing ratio. In addition, the main-flow temperature and the solidification temperature of the particulate have a great influence on the deposition.
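
A minimal sketch of the Stokes-number matching mentioned above, using the usual definition Stk = ρ_p d_p² U / (18 μ L); the rig and engine values below are illustrative placeholders, not the study's operating conditions.

```python
# Sketch: Stokes number used to match particle behaviour between the wind-tunnel rig
# and the real engine: Stk = rho_p * d_p^2 * U / (18 * mu * L). Values are illustrative.
def stokes_number(rho_p, d_p, velocity, mu_gas, length_scale):
    return rho_p * d_p**2 * velocity / (18.0 * mu_gas * length_scale)

# Paraffin particle in the rig vs. a molten ash particle in the engine (illustrative)
rig    = stokes_number(rho_p=900.0,  d_p=20e-6, velocity=30.0,  mu_gas=1.8e-5, length_scale=0.05)
engine = stokes_number(rho_p=2500.0, d_p=5e-6,  velocity=300.0, mu_gas=4.5e-5, length_scale=0.04)
print(f"Stk rig = {rig:.2f}, Stk engine = {engine:.2f}")
```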

Keywords: deposition, experiment, film cooling, leading edge, paraffin particles

Procedia PDF Downloads 122
1788 Behavioral and Electroantennographic Responses of the Tea Shot Hole Borer, Euwallacea fornicatus Eichhoff (Scolytidae: Coleoptera), to Volatile Compounds of Montanoa bipinnatifida (Compositae: Asteraceae) and Development of a Kairomone Trap

Authors: Sachin Paul James, Selvasundaram Rajagopal, Muraleedharan Nair, Babu Azariah

Abstract:

The shot hole borer (SHB), Euwallacea fornicatus (= Xyleborus fornicatus) (Scolytidae: Coleoptera), is one of the major pests of tea in southern India and Sri Lanka. The partially dried cut stem of a jungle plant, Montanoa bipinnatifida (C. Koch) (Compositae: Asteraceae), has been reported to attract shot hole borer beetles in the field. Collection, isolation, identification, and quantification of the volatiles emitted from partially dried cut stems of M. bipinnatifida using dynamic headspace sampling and GC-MS revealed the presence of seven compounds: α-pinene, β-phellandrene, β-pinene, D-limonene, trans-caryophyllene, iso-caryophyllene, and germacrene-D. Behavioral bioassays using an electroantennogram (EAG) and a wind tunnel showed that, among these compounds, only α-pinene, trans-caryophyllene, β-phellandrene, and germacrene-D evoked a significant behavioral response, and the maximum response was obtained for a specific blend of these four compounds at a ratio of 10:1:0.1:3. Field trapping experiments with this blend, conducted in an SHB-infested field using multiple funnel traps, further confirmed the efficiency of the blend, with a mean trap catch of 176.7 ± 13.1 beetles. Mass trapping studies in the field helped to develop a kairomone trap for the management of SHB in the tea fields of southern India.

Keywords: electroantennogram, kairomone trap, Montanoa bipinnatifida, tea shot hole borer

Procedia PDF Downloads 198
1787 Phytoextraction of Copper and Zinc by Willow Varieties in a Pot Experiment

Authors: Muhammad Mohsin, Mir Md Abdus Salam, Pertti Pulkkinen, Ari Pappinen

Abstract:

Soil and water contamination by heavy metals is a major environmental challenge. Phytoextraction is an emerging, environmentally friendly, and cost-efficient technology in which plants are used to remove pollutants from soil and water. We aimed to assess the copper (Cu) and zinc (Zn) removal efficiency of two willow varieties, Klara (S. viminalis x S. schwerinii x S. dasyclados) and Karin ((S. schwerinii x S. viminalis) x (S. viminalis x S. burjatica)), under different soil treatments (unpolluted control, polluted, polluted with lime, polluted with wood ash). In a 180-day pot experiment, these willow varieties were grown in highly polluted soil collected from the Pyhasalmi mining area in Finland. Lime and wood ash were added to the polluted soil to raise the soil pH and to observe their effects on metal accumulation in plant biomass. An inductively coupled plasma optical emission spectrometer (ELAN 6000 ICP-OES, Perkin-Elmer Corporation) was used to determine heavy metal concentrations in the plant biomass. The results show that both willow varieties can accumulate considerable amounts of Cu and Zn, ranging from 36.95 to 314.80 mg kg⁻¹ and from 260.66 to 858.70 mg kg⁻¹, respectively. The application of lime and wood ash substantially stimulated plant height, dry biomass, and the deposition of Cu and Zn into total plant biomass. Moreover, lime application appeared to increase Cu and Zn concentrations in the shoots and leaves of both willow varieties grown in polluted soil, whereas wood ash application was more effective at mobilizing the metals into the roots of both varieties. The study recommends willow plantations for the rehabilitation of Cu- and Zn-polluted soils.
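
For readers who want to translate tissue concentrations of this kind into per-plant uptake or a bioconcentration factor, a minimal sketch follows. The Cu and Zn concentrations reuse the upper ends of the reported ranges, but the dry-biomass and soil-concentration figures are assumed for illustration and are not measurements from this study.

# Hedged sketch: deriving total metal uptake and a bioconcentration factor (BCF)
# from tissue concentration and dry biomass. Illustrative values only.

def metal_uptake_mg(conc_mg_per_kg, dry_biomass_g):
    """Total metal in a tissue fraction [mg] = concentration [mg/kg] * biomass [kg]."""
    return conc_mg_per_kg * dry_biomass_g / 1000.0

def bioconcentration_factor(conc_plant_mg_per_kg, conc_soil_mg_per_kg):
    """BCF = metal concentration in plant tissue / metal concentration in soil."""
    return conc_plant_mg_per_kg / conc_soil_mg_per_kg

# Example: assumed 12 g dry biomass and 2500 mg/kg Zn in soil
uptake_cu = metal_uptake_mg(conc_mg_per_kg=314.8, dry_biomass_g=12.0)            # ~3.8 mg Cu
bcf_zn = bioconcentration_factor(conc_plant_mg_per_kg=858.7, conc_soil_mg_per_kg=2500.0)
print(f"Cu uptake per plant: {uptake_cu:.2f} mg, Zn BCF: {bcf_zn:.2f}")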

Keywords: heavy metals, lime, phytoextraction, wood ash, willow

Procedia PDF Downloads 204
1786 Applications of Digital Tools, Satellite Images and Geographic Information Systems in Data Collection of Greenhouses in Guatemala

Authors: Maria A. Castillo H., Andres R. Leandro, Jose F. Bienvenido B.

Abstract:

During the last 20 years, the globalization of economies, population growth, and increasing consumption of fresh agricultural products have generated greater demand for ornamentals, flowers, fresh fruits, and vegetables, mainly from tropical areas. This market situation demands greater competitiveness and control over production, using more efficient protected-agriculture technologies that provide higher productivity and guarantee the required quality and quantity in a constant and sustainable way. Guatemala, located in the north of Central America, is one of the largest exporters of agricultural products in the region; it exports fresh vegetables, flowers, fruits, ornamental plants, and foliage, most of which are grown in greenhouses. Although there are no official agricultural statistics on greenhouse production, several theses and congress reports have presented consistent estimates. A wide range of protection structures and roofing materials are used, from basic structures for simple rain control to highly technical, automated structures connected to remote sensors for crop monitoring and control. Given this breadth of technological models, it is necessary to analyze georeferenced data on the cultivated area, the existing structure models, and the covering materials, integrated with altitude, climate, and soil data. Georeferenced registration of production units, data collection with digital tools, the use of satellite images, and geographic information systems (GIS) provide reliable means of building more complete, agile, and dynamic information maps. This study details a proposed methodology for gathering georeferenced data on high protection structures (greenhouses) in Guatemala, structured in four phases: diagnosis of available information, definition of the geographic frame, selection of satellite images, and integration with a geographic information system (GIS). It specifically addresses the current lack of complete data needed for a reliable decision-making system; the proposed methodology closes this gap. A summary of results is presented for each phase, followed by an evaluation with suggested improvements and tentative recommendations for further research. The main contribution of this study is a methodology that reduces the georeferenced-data gap in protected agriculture in a region where such data are not generally available, providing data with better quality, traceability, accuracy, and certainty for strategic agricultural decision-making, applicable to other crops, production models, and similar or neighboring geographic areas.
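
As an illustration of the final integration phase, the following sketch builds a georeferenced registry of greenhouse records collected with digital field tools and exports it for overlay with satellite-derived layers in a GIS. The field names, coordinates, and output file name are illustrative assumptions, not the study's actual schema.

# Hedged sketch of GIS integration: georeferenced greenhouse registry.
# Requires pandas and geopandas; all record values are assumed examples.

import pandas as pd
import geopandas as gpd

# Example field records (assumed structure: one row per production unit)
records = pd.DataFrame({
    "unit_id": ["GT-001", "GT-002"],
    "structure_type": ["multi-span greenhouse", "macro-tunnel"],
    "cover_material": ["polyethylene film", "shade net"],
    "area_m2": [5200.0, 1800.0],
    "lon": [-90.55, -90.73],   # WGS84 longitude (illustrative)
    "lat": [14.62, 14.85],     # WGS84 latitude (illustrative)
})

# Build a GeoDataFrame in WGS84 and export to GeoJSON so the registry can be
# overlaid with altitude, climate, and soil layers in a GIS.
gdf = gpd.GeoDataFrame(
    records,
    geometry=gpd.points_from_xy(records["lon"], records["lat"]),
    crs="EPSG:4326",
)
gdf.to_file("greenhouse_registry.geojson", driver="GeoJSON")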

Keywords: greenhouses, protected agriculture, GIS, Guatemala, satellite image, digital tools, precision agriculture

Procedia PDF Downloads 165