Search results for: three dimensional data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26674

20464 Measurement and Modelling of HIV Epidemic among High Risk Groups and Migrants in Two Districts of Maharashtra, India: An Application of Forecasting Software-Spectrum

Authors: Sukhvinder Kaur, Ashok Agarwal

Abstract:

Background: In 2009, India was for the first time able to generate estimates of HIV incidence (the number of new HIV infections per year). Analysis of epidemic projections revealed that the number of new annual HIV infections in India had declined by more than 50% during the preceding decade (GOI Ministry of Health and Family Welfare, 2010). The National AIDS Control Organisation (NACO) then planned to scale up its efforts in generating projections through epidemiological analysis and modelling, drawing on recent sources of evidence such as HIV Sentinel Surveillance (HSS), India Census data and other critical data sets. NACO generated the current round of HIV estimates (2012) using the globally recommended tool Spectrum, producing estimates of adult HIV prevalence, annual new infections, the number of people living with HIV, AIDS-related deaths and treatment needs. The state-level prevalence and incidence projections were used in Spectrum to project the consequences of the epidemic. Building on these state-level estimates, the USAID-funded PIPPSE project, under the leadership of NACO, extended the estimations and projections to the district level using the same Spectrum software. In 2011, adult HIV prevalence in Maharashtra, one of the high-prevalence states, was 0.42%, ahead of the national average of 0.27%. Considering the heterogeneity of the HIV epidemic between districts, two districts of Maharashtra, Thane and Mumbai, were selected to estimate and project the number of people living with HIV/AIDS (PLHIV), HIV prevalence among adults and annual new HIV infections up to 2017.
Methodology: Inputs to Spectrum included demographic data from the Census of India since 1980 and the Sample Registration System; programmatic data on 'Alive and on ART (adults and children)', 'Mother-Baby pairs under PPTCT' and 'High Risk Group (HRG) size mapping estimates'; and surveillance data from various rounds of HSS, the National Family Health Survey-III, the Integrated Biological and Behavioural Assessment and Behavioural Sentinel Surveillance. Major Findings: Assuming current programmatic interventions in these districts, new infections among HRGs and migrants are estimated to decrease by 12 percentage points in Thane and 31 percentage points in Mumbai between 2011 and 2017. Conclusions: The project also validated the decrease in new HIV infections among one of the high-risk groups, FSWs, using programme cohort data from 2012 to 2016. Although HIV prevalence and new infections are decreasing in Thane and Mumbai, further decline is possible if appropriate programme responses, strategies and interventions are designed for specific target groups based on this evidence. Moreover, the evidence needs to be validated with other estimation/modelling techniques, and similar evidence can be generated for other districts of the state where HIV prevalence is high and reliable data sources are available, to understand the epidemic in its local context.

Keywords: HIV sentinel surveillance, high risk groups, projections, new infections

Procedia PDF Downloads 206
20463 Avoiding Medication Errors in Juvenile Facilities

Authors: Tanja Salary

Abstract:

This study addresses a gap in the research and adds to the body of knowledge regarding medication errors in a juvenile justice facility. It introduces data collected on medication errors in a juvenile justice facility and explores the contributing factors related to those errors. The data represent electronic incident records of medication errors documented from 2011 through 2019. In addition, the study reviews both current and historical empirical research on patient safety standards and quality of care, comparing traditional healthcare facilities with juvenile justice residential facilities. The theoretical/conceptual framework for the study draws on Bandura and Adams's (1977) self-efficacy theory of behavioral change and Mark Friedman's results-based accountability theory (2005). Despite the lack of evidence in previous studies about addressing medication errors in juvenile justice facilities, this presenter relays information that adds to the body of knowledge on assessing the potential contributors to medication errors. Implications for further research include recommendations for more education and training, and for increased communication among juvenile justice staff, including the nurses who administer medications to juveniles, to ensure adherence to patient safety standards. There are several opportunities for future research on other factors that may affect medication administration errors within residential juvenile justice facilities.

Keywords: juvenile justice, medication errors, psychotropic medications, behavioral health, juveniles, incarcerated youth, recidivism, patient safety

Procedia PDF Downloads 55
20462 Use of Chlorophyll Meters to Assess In-Season Wheat Nitrogen Fertilizer Requirements in the Southern San Joaquin Valley

Authors: Brian Marsh

Abstract:

Nitrogen fertilizer is the most used and often the most mismanaged nutrient input. Nitrogen management has tremendous implications for crop productivity, quality and environmental stewardship. Sufficient nitrogen is needed for optimum yield and quality, but soil and in-season plant tissue testing for nitrogen status is a time-consuming and expensive process. Real-time sensing of plant nitrogen status can be a useful tool in managing nitrogen inputs. The objectives of this project were to assess the reliability of remotely sensed, non-destructive plant nitrogen measurements compared with wet chemistry data from sampled plant tissue; to develop in-season nitrogen recommendations based on remotely sensed data for improved nitrogen use efficiency; and to assess the potential for determining yield and quality from remotely sensed data. Very good correlations were observed between early-season remotely sensed crop nitrogen status and plant nitrogen concentrations and subsequent in-season fertilizer recommendations. The transmittance/absorbance type meters gave the most accurate readings. The early in-season fertilizer recommendation would be to apply 40 kg nitrogen per hectare plus 16 kg nitrogen per hectare for each unit of difference measured with the SPAD meter between the crop and a reference area, or 25 kg plus 13 kg per hectare for each unit of difference measured with the CCM 200. Once the crop was sufficiently fertilized, meter readings became inconclusive and were of no benefit for determining nitrogen status, silage yield and quality, or grain yield and protein.
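The recommendation rule stated in the abstract can be expressed directly as arithmetic. The sketch below encodes the two published base/per-unit rates; the function name, the meter keys and the clamp at zero deficit are illustrative assumptions, not part of the study.

```python
def n_recommendation_kg_ha(meter, reference_reading, crop_reading):
    # Base rate plus a per-unit rate for each unit the crop reads below
    # a well-fertilized reference area, per the abstract's rule.
    # Meter keys and the clamp at zero are illustrative assumptions.
    base, per_unit = {"SPAD": (40, 16), "CCM200": (25, 13)}[meter]
    deficit = max(0.0, reference_reading - crop_reading)
    return base + per_unit * deficit
```

For example, a crop reading 2 SPAD units below the reference would receive 40 + 16 × 2 = 72 kg N/ha.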

Keywords: wheat, nitrogen fertilization, chlorophyll meter

Procedia PDF Downloads 385
20461 A Comparative Study between Different Techniques of Off-Page and On-Page Search Engine Optimization

Authors: Ahmed Ishtiaq, Maeeda Khalid, Umair Sajjad

Abstract:

In the fast-moving world, information is the key to success, and work becomes easy when information is easily available. The Internet is the biggest collection and source of information nowadays; the data on the internet grow every single day, and it becomes difficult to find the required data. Everyone wants his or her website at the top of the search results. This is possible when SEO techniques are applied inside the application (on-page SEO) or outside it (off-page SEO). SEO is an abbreviation of Search Engine Optimization: a set of techniques and methods to increase the users of a website on the World Wide Web or to raise the ranking of a website in search engine indexing. In this paper, we compare different techniques of on-page and off-page SEO, suggest many changes that should be made inside and outside the web page, and identify the most powerful elements and techniques that search engines consider in both types of SEO in order to gain a high ranking on a search engine.

Keywords: auto-suggestion, search engine optimization, SEO, query, web mining, web crawler

Procedia PDF Downloads 140
20460 Adaptive Beamforming with Steering Error and Mutual Coupling between Antenna Sensors

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Owing to close antenna spacing within a compact space, part of the data in one antenna sensor leaks into other antenna sensors when the sensors in an antenna array operate simultaneously. This phenomenon is called the mutual coupling effect (MCE). It has been shown that the performance of antenna array systems degrades when the antenna sensors are in close proximity; in particular, in systems equipped with massive numbers of antenna sensors, degradation of beamforming performance due to the MCE is practically inevitable. Moreover, it has been shown that even a small error between the true direction angle of the desired signal and the steering angle deteriorates the effectiveness of an array beamforming system. However, the true direction vector of the desired signal may not be exactly known in some applications, e.g., in land mobile-cellular wireless systems. Therefore, it is worth developing robust techniques to deal with the problems caused by the MCE and steering angle error in array beamforming systems. In this paper, we present an efficient technique for performing adaptive beamforming that is robust against both the MCE and the steering angle error. Only the data vector received by the antenna array is required by the proposed technique. From the received array data vector, a correlation matrix is constructed to replace the original correlation matrix associated with the received array data vector. Then, the mutual coupling matrix due to the MCE on the antenna array is estimated through a recursive algorithm. An appropriate estimate of the direction angle of the desired signal can also be obtained during the recursive process. Based on the estimated mutual coupling matrix, the estimated direction angle, and the reconstructed correlation matrix, the proposed technique effectively cures the performance degradation due to steering angle error and the MCE.
The novelty of the proposed technique is that the implementation procedure is very simple while the resulting adaptive beamforming performance is satisfactory. Simulation results show that the proposed technique provides much better beamforming performance, without requiring high computational complexity, than the existing robust techniques.
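For readers unfamiliar with the baseline the abstract builds on, the sketch below shows a standard minimum-variance distortionless-response (MVDR) beamformer with diagonal loading, a common simple robustness device against steering and estimation errors. This is not the paper's recursive estimator, only a reference point; the array geometry and loading level are assumptions.

```python
import numpy as np

def steering_vector(n, angle_deg, spacing=0.5):
    # Uniform linear array response; spacing in wavelengths (assumed).
    phase = 2 * np.pi * spacing * np.sin(np.deg2rad(angle_deg))
    return np.exp(1j * phase * np.arange(n))

def mvdr_weights(R, a, loading=1e-3):
    # MVDR weights w = R^-1 a / (a^H R^-1 a), with diagonal loading
    # proportional to the average eigenvalue for robustness.
    n = len(a)
    R_loaded = R + loading * (np.trace(R).real / n) * np.eye(n)
    Ria = np.linalg.solve(R_loaded, a)
    return Ria / (a.conj() @ Ria)
```

The distortionless property holds by construction: the weight vector's response in the steered direction equals one regardless of the loading level.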

Keywords: adaptive beamforming, mutual coupling effect, recursive algorithm, steering angle error

Procedia PDF Downloads 315
20459 The Quality Assurance on the Standard of Private Schools in Bangkok

Authors: Autjira Songjan, Poramatdha Chutimant

Abstract:

This research studies the operation of quality assurance in private schools in Bangkok according to the opinions of administrators and teachers, and compares those opinions across gender, job position and work experience. The sample comprised administrators and teachers of private schools in Bangkok, selected using a proportional random sampling technique. A questionnaire on quality assurance operations was used to collect the data, which were analyzed using percentages, means, standard deviations, tests of mean differences and analysis of variance. The research found that administrators and teachers of different sex, position and duties held opinions about quality assurance that did not differ at the 0.05 level of statistical significance for the elements of performance management and the quality of the service provided to students in the school.

Keywords: educational quality assurance, performance management, private schools in Bangkok, quality of the service

Procedia PDF Downloads 221
20458 Towards an Environmental Knowledge System in Water Management

Authors: Mareike Dornhoefer, Madjid Fathi

Abstract:

Water supply and water quality are key problems of mankind now and, due to the increasing population, in the future. Management disciplines like water, environment and quality management therefore need to interact closely to establish a high level of water quality and to guarantee water supply in all parts of the world. Groundwater remediation is one aspect of this process. From a knowledge management perspective, complex ecological or environmental problems can only be solved if the different factors, the expert knowledge of various stakeholders and the formal regulations regarding water, waste or chemical management are interconnected in the form of a knowledge base. In general, knowledge management focuses on the processes of gathering and representing existing and new knowledge in a way which allows for inference or deduction of knowledge, e.g., in a situation where a problem solution or decision support is required. A knowledge base is not a mere data repository but a key element of a knowledge-based system, providing inference mechanisms to deduce further knowledge from existing facts and, in consequence, to support decisions. This paper introduces an environmental knowledge system for water management. The proposed environmental knowledge system is part of a research concept called Green Knowledge Management. It applies semantic technologies and concepts such as ontologies and linked open data to interconnect different data and information sources about environmental aspects, in this case water quality, as well as background material enriching an established knowledge base. Examples of the aforementioned ecological or environmental factors threatening water quality include industrial pollution (e.g. leakage of chemicals), environmental changes (e.g. rise in temperature) and floods, where all kinds of waste are merged and transferred into natural water environments.
Water quality is usually determined by measuring different indicators (e.g. chemical or biological), which are gathered through laboratory testing, continuous monitoring equipment or other measuring processes. During all of these processes, data are gathered and stored in different databases. The knowledge base is then established by interconnecting the data from these different sources and enriching their semantics. Experts may add their knowledge or their experience of previous incidents or influencing factors. Querying and inference mechanisms are then applied to deduce coherence between indicators, predictive developments or environmental threats. Relevant processes or steps of action may be modeled in a rule-based approach. Overall, the environmental knowledge system supports the interconnection of information and the addition of semantics to create environmental knowledge about the water environment, supply chain and quality. The proposed concept is a holistic approach which links to associated disciplines like environmental and quality management. Quality indicators and quality management steps need to be considered, e.g., for the process and inference layers of the environmental knowledge system, thus integrating the aforementioned management disciplines in one water management application.

Keywords: water quality, environmental knowledge system, green knowledge management, semantic technologies, quality management

Procedia PDF Downloads 212
20457 Private Coded Computation of Matrix Multiplication

Authors: Malihe Aliasgari, Yousef Nejatbakhsh

Abstract:

The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data are dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are the few slow or delay-prone processors that can bottleneck an entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as a sufficient number of processors, determined by the minimum distance of the code, have completed their operations. Matrix-matrix multiplication over practically large data sets faces computational and memory-related difficulties, which is why such operations are carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y. This operation is a fundamental building block of many science and engineering fields, such as machine learning, image and signal processing, wireless communication and optimization. Both non-secure and secure matrix multiplication are studied.
We study the setup in which the identity of the matrix of interest should be kept private from the workers, and we obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their tasks before the master server can recover the product W. In the secure and private distributed matrix multiplication W = XY, the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
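The straggler-mitigation idea behind coded computation can be illustrated with the smallest possible example: split X row-wise into two blocks and add one parity block, giving an MDS-style (3, 2) code in which any two of the three worker results recover W = XY. This toy sketch is not the paper's PSGPD scheme and carries no privacy guarantees; it only demonstrates the recovery-threshold concept.

```python
import numpy as np

def encode_tasks(X):
    # Row-split X into two blocks plus one parity block.
    # Workers 0, 1, 2 will compute X1 @ Y, X2 @ Y, (X1 + X2) @ Y.
    X1, X2 = np.vsplit(X, 2)
    return [X1, X2, X1 + X2]

def recover(done):
    # done maps task index -> that worker's result X_i @ Y.
    # Any two finished tasks suffice; one straggler is ignored.
    if 0 in done and 1 in done:
        T1, T2 = done[0], done[1]
    elif 0 in done:
        T1 = done[0]
        T2 = done[2] - T1          # parity minus first block
    else:
        T2 = done[1]
        T1 = done[2] - T2          # parity minus second block
    return np.vstack([T1, T2])
```

Here the recovery threshold is 2: the master can reconstruct W as soon as any two of the three workers respond, whichever one straggles.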

Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers

Procedia PDF Downloads 111
20456 Meeting the Energy Balancing Needs in a Fully Renewable European Energy System: A Stochastic Portfolio Framework

Authors: Iulia E. Falcan

Abstract:

The transition of the European power sector towards a clean, renewable energy (RE) system faces the challenge of meeting power demand at a reasonable cost in times of low wind speed and low solar radiation. This is likely to be achieved through a combination of 1) energy storage technologies, 2) development of the cross-border power grid, 3) installed overcapacity of RE and 4) dispatchable power sources such as biomass. This paper uses NASA-derived hourly data on the weather patterns of sixteen European countries over the past twenty-five years, together with load data from the European Network of Transmission System Operators for Electricity (ENTSO-E), to develop a stochastic optimization model. The model aims to understand the synergies between the four classes of technologies mentioned above and to determine the optimal configuration of the energy technology portfolio. While this issue has been addressed before, previous work used deterministic models that extrapolated historic data on weather patterns and power demand and ignored the risk of an unbalanced grid, a risk stemming from both the supply and the demand side. This paper explicitly accounts for the inherent uncertainty in the energy system transition. It articulates two levels of uncertainty: a) the inherent uncertainty in future weather patterns and b) the uncertainty of fully meeting power demand. The first level is addressed by developing probability distributions for future weather data, and thus for the expected power output from RE technologies, rather than assuming known future power output. The second level is operationalized by introducing a Conditional Value at Risk (CVaR) constraint in the portfolio optimization problem. By setting the risk threshold at different levels (1%, 5% and 10%), important insights are revealed regarding the synergies of the different energy technologies, i.e., the circumstances under which they behave as either complements or substitutes to each other.
The paper concludes that allowing for uncertainty in expected power output - rather than extrapolating historic data - paints a more realistic picture and reveals important departures from results of deterministic models. In addition, explicitly acknowledging the risk of an unbalanced grid - and assigning it different thresholds - reveals non-linearity in the cost functions of different technology portfolio configurations. This finding has significant implications for the design of the European energy mix.
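The CVaR constraint mentioned above has a simple sample-based form: at level alpha, CVaR is the average loss over the worst alpha fraction of scenarios. The sketch below shows that estimator on generic scenario losses (e.g. hours of unserved demand); the scenario data and the loss definition are assumptions for illustration, not the paper's model.

```python
import numpy as np

def cvar(losses, alpha=0.05):
    # Conditional Value at Risk at level alpha: the mean loss over
    # the worst alpha fraction of scenarios. Sort descending and
    # average the top ceil(alpha * n) values.
    losses = np.sort(np.asarray(losses, dtype=float))[::-1]
    k = max(1, int(np.ceil(alpha * len(losses))))
    return losses[:k].mean()
```

A portfolio optimizer would then require `cvar(scenario_losses, alpha) <= threshold` as a constraint, with the 1%, 5% and 10% levels of the study corresponding to different choices of alpha.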

Keywords: cross-border grid extension, energy storage technologies, energy system transition, stochastic portfolio optimization

Procedia PDF Downloads 157
20455 The Consumer Behavior and the Customer Loyalty of CP Fresh Mart Consumers in Bangkok

Authors: Kanmanas Muensak, Somphoom Saweangkun

Abstract:

The objectives of this research were to study the consumer behavior that affects customer loyalty to CP Fresh Mart in Bangkok. The sample comprised 400 consumers over 15 years old who made purchases through CP Fresh Mart in Bangkok. Questionnaires were used as the data gathering instrument, and the data were analyzed using percentages, means, standard deviations, independent-sample t-tests, two-way ANOVA, least significant difference tests and Pearson's correlation coefficient. Hypothesis testing showed that respondents of different gender, age, level of education, income, marital status and occupation differed in consumer behavior related to customer loyalty to CP Fresh Mart, and that the loyalty factors of re-purchase, word of mouth, price sensitivity, promotion, process and personnel had a positive relationship with consumer behavior towards CP Fresh Mart in Bangkok.

Keywords: consumers in Bangkok, consumer behavior, customer loyalty, CP Fresh Mart, operating budget

Procedia PDF Downloads 318
20454 Comparisons of Surveying with Terrestrial Laser Scanner and Total Station for Volume Determination of Overburden and Coal Excavations in Large Open-Pit Mine

Authors: B. Keawaram, P. Dumrongchai

Abstract:

The volume of overburden and coal excavations in an open-pit mine is generally determined by conventional survey methods such as the total station. This study aimed to evaluate the accuracy of a terrestrial laser scanner (TLS) used to measure overburden and coal excavations, and to compare TLS survey data sets with those of the total station. Results revealed that the reference points measured with the total station showed 0.2 mm precision for both horizontal and vertical coordinates. When using TLS on the same points, standard deviations of 4.93 cm and 0.53 cm were achieved for horizontal and vertical coordinates, respectively. For volume measurements covering a mining area of 79,844 m2, TLS yielded a mean difference of about 1% and a surface error margin of 6 cm at the 95% confidence level when compared with the volume obtained by the total station.
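The volume comparison underlying results like the 1% mean difference can be illustrated with the simplest grid-prism method: sum the elevation change per cell times the cell footprint, then express the TLS-vs-total-station disagreement as a percentage. This is a simplified stand-in for the TIN/grid volume routines survey software actually uses; the grids and function names are assumptions.

```python
def cut_volume(before, after, cell_area):
    # Prism method on two gridded surfaces (same shape): sum of
    # elevation drop per cell times the cell footprint area.
    return sum((b - a) * cell_area
               for b_row, a_row in zip(before, after)
               for b, a in zip(b_row, a_row))

def percent_difference(v_tls, v_ts):
    # Relative disagreement of the TLS volume, taking the
    # total-station volume as the reference value.
    return 100.0 * abs(v_tls - v_ts) / v_ts
```

For example, four 2 m x 2 m cells each excavated by 1 m give 4 x 1 x 4 = 16 cubic metres.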

Keywords: mine, survey, terrestrial laser scanner, total station

Procedia PDF Downloads 371
20453 Satellite Derived Snow Cover Status and Trends in the Indus Basin Reservoir

Authors: Muhammad Tayyab Afzal, Muhammad Arslan, Mirza Muhammad Waqar

Abstract:

Snow constitutes an important component of the cryosphere, characterized by high temporal and spatial variability. Because of the contribution of snowmelt to water availability, snow is an important focus for research on climate change and adaptation. MODIS satellite data have been used to identify spatial-temporal trends in snow cover in the upper Indus basin. For this research, MODIS 8-day composite data of medium resolution (250 m) were analysed from 2001 to 2005. Pixel-based supervised classification was performed, and the snow extent was calculated for all the images. Results show large variation in snow cover between years, while an increasing trend from west to east is observed. Temperature data for the Upper Indus Basin (UIB) were analysed for seasonal and annual trends over the period 2001-2005 and calibrated against the results acquired by this research. From the analysis, it is concluded that there are indications that regional warming is one of the factors affecting the hydrology of the upper Indus basin through accelerated glacial melting; during the simulation period, stream flow in the upper Indus basin could be predicted with a high degree of accuracy. This conclusion is also supported by research at ICIMOD, which observed that the average annual precipitation over a five-year period is less than the observed stream flow, and by positive temperature trends in all seasons.
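The study used pixel-based supervised classification; a common, simpler alternative for per-pixel snow mapping from MODIS is a Normalized Difference Snow Index (NDSI) threshold, sketched below purely for illustration. The band choices (green and shortwave-infrared reflectance) and the ~0.4 threshold are conventional assumptions, not the paper's method.

```python
def ndsi(green, swir):
    # Normalized Difference Snow Index from green (MODIS band 4)
    # and shortwave-infrared (MODIS band 6) reflectance.
    return (green - swir) / (green + swir)

def is_snow(green, swir, threshold=0.4):
    # Snow is highly reflective in the visible but absorbs in the
    # SWIR, so snow pixels typically have NDSI above about 0.4.
    return ndsi(green, swir) > threshold
```

Snow extent per image is then the count of snow pixels times the 250 m x 250 m pixel area.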

Keywords: indus basin, MODIS, remote sensing, snow cover

Procedia PDF Downloads 378
20452 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. 
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
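The graph representation and probabilistic sampling steps described above can be sketched minimally: addresses (or blocks) become nodes with directed, count-weighted edges, and a one-pass reservoir sample stands in for the subset-selection step. The data layout and the use of reservoir sampling specifically are assumptions for illustration; the paper's sampling strategy may be more elaborate.

```python
import random
from collections import defaultdict

def build_tx_graph(transactions):
    # Addresses as nodes; directed edges weighted by transfer count.
    # A minimal version of the graph-based representation above.
    graph = defaultdict(lambda: defaultdict(int))
    for sender, receiver in transactions:
        graph[sender][receiver] += 1
    return graph

def reservoir_sample(stream, k, seed=0):
    # Uniform k-sample over a transaction stream of unknown length:
    # one pass, O(k) memory. Item i replaces a reservoir slot with
    # probability k / (i + 1), keeping the sample uniform.
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            j = rng.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample
```

Downstream analysis (e.g. the GCN stage) would then operate on the graph induced by the sampled transactions rather than the full chain.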

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 63
20451 ACBM: Attention-Based CNN and Bi-LSTM Model for Continuous Identity Authentication

Authors: Rui Mao, Heming Ji, Xiaoyu Wang

Abstract:

Keystroke dynamics are widely used in identity recognition. They have the advantage that an individual's typing rhythm is difficult to imitate, and they support continuous authentication through the keyboard without extra devices. Existing keystroke dynamics authentication methods based on machine learning have drawbacks in supporting relatively complex scenarios with massive data, affecting both feature extraction and model optimization. To overcome these weaknesses, an authentication model for keystroke dynamics based on deep learning is proposed. The model uses feature vectors formed from keystroke content and keystroke timing, and it ensures efficient continuous authentication by coupling an attention mechanism with a combination of CNN and Bi-LSTM. The model has been tested with the Open Data Buffalo dataset, and the results show an FRR of 3.09%, an FAR of 3.03%, and an EER of 4.23%. This demonstrates that the model is efficient and accurate for continuous authentication.
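The FRR, FAR and EER metrics reported above have standard definitions that are easy to state in code. The sketch below computes them from genuine and impostor match scores at a decision threshold; the score values and the accept-when-above convention are illustrative assumptions.

```python
def far_frr(genuine, impostor, threshold):
    # Accept when score >= threshold.
    # FRR: fraction of genuine attempts rejected.
    # FAR: fraction of impostor attempts accepted.
    frr = sum(s < threshold for s in genuine) / len(genuine)
    far = sum(s >= threshold for s in impostor) / len(impostor)
    return far, frr

def equal_error_rate(genuine, impostor, thresholds):
    # EER: the error rate at the threshold where FAR and FRR cross,
    # approximated here as their average at the closest threshold.
    def gap(t):
        far, frr = far_frr(genuine, impostor, t)
        return abs(far - frr)
    t_star = min(thresholds, key=gap)
    far, frr = far_frr(genuine, impostor, t_star)
    return (far + frr) / 2
```

Lowering the threshold trades FRR for FAR and vice versa; the EER summarizes the operating curve in a single number.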

Keywords: keystroke dynamics, identity authentication, deep learning, CNN, LSTM

Procedia PDF Downloads 143
20450 Prevalence of Listeria and Salmonella Contamination in FDA-Recalled Foods

Authors: Oluwatofunmi Musa-Ajakaiye, Paul Olorunfemi M.D MPH, John Obafaiye

Abstract:

Introduction: The U.S. Food and Drug Administration (FDA) publishes public notices for recalled FDA-regulated products. This study reviewed the primary reasons for product recalls over a period of 7 years. Methods: The study analyzed data provided in the FDA's archived recalls for the years 2010-2017. It identified the various reasons for product recalls in the categories of foods, beverages, drugs, medical devices, animal and veterinary products, and dietary supplements. Using SPSS version 29, descriptive statistics and chi-square analysis of the data were performed. Results: Over the period of analysis, a total of 931 recalls were reported. The most frequent reason for recalls was undeclared products (36.7%). The analysis showed that the most recalled product type was foods and beverages, representing 591 of all recalled products (63.5%). In addition, foods and beverages represent 77.2% of products recalled due to the presence of microorganisms. A sub-group analysis of the recall reasons for foods and beverages found that the most prevalent reason was undeclared products (50.1%), followed by Listeria (17.3%) and Salmonella (13.2%). Conclusion: This analysis shows that foods and beverages account for the greatest percentage of total recalls, due to the presence of undeclared products, Listeria contamination and Salmonella contamination. The prevalence of Salmonella and Listeria contamination suggests a high risk of microbial contamination in FDA-regulated products, and further studies on the effects of such contamination must be conducted to ensure consumer safety.

Keywords: food, beverages, listeria, salmonella, FDA, contamination, microbial

Procedia PDF Downloads 56
20449 Retail Managers’ Perception on Coca-Cola Company’s Success of Glass Package Recovery and Recycling in Nairobi, Kenya

Authors: Brigitte Wabuyabo-Okonga

Abstract:

Little research has been done to establish the level of success of the Coca-Cola Company in recycling and reusing its glass bottles. This paper attempts to establish retail managers' perception of the company's self-acclaimed success. Retail managers of supermarkets in the CBD of Nairobi, Kenya were considered for the study. Data were collected through questionnaires and analyzed using descriptive statistics (means, frequencies, and percentages) and inferential statistics (correlation analysis). The study found that there is relative success, although a lot remains to be done. For example, the company could improve the communication of policy issues and, in practice, enhance the actual collection of broken and unbroken Coca-Cola Company glass bottles by providing drop-off points in open areas such as streets and parks.

Keywords: Coca Cola Company glass bottles, Kenya, Nairobi, packaging, retail manager

Procedia PDF Downloads 308
20448 Public-Private Partnership for Community Empowerment and Sustainability: Exploring Save the Children’s 'School Me' Project in West Africa

Authors: Gae Hee Song

Abstract:

This paper aims to address the evolution of public-private partnerships for mainstreaming an evaluation approach in a community-based education project. It examines the distinctive features of Save the Children's School Me project in terms of the empowerment evaluation principles introduced by David M. Fetterman, especially community ownership, capacity building, and organizational learning. School Me is a Save the Children Korea-funded project that has been implemented in Cote d'Ivoire and Sierra Leone since 2016. The objective of this project is to reduce gender-based disparities in school completion and learning outcomes by creating an empowering learning environment for girls and boys. Both quasi-experimental and experimental methods for impact evaluation have been used to explore changes in learning outcomes, gender attitudes, and learning environments. To locate School Me in the public-private partnership framework for community empowerment and sustainability, data have been collected from School Me progress and final reports, baseline and endline reports, fieldwork observations, and inter-rater reliability checks of baseline and endline data collected from a total of 75 schools in Cote d'Ivoire and Sierra Leone. The findings of this study show that the School Me project has a significant evaluation component, including qualitative exploratory research, participatory monitoring, and impact evaluation. It strongly encourages key actors, girls, boys, parents, teachers, community leaders, and local education authorities, to participate in the collection and interpretation of data. For example, 45 community volunteers collected baseline data in Cote d'Ivoire, while three local government officers and fourteen enumerators participated in the follow-up data collection in Sierra Leone. 
Not only does this public-private partnership improve local government and community members’ knowledge and skills of monitoring and evaluation, but the evaluative findings also help them find their own problems and solutions with a strong sense of community ownership. Such community empowerment enables Save the Children country offices and member offices to gain invaluable experiences and lessons learned. As a result, empowerment evaluation leads to community-oriented governance and the sustainability of the School Me project.

Keywords: community empowerment, Cote d’Ivoire, empowerment evaluation, public-private partnership, save the children, school me, Sierra Leone, sustainability

Procedia PDF Downloads 117
20447 Irradion: Portable Small Animal Imaging and Irradiation Unit

Authors: Josef Uher, Jana Boháčová, Richard Kadeřábek

Abstract:

In this paper, we present a multi-robot imaging and irradiation research platform referred to as Irradion, with full capabilities of portable arbitrary-path computed tomography (CT). Irradion is an imaging and irradiation unit entirely based on robotic arms for research on cancer treatment with ion beams on small animals (mice or rats). The platform comprises two subsystems that combine several imaging modalities, such as 2D X-ray imaging, CT, and particle tracking, with precise positioning of a small animal for imaging and irradiation. Computed tomography: The CT subsystem of the Irradion platform is equipped with two 6-joint robotic arms that position a photon-counting detector and an X-ray tube independently and freely around the scanned specimen and allow image acquisition utilizing computed tomography. Irradion supports nearly all conventional 2D and 3D X-ray imaging trajectories with precisely calibrated and repeatable geometrical accuracy, leading to a spatial resolution of up to 50 µm. In addition, the photon-counting detectors allow X-ray photon energy discrimination, which can suppress scattered radiation, thus improving image contrast. They can also measure absorption spectra and recognize different material (tissue) types. X-ray video recording and real-time imaging options can be applied for studies of dynamic processes, including in vivo specimens. Moreover, Irradion opens the door to exploring new 2D and 3D X-ray imaging approaches, and we demonstrate in this publication various novel scan trajectories and their benefits. Proton imaging and particle tracking: The Irradion platform allows combining several imaging modules with any required number of robots. The proton tracking module comprises another two robots, each holding particle tracking detectors with position-, energy-, and time-sensitive Timepix3 sensors. Timepix3 detectors can track particles entering and exiting the specimen and allow accurate guiding of photon/ion beams for irradiation. 
In addition, quantifying the energy losses before and after the specimen brings essential information for precise irradiation planning and verification. Work on the small animal research platform Irradion involved advanced software and hardware development that will offer researchers a novel way to investigate new approaches in (i) radiotherapy, (ii) spectral CT, (iii) arbitrary-path CT, and (iv) particle tracking. The robotic platform for imaging and radiation research developed for the project is an entirely new product on the market: preclinical research systems combining precision robotic irradiation with photon/ion beams and multimodality high-resolution imaging do not currently exist. The researched technology can potentially deliver a significant leap forward compared to current first-generation devices.

Keywords: arbitrary path CT, robotic CT, modular, multi-robot, small animal imaging

Procedia PDF Downloads 83
20446 Satellite LiDAR-Based Digital Terrain Model Correction Using Gaussian Process Regression

Authors: Keisuke Takahata, Hiroshi Suetsugu

Abstract:

Forest height is an important parameter for forest biomass estimation, and precise elevation data are essential for accurate forest height estimation. Several digital elevation models (DEMs), such as SRTM and ASTER, are available globally or nationally. However, their accuracy is reported to be low, particularly in mountainous areas with closed canopies or steep slopes. Recently, spaceborne LiDAR missions, such as the Global Ecosystem Dynamics Investigation (GEDI), have started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy in these elevation products on their exact footprints, while it is not clear how this sparse information can be used over wider areas. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian Process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when airborne LiDAR-derived elevation information is used as ground truth. Our algorithm is also capable of quantifying the elevation data uncertainty, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, like MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
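The correction described, interpolating sparse DEM-minus-LiDAR differences with a GP and reporting both a predicted bias and its uncertainty, can be sketched with numpy only. The squared-exponential kernel, its length scale, and the footprint coordinates and bias values below are all assumptions for illustration; the abstract does not state the study's actual kernel or parameters:

```python
import numpy as np

def rbf(a, b, length=500.0, var=4.0):
    """Squared-exponential kernel on 1-D footprint coordinates (m)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_correct(x_obs, bias_obs, x_grid, noise=0.05):
    """Interpolate sparse DEM-minus-LiDAR elevation differences onto a
    grid, returning the predicted bias and its standard deviation
    (the uncertainty estimate the abstract mentions)."""
    K = rbf(x_obs, x_obs) + noise ** 2 * np.eye(len(x_obs))
    Ks = rbf(x_grid, x_obs)
    mean = Ks @ np.linalg.solve(K, bias_obs)
    cov_term = np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    sd = np.sqrt(np.clip(rbf(x_grid, x_grid).diagonal() - cov_term, 0.0, None))
    return mean, sd

# Hypothetical GEDI footprints every 400 m with a smooth DEM bias
x_obs = np.linspace(0.0, 3200.0, 9)
bias_obs = 3.0 + np.sin(x_obs / 1000.0)
x_grid = np.linspace(0.0, 3200.0, 33)
pred_bias, pred_sd = gp_correct(x_obs, bias_obs, x_grid)
# corrected DEM = original DEM minus pred_bias at each grid cell
```

Away from any footprint, `pred_sd` reverts to the prior standard deviation, which is exactly the behavior that lets the method flag where the corrected terrain model should not be trusted.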

Keywords: digital terrain model, satellite LiDAR, Gaussian processes, uncertainty quantification

Procedia PDF Downloads 166
20445 Comparison of the H-Index of Researchers of Google Scholar and Scopus

Authors: Adian Fatchur Rochim, Abdul Muis, Riri Fitri Sari

Abstract:

The H-index has been widely used as a performance indicator of researchers around the world, especially in Indonesia, where the government uses Scopus and Google Scholar as indexing references in providing recognition and appreciation. However, those two indexing services yield different H-index values. For that purpose, this paper evaluates the difference between the H-indexes from those services. Researchers indexed by Webometrics are used as the reference data in this paper; currently, Webometrics only uses the H-index from Google Scholar. This paper observed and compared the corresponding researchers' data from Scopus to get their H-index scores. Subsequently, some researchers with large differences in score were examined in more detail with respect to their papers' publishers. This paper shows that the H-index of researchers in Google Scholar is approximately 2.45 times their Scopus H-index. Most of the difference is due to uncertified publishers, which are considered by Google Scholar but not by Scopus.
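The comparison rests on the standard definition of the h-index, which can be computed directly from a list of per-paper citation counts. The citation counts below are hypothetical for a single researcher; the 2.45 factor reported above is an average across the Webometrics-indexed sample, not a per-researcher constant:

```python
def h_index(citations):
    """Largest h such that the researcher has h papers cited >= h times."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one researcher in the two services;
# Google Scholar typically lists more papers and more citations than Scopus.
scholar_citations = [25, 18, 12, 9, 7, 6, 4, 3, 1]
scopus_citations = [20, 10, 6, 3, 1]
ratio = h_index(scholar_citations) / h_index(scopus_citations)
```

With these counts the Google Scholar h-index is 6 and the Scopus h-index is 3, so extra indexed venues alone double the score; this is the mechanism the paper attributes to uncertified publishers.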

Keywords: Google Scholar, H-index, Scopus, performance indicator

Procedia PDF Downloads 263
20444 Construction of Ovarian Cancer-on-Chip Model by 3D Bioprinting and Microfluidic Techniques

Authors: Zakaria Baka, Halima Alem

Abstract:

Cancer is a major worldwide health problem that caused around ten million deaths in 2020. In addition, efforts to develop new anti-cancer drugs still face a high failure rate, partly due to the lack of preclinical models that recapitulate in-vivo drug responses. Indeed, the conventional cell culture approach (known as 2D cell culture) is far from reproducing the complex, dynamic, and three-dimensional environment of tumors. To set up more in-vivo-like cancer models, 3D bioprinting is a promising technology due to its ability to achieve 3D scaffolds containing different cell types with controlled distribution and precise architecture. Moreover, the introduction of microfluidic technology makes it possible to simulate in-vivo dynamic conditions through so-called "cancer-on-chip" platforms. Whereas several cancer types, such as lung cancer and breast cancer, have been modeled through the cancer-on-chip approach, only a few ovarian cancer models have been described. The aim of this work is to combine 3D bioprinting and microfluidic techniques to set up a 3D dynamic model of ovarian cancer. In the first phase, an alginate-gelatin hydrogel containing SKOV3 cells was used to achieve tumor-like structures through an extrusion-based bioprinter. The desired form of the tumor-like mass was first designed in 3D CAD software. The hydrogel composition was then optimized to ensure good and reproducible printability. Cell viability in the bioprinted structures was assessed using the Live/Dead assay and the WST1 assay. In the second phase, these bioprinted structures will be included in a microfluidic device that allows simultaneous testing of different drug concentrations. This microfluidic device was first designed through computational fluid dynamics (CFD) simulations to fix its precise dimensions. It was then manufactured through a molding method based on a 3D-printed template. 
To confirm the results of the CFD simulations, doxorubicin (DOX) solutions were perfused through the device, and the DOX concentration in each culture chamber was determined. Once fully characterized, this model will be used to assess the efficacy of anti-cancer nanoparticles developed at the Jean Lamour Institute.

Keywords: 3D bioprinting, ovarian cancer, cancer-on-chip models, microfluidic techniques

Procedia PDF Downloads 185
20443 Environmental and Economic Assessment of Yerba Mate as a Feed Additive for Feedlot Lambs

Authors: Danny Alexander R. Moreno, Gustavo L. Sartorello, Yuli Andrea P. Bermudez, Richard R. Lobo, Ives Claudio S. Bueno, Augusto H. Gameiro

Abstract:

Meat production is a significant sector of Brazil's economy; however, the agricultural segment has drawn criticism for its negative environmental impacts, which contribute to climate change. It is therefore essential to implement nutritional strategies that can improve the environmental performance of livestock. This research aimed to estimate the environmental impact and profitability of using yerba mate extract (Ilex paraguariensis) as an additive in the feeding of feedlot lambs. Thirty-six castrated male lambs (average weight of 23.90 ± 3.67 kg and average age of 75 days) were randomly assigned to four experimental diets with different levels of inclusion of yerba mate extract (0, 1, 2, and 4%) on a dry matter basis. The animals were confined for fifty-three days and fed a 60:40 corn silage to concentrate ratio. As an indicator of environmental impact, the carbon footprint (CF) was measured as kg of CO₂ equivalent (CO₂-eq) per kg of body weight produced (BWP). Greenhouse gas (GHG) emissions such as methane (CH₄) generated from enteric fermentation were calculated using the sulfur hexafluoride (SF₆) gas tracer technique, while the CH₄ and nitrous oxide (N₂O) emissions generated by feces and urine, and the carbon dioxide (CO₂) emissions generated by concentrate and silage processing, were estimated using the Intergovernmental Panel on Climate Change (IPCC) methodology. To estimate profitability, the gross margin was used, which is the total revenue minus the total cost; the latter is composed of the purchase of animals and feed. The boundaries of this study considered only the lamb fattening system. Enteric CH₄ emission from the lambs was the largest source of on-farm GHG emissions (47%-50%), followed by CH₄ and N₂O emissions from manure (10%-20%) and CO₂ emission from the concentrate, silage, and fossil energy (5%-17%). 
The treatment that generated the least environmental impact was the group with 4% yerba mate extract (YME), which showed a 3% reduction in total GHG emissions relative to the control (1462.5 and 1505.5 kg CO₂-eq, respectively). However, the scenario with 1% YME showed a 7% increase in emissions compared to the control group. With respect to CF, the treatment with 4% YME had the lowest value (4.1 kg CO₂-eq/kg LW) compared with the other groups. Nevertheless, although the 4% YME inclusion scenario showed the lowest CF, its gross margin decreased by 36% compared to the control group (0% YME), due to the cost of YME as a feed additive. The results showed that the extract has potential for use in reducing GHG emissions; however, the cost of implementing this input as a mitigation strategy increased the production cost. Therefore, it is important to develop policy strategies that help reduce the acquisition costs of inputs that contribute to the environmental and economic benefit of the livestock sector.
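The two headline indicators in this abstract, carbon footprint and gross margin, reduce to simple arithmetic. The sketch below uses the emission totals reported above; the revenue and cost figures in the gross-margin example are hypothetical placeholders, since the abstract does not report them:

```python
def carbon_footprint(total_ghg_kg_co2eq, body_weight_kg):
    """Carbon footprint as kg CO2-eq per kg of body weight produced."""
    return total_ghg_kg_co2eq / body_weight_kg

def gross_margin(revenue, animal_cost, feed_cost):
    """Total revenue minus total cost (animal purchase plus feed)."""
    return revenue - (animal_cost + feed_cost)

# Reported emission totals: control vs. 4% YME group (kg CO2-eq)
control_ghg, yme4_ghg = 1505.5, 1462.5
reduction = 1.0 - yme4_ghg / control_ghg  # about 3%, as reported
```

The same two functions make the study's trade-off explicit: the 4% YME diet lowers the first quantity but, through the extract's purchase cost, also lowers the second.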

Keywords: meat production, natural additives, profitability, sheep

Procedia PDF Downloads 126
20442 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experiences, especially in terms of communication reliability, high data rates, and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations, which seems to fulfill the demands of the future spectrum, and it is one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the International Mobile Telecommunication-Advanced (IMT-Advanced) mobile requirements (a 1 Gb/s peak data rate), 3GPP introduced the CA scheme, which sustains a high data rate using a widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, which shows the benefits of applying CA and low-complexity multi-band schedulers in terms of service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (at a PLR threshold of 2%).

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 186
20441 Power Recovery in Egyptian Natural Gas Pressure Reduction Stations Using Turboexpander Systems

Authors: Kamel A. Elshorbagy, Mohamed A. Hussein, Rola S. Afify

Abstract:

Natural gas pressure reduction is typically achieved using pressure-reducing valves, where isenthalpic expansion takes place and a considerable amount of energy is wasted in an irreversible throttling process. Replacing the gas-throttling process with an expansion process in a turboexpander (TE) converts the pressure of the natural gas into mechanical energy transmitted to a loading device (i.e., an electric generator). This paper investigates the performance of a turboexpander system for power recovery at natural gas pressure reduction stations. There is a considerable temperature drop associated with the turboexpander process, so preheating, using gas-fired boilers, is essential to avoid the undesirable effects of a low outlet temperature. Various system configurations were simulated with the general flowsheet simulator HYSYS, and factors affecting the overall performance of the systems were investigated. Power outputs and fuel requirements were found using typical gas flow variation data. The simulation was performed for two case studies in which real input data are used: a domestic (commercial) and an industrial natural gas pressure reduction station in Egypt. Economic studies of using the turboexpander system in both stations were conducted using precise data obtained through communication with several companies working in this field. The results of the economic analysis for the two case studies prove that using turboexpander systems in Egyptian natural gas reduction stations can be a successful energy conservation project.
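The recoverable shaft power and the temperature drop that motivates preheating both follow from the standard near-isentropic expansion relation. The sketch below uses indicative natural gas properties (cp, gamma) and an assumed isentropic efficiency; these are illustrative values, not figures from the HYSYS case studies:

```python
def expander_power_kw(m_dot, t_in, p_in, p_out, cp=2.34, gamma=1.32, eta=0.80):
    """Shaft power (kW) recovered by a turboexpander.

    m_dot: mass flow (kg/s); t_in: inlet temperature (K);
    p_in, p_out: absolute pressures (any consistent unit);
    cp in kJ/(kg.K); gamma and eta are indicative assumptions."""
    exponent = (gamma - 1.0) / gamma
    ideal_enthalpy_drop = cp * t_in * (1.0 - (p_out / p_in) ** exponent)
    return eta * m_dot * ideal_enthalpy_drop

def outlet_temperature_k(t_in, p_in, p_out, gamma=1.32):
    """Ideal (isentropic) outlet temperature, showing why preheating
    upstream of the expander is essential."""
    return t_in * (p_out / p_in) ** ((gamma - 1.0) / gamma)

# Illustrative station: 10 kg/s expanded from 40 bar to 10 bar at 300 K
power = expander_power_kw(m_dot=10.0, t_in=300.0, p_in=40.0, p_out=10.0)
t_out = outlet_temperature_k(300.0, 40.0, 10.0)  # far below 0 degrees C
```

The ideal outlet temperature for this 4:1 expansion falls to roughly 214 K, which is the low-outlet-temperature problem the gas-fired preheaters in the paper are sized to prevent.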

Keywords: natural gas, power recovery, reduction stations, turboexpander systems

Procedia PDF Downloads 309
20440 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications

Authors: K. P. Sandesh, M. H. Suman

Abstract:

Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
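The abstract does not give the measure's exact formula, so the sketch below is only one plausible instantiation of the three-case logic: a feature present in both documents contributes by closeness of its weights, a feature present in only one contributes nothing, and a feature absent from both contributes a small constant. The `absent_reward` value is an assumption for illustration:

```python
def feature_similarity(a, b, absent_reward=0.5):
    """Similarity contribution of one feature (e.g. a term-weight pair).
    Case 1: present in both  -> closeness of the two weights.
    Case 2: present in one   -> zero contribution.
    Case 3: absent from both -> a small constant (joint absence is
            weak evidence of similarity)."""
    if a > 0 and b > 0:
        return 1.0 - abs(a - b) / max(a, b)
    if a == 0 and b == 0:
        return absent_reward
    return 0.0

def doc_similarity(u, v):
    """Average per-feature contributions over a shared vocabulary."""
    return sum(feature_similarity(a, b) for a, b in zip(u, v)) / len(u)

# e.g. doc_similarity([1, 0, 2], [1, 0, 0]) averages 1.0, 0.5, and 0.0
score = doc_similarity([1, 0, 2], [1, 0, 0])
```

Unlike cosine similarity, such a measure distinguishes case 3 from case 2, which is the property the abstract highlights.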

Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms

Procedia PDF Downloads 512
20439 Developing Pandi-Tekki to Tourism Destination in Tanglang, Billiri Local Government Area, Gombe State, Nigeria

Authors: Sanusi Abubakar Sadiq

Abstract:

Despite the significance of tourism as a key revenue earner and employment generator, it is still disregarded in many areas, and the prospects of existing resources that could boost development in communities and regions are underused. This study was carried out with the view of developing Pandi-Tekki in Tanglang, Billiri Local Government Area, as a tourism destination. It was primarily aimed at identifying features of Pandi-Tekki that could be developed into a tourism attraction and suggesting ways of developing the prospective site into a tourism destination, as well as exploring its possible contribution to the tourism sector in Gombe State. Literature was reviewed based on relevant published materials. Data were collected through qualitative and quantitative methods, including personal observation and a structured questionnaire, and analyzed using the Statistical Package for the Social Sciences (SPSS) software. Results based on the data collected show that Pandi-Tekki has potential that can be developed into an attraction. The results also show that the local community perceives tourism as a good development that will open it up to the wider world and generate revenue to stimulate the local economy. Conclusions were drawn based on the findings of the analysis carried out in this research: Pandi-Tekki can be developed as a tourism destination, and there would be great success in achieving the aims and objectives of the development. Recommendations were therefore made on creating awareness of the need to develop Pandi-Tekki as a tourism destination and on the need for the government to provide tourism facilities at the destination, since it is a public outfit.

Keywords: attraction, destination, developing, features

Procedia PDF Downloads 275
20438 Isotretinoin and Psychiatric Adverse Events: A Review of the Evidence

Authors: Thodoris Tsagkaris, Marios Stavropoulos, Panagiotis Theodosis-Nobelos, Charalampos Triantis

Abstract:

Isotretinoin is a widely used therapeutic for the treatment of acne vulgaris and various other skin disorders. However, since its approval, many side effects and contraindications have been described, some particularly serious, such as teratogenicity, liver disease, and dermal deterioration. Most notably, isotretinoin has been linked with psychiatric symptoms like depression, suicidal ideation, schizophrenia, and characteristics of hypervitaminosis A syndrome. These adverse effects have raised significant concerns regarding the safety of isotretinoin. Numerous studies have associated isotretinoin with adverse effects on the mental health of patients and have proposed plausible mechanisms for this suspected causative relationship. However, the evidence is still contradictory and the data dispersed, limiting their validity. Thus, in the present study, we aim to further analyze the available literature and present a complete analysis of the side effects of isotretinoin, with particular emphasis on the effects it may have on the mental health of patients. The review is based on international articles from broad scientific electronic databases such as PubMed and Scopus. It concludes that although many studies have associated isotretinoin with mental effects like depression, bipolar disorder, schizophrenia, and suicidal ideation, the data are still insufficient and often contradictory. Additional studies with accurate data, larger double-blinded samples, and more analytical systematic reviews are required. It is especially important to identify the doses, dosing intervals, and treatment durations at which isotretinoin may cause mental health problems, as well as the role that the patient's medical and pharmaceutical history may play.

Keywords: acne, depression, isotretinoin, mental health

Procedia PDF Downloads 149
20437 Predictive Analytics of Bike Sharing Rider Parameters

Authors: Bongs Lainjo

Abstract:

The evolution and escalation of bike-sharing programs (BSP) continue unabated. Since the sixties, many countries have introduced different models and strategies of BSP, with variations ranging from dockless models to electronic real-time monitoring systems. A bike-sharing system is a computer-controlled system in which individuals can borrow bikes, for a fee or for free, for a limited period; reasons for using a BSP include recreation, errands, work, and more, and there is every indication that more complex and innovative rider-friendly systems are yet to be introduced. Although this trend has resulted in many studies on public cycling systems, there have been few previous studies on the factors influencing public bicycle travel behavior. The objective of this paper is to analyze the current variables established by different operators and streamline them, identifying the most compelling ones using analytics. Given the contents of available databases, there is a lack of uniformity and no common standard on what is required and what is not. Two factors appear to be common: user type (registered or unregistered) and the duration of each trip. This article uses historical data provided by one operator based in the greater Washington, District of Columbia, USA area. Several variables, including categorical and continuous data types, were screened; eight out of 18 were considered acceptable and contributed significantly to a useful and reliable predictive model. This study has identified unprecedented, useful, and pragmatic parameters required to improve BSP ridership dynamics.
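The screening step, paring 18 candidate variables down to the most compelling ones, can be sketched as a simple correlation filter against trip duration. The variable names, values, and threshold below are hypothetical illustrations, not taken from the Washington, D.C. data set, and the paper's actual screening criteria may differ:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def screen_variables(features, target, threshold=0.3):
    """Keep candidate variables whose |correlation| with the target
    (here, trip duration) reaches the threshold."""
    return [name for name, column in features.items()
            if abs(pearson(column, target)) >= threshold]

trip_duration = [5, 12, 30, 8, 22]  # minutes, hypothetical rides
candidates = {
    "distance_km": [1, 3, 8, 2, 6],      # strongly related to duration
    "day_of_month": [3, 14, 7, 21, 28],  # essentially noise
}
kept = screen_variables(candidates, trip_duration)
```

Categorical variables such as user type would need encoding (or a different association statistic) before entering such a filter, which is one reason screening mixed-type ride data is nontrivial.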

Keywords: sharing program, historical data, parameters, ridership dynamics, trip duration

Procedia PDF Downloads 125
20436 Teachers' and Learners' Perceptions of the Impact of Different Test Procedures on Reading: A Case Study

Authors: Bahloul Amel

Abstract:

The main aim of this research was to investigate the perspectives of English language teachers and learners on the effect of test techniques on reading comprehension, test performance, and assessment. The research also aimed at finding the differences between teacher and learner perspectives, specifying the test techniques with the greatest effect, investigating other factors affecting reading comprehension, and comparing the results with similar studies. In order to achieve these objectives, the perspectives and findings of different researchers were reviewed, two questionnaires were prepared to collect data on the perspectives of teachers and learners, the questionnaires were administered to 26 learners and 8 teachers from the University of Batna (Algeria), and quantitative and qualitative analyses of the results were carried out. The results and their analysis show that different test techniques affect reading comprehension, test performance, and assessment at different rates.

Keywords: reading comprehension, reading assessment, test performance, test techniques

Procedia PDF Downloads 446
20435 Regional Low Gravity Anomalies Influencing High Concentrations of Heavy Minerals on Placer Deposits

Authors: T. B. Karu Jayasundara

Abstract:

Regions of low gravity and gravity anomalies both influence heavy mineral concentrations in placer deposits. Economically important heavy minerals are likely to show higher levels of deposition in low-gravity regions of placer deposits. This can be seen in the coastal regions of southern Asia, particularly Sri Lanka and peninsular India, areas located in the lowest-gravity region of the world. About 70 kilometers of the east coast of Sri Lanka are covered by a high percentage of ilmenite deposits, and the southwest coast of the island contains a monazite placer deposit; these are among the largest placer deposits in the world. In India, the heavy mineral industry has a good market. On the other hand, based on the coastal placer deposits recorded, the high-gravity region located around Papua New Guinea has no such heavy mineral deposits. In low-gravity regions, with the help of other depositional environmental factors, the grains have more time and space to float in the sea, which helps bring high concentrations of heavy mineral deposits to the coast. The effect of low and high gravity can be demonstrated using heavy mineral separation devices. The Wilfley heavy mineral separating table is one of these; it is extensively used in industry and in laboratories for heavy mineral separation. The horizontally oscillating Wilfley table helps to separate heavy and light mineral grains into different fractions with the use of water. In this experiment, the low and high angles of the Wilfley table represent low and high gravity, respectively. A sample mixture of heavy and light mineral grains of grain size <0.85 mm was used. The high and low angles of the table were 60 and 20, respectively. The separated fractions from the table were again separated into heavy and light minerals with the use of a heavy liquid with a specific gravity of 2.85. 
The fractions of separated heavy and light minerals were used to draw two-dimensional graphs. The graphs show that the low-gravity stage collects a higher percentage of heavy minerals in the upper area of the table than the high-gravity stage does. The results of the experiment can be used to compare heavy mineral levels between regional low-gravity and high-gravity settings. If there are any heavy mineral deposits in the high-gravity regions, these deposits will occur far away from the coast, within the continental shelf.

Keywords: anomaly, gravity, influence, mineral

Procedia PDF Downloads 190