Search results for: open source data
28886 Azolla Pinnata as Promising Source for Animal Feed in India: An Experimental Study to Evaluate the Nutrient Enhancement Result of Feed
Authors: Roshni Raha, Karthikeyan S.
Abstract:
India has the world's largest livestock population. Existing strategies must be modified to increase the production of livestock and their by-products in order to meet the demands of a growing human population. Even though India leads the world in both milk production and cattle numbers, average productivity per animal remains low. This may be due to poor animal nutrition caused by a chronic shortage of high-quality fodder and feed. This article explores Azolla pinnata as a promising source of high-quality unconventional feed and fodder for effective livestock production and good-quality breeding in India. The article is an exploratory study combining a literature survey with experimental analysis. In the realm of agri-biotechnology, Azolla sp. has gained attention for helping farmers achieve sustainability, requiring minimal land, and serving as a feed element that does not compete with human food sources. Its high methionine content makes it a good protein source, and it is easily digested because its lignin content is low. It is also rich in antioxidants and in vitamins such as beta-carotene, vitamin A, and vitamin B12. Building on this, the paper aims to investigate and develop a model for using azolla as a novel, high-potential feed source to combat the problems of low production and poor animal quality in India. A representative sample of animal feed supplemented with azolla is collected and ground into a fine powder using a mortar. PITC (phenylisothiocyanate) is added to derivatize the amino acids. The sample is then analyzed by HPLC (High-Performance Liquid Chromatography) to measure the amino acids and monitor the protein content of the feed. The amino acid measurements from HPLC are converted to milligrams per gram of protein through a set of amino acid profiling calculations.
The resulting amino acid profile is used to validate the proximate results of nutrient enhancement from the azolla in the sample. Based on the proximate composition of azolla meal, the enhancement results were higher than the standard values of normal fodder supplements, indicating a feed much richer and denser in nutrients. Thus, the azolla-fed sample proved to be a promising source of animal fodder. This would in turn lead to higher production and better animal breeds, helping to meet the economic demands of the growing Indian population. Azolla has no side effects and can be considered safe and effective for inclusion in animal feed. One area of future research could begin with an upstream scaling strategy for azolla in India, introducing several bioreactor types for its commercial production. Since Azolla sp. is shown in this paper to be a promising source of high-quality animal feed and fodder, large-scale production of azolla would make the process quicker, more efficient, and more accessible. Labor expenses would also be reduced by employing bioreactors for large-scale manufacturing.
Keywords: azolla, fodder, nutrient, protein
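The unit conversion mentioned in the abstract can be sketched as follows; the molar masses and sample figures below are illustrative assumptions, not values from the study:

```python
# Hypothetical sketch: converting an HPLC amino acid measurement (nmol in the
# injected sample) to mg of amino acid per gram of protein.
MOLAR_MASS = {"methionine": 149.21, "lysine": 146.19}  # g/mol, free amino acids

def mg_per_g_protein(nmol_detected, molar_mass, protein_mg):
    """Convert nmol of amino acid in the analyzed sample to mg per g protein."""
    mg_amino_acid = nmol_detected * 1e-9 * molar_mass * 1e3  # nmol -> mol -> g -> mg
    return mg_amino_acid / (protein_mg / 1000.0)             # per gram of protein

# e.g. 500 nmol methionine detected in a sample containing 10 mg protein
value = mg_per_g_protein(500, MOLAR_MASS["methionine"], 10)
```

The study's actual profiling calculations may differ; this only shows the dimensional bookkeeping involved.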
Procedia PDF Downloads 55
28885 Static Analysis of Security Issues of the Python Packages Ecosystem
Authors: Adam Gorine, Faten Spondon
Abstract:
Python is considered the most popular programming language and offers its own ecosystem for archiving and maintaining open-source software packages, the Python Package Index (PyPI), the repository for this programming language. Unfortunately, one-third of these software packages have vulnerabilities that allow attackers to execute code automatically when a vulnerable or malicious package is installed. This paper contributes to large-scale empirical studies investigating security issues in the Python ecosystem by evaluating package vulnerabilities. The findings carry a series of implications that can help secure software ecosystems by improving the process of discovering, fixing, and managing package vulnerabilities. The vulnerability dataset was generated from the NVD, the National Vulnerability Database, and the Snyk vulnerability database. We evaluated 807 vulnerability reports in the NVD and 3,900 publicly known security vulnerabilities in the Python Package Manager (pip) from the Snyk database, covering 2002 to 2022. Most reported Python vulnerabilities were of high severity, followed by medium severity, and the most problematic areas were improper input validation and denial-of-service attacks. A hybrid scanning tool that combines three scanners, Bandit, Snyk, and Dlint, to produce a clear report of code vulnerabilities is also described.
Keywords: Python vulnerabilities, Bandit, Snyk, Dlint, Python Package Index, ecosystem, static analysis, malicious attacks
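As a rough illustration of how a hybrid scanner might merge per-tool results, the sketch below dedupes findings and ranks them by severity; the finding schema and severity ordering are assumptions rather than the real output formats of Bandit, Snyk, or Dlint:

```python
# Assumed ordering of severities for the merged report.
SEVERITY_RANK = {"HIGH": 0, "MEDIUM": 1, "LOW": 2}

def merge_findings(*scanner_reports):
    """Combine per-scanner finding lists, deduped by (file, line, rule)."""
    seen, merged = set(), []
    for report in scanner_reports:
        for f in report:
            key = (f["file"], f["line"], f["rule"])
            if key not in seen:
                seen.add(key)
                merged.append(f)
    # Highest-severity findings first.
    return sorted(merged, key=lambda f: SEVERITY_RANK[f["severity"]])

bandit_report = [{"file": "app.py", "line": 3, "rule": "B602", "severity": "HIGH"}]
dlint_report = [
    {"file": "app.py", "line": 3, "rule": "B602", "severity": "HIGH"},    # duplicate
    {"file": "app.py", "line": 9, "rule": "DUO138", "severity": "MEDIUM"},
]
report = merge_findings(bandit_report, dlint_report)
```

A real tool would parse each scanner's JSON output; the merge-and-rank step is the part sketched here.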
Procedia PDF Downloads 139
28884 Feasibility of Simulating External Vehicle Aerodynamics Using Spalart-Allmaras Turbulence Model with Adjoint Method in OpenFOAM and Fluent
Authors: Arpit Panwar, Arvind Deshpande
Abstract:
A study of external vehicle aerodynamics using the Spalart-Allmaras turbulence model with the adjoint method was conducted, considering the accessibility and ease of working with the Fluent module of ANSYS and with OpenFOAM. The objective was to understand and analyze the possibility of bringing high-level aerodynamic simulation to the average consumer vehicle. A form factor of the BMW M6 was designed in SolidWorks and analyzed in OpenFOAM and Fluent. Being a one-equation model, Spalart-Allmaras converges much faster when combined with the adjoint method. Fluent, despite being commercial software, still does not allow solving the Spalart-Allmaras model with the adjoint method, so in Fluent the model was solved using the SIMPLE method. OpenFOAM, being open source, provides flexibility in simulation but is not user-friendly; it does support solving the chosen turbulence model with the adjoint method. The simulations produced acceptable drag values when validated against the percentage error in drag reported for a notch-back vehicle model in an extensive simulation presented at the 6th ANSA and μETA conference, Greece. The success of this approach would allow more aerodynamic body design across all automobile segments rather than limiting it to high-end sports cars.
Keywords: Spalart-Allmaras turbulence model, OpenFOAM, adjoint method, SIMPLE method, vehicle aerodynamic design
Procedia PDF Downloads 200
28883 Design of an Automatic Bovine Feeding Machine
Authors: Huseyin A. Yavasoglu, Yusuf Ziya Tengiz, Ali Göksenli
Abstract:
In this study, an automatic feeding machine for different types and classes of bovine animals is designed. The daily nutrition of a bovine consists of grass, corn, straw, silage, oats, wheat, and various vitamins and minerals. The amount and mix of each nutrient depend on several parameters of the animal: age, sex, weight, and maternity status, as well as the outside temperature. The problem on a farm is preparing the correct mixture and amount of nutrition for each animal; faulty nutrition results in insufficient feeding and an unhealthy animal. To solve this problem, a new automatic feeding machine is designed. The machine travels on four wheels and is pulled by a tractor. The carrier consists of eight bins, each holding one nutrient type with a capacity of 250 kg. A sensor at the bottom of each chamber measures the weight of the feed inside, and a funnel at the bottom of each chamber opens and closes under the control of a valve. Each animal carries an RFID ear tag with its ID. A receiver on the feeding machine reads this ID and, using information previously entered by the operator (veterinarian), the system determines the amount of each nutrient to be given to the selected animal. Each bin opens its exit gate via its valve under the control of a PLC (Programmable Logic Controller), and the amount of each nutrient is controlled by measuring the open/close time. The exit channels of the bins feed into a common reservoir. To achieve a homogeneous mixture, the collected feed is blended by a worm gear and then conveyed through a funnel to the animal's feeding unit. The feeding process can be completed in 100 seconds, after which the tractor pulls the machine to the next animal.
With the help of this system, animals can be fed the right amount and mixture of nutrients.
Keywords: bovine, feeding, nutrition, transportation, automatic
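The open/close dosing logic described above might be sketched as follows; the discharge rates and ration values are hypothetical, and the RFID lookup and PLC I/O are stubbed out:

```python
# Assumed discharge rate of each bin's valve, in kg per second.
DISCHARGE_RATE = {"corn": 2.5, "silage": 4.0}

def valve_open_time(ration):
    """ration: dict of nutrient -> kg to dispense; returns seconds each valve stays open."""
    return {nutrient: kg / DISCHARGE_RATE[nutrient]
            for nutrient, kg in ration.items()}

# Ration looked up (hypothetically) for one animal's RFID tag:
times = valve_open_time({"corn": 5.0, "silage": 12.0})
```

The PLC would hold each valve open for the computed duration and cross-check the dispensed mass against the bin's weight sensor.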
Procedia PDF Downloads 342
28882 Ground Source Ventilation and Solar PV Towards a Zero-Carbon House in Riyadh
Authors: Osamah S. Alanazi, Mohammad G. Kotbi, Mohammed O. AlFadil
Abstract:
Renewable energy technology is developing in Saudi Arabia, and the ambitious 2030 vision encourages the shift toward more efficient and clean energy use. Research on the application of geothermal resources in residential use for the Saudi Arabian context will contribute toward a more sustainable environment. This paper is part of an ongoing master's thesis whose main goal is to investigate the possibility of achieving a zero-carbon house in Riyadh by adding a ground-coupled system to a current sustainable house that uses a grid-tied solar system. The current house was built and designed by King Saud University for the 2018 Middle East Solar Decathlon competition but failed to reach zero-carbon operation due to its high cooling demand. This study will redesign and validate the house using Revit and Carrier's Hourly Analysis Program (HAP) software together with ordinary least squares (OLS) regression. A ground-source ventilation system will then be designed using the GCV Tool to reduce cooling loads, and the resulting electrical loads will be compared with those of the current house. Finally, a simple economic analysis that includes the cost of applying a ground-source system will be reported. The findings will indicate the possibility and feasibility of reaching a zero-carbon house in Riyadh, Saudi Arabia, using a ground-coupled ventilation system. As cooling in the residential sector is the dominant energy consumer in the Gulf region, this work will help in moving toward using renewable sources to meet those demands. This paper is limited to highlighting the literature review, the methodology of the research, and the expected outcomes.
Keywords: renewable energy, zero-carbon houses, sustainable buildings, geothermal energy, solar PV, GCV Tool
Procedia PDF Downloads 182
28881 Business Intelligence for Profiling of Telecommunication Customer
Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro
Abstract:
Business intelligence is a methodology that systematically exploits data to produce information and knowledge and can support the decision-making process. Two methods in business intelligence are the data warehouse and data mining. A data warehouse can store historical data derived from transactional data; for data modeling in the data warehouse, we apply Kimball's dimensional modeling. Data mining is used to extract patterns from the data and gain insight from it, and among its many techniques is segmentation. For profiling telecommunication customers, we segment customers according to their usage of services, their invoices, and their payments. Customers can thus be grouped by their characteristics, and profitable customers can be identified. We apply the k-means clustering algorithm for segmentation, with the RFM (Recency, Frequency, Monetary) model as its input variables. All data mining processes were implemented using IBM SPSS Modeler.
Keywords: business intelligence, customer segmentation, data warehouse, data mining
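A minimal sketch of k-means segmentation on RFM variables, using made-up customer tuples (the paper itself used IBM SPSS Modeler, not hand-rolled code):

```python
# Plain Lloyd's algorithm on tuples; returns final centroids and clusters.
def kmeans(points, centroids, iters=20):
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # Assign each point to the nearest centroid (squared Euclidean distance).
            nearest = min(range(len(centroids)),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster.
        centroids = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                     else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# RFM tuples: (recency in days, frequency, monetary value)
customers = [(5, 40, 900), (7, 35, 850), (60, 2, 30), (75, 1, 20)]
centroids, clusters = kmeans(customers, centroids=[customers[0], customers[2]])
```

Here the first cluster collects the recent, frequent, high-spending (profitable) customers and the second the lapsed ones; in practice the RFM features would be scaled before clustering.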
Procedia PDF Downloads 483
28880 Looking for a Connection between Oceanic Regions with Trends in Evaporation with Continental Ones with Trends in Precipitation through a Lagrangian Approach
Authors: Raquel Nieto, Marta Vázquez, Anita Drumond, Luis Gimeno
Abstract:
One of the hot spots of climate change is the increase in ocean evaporation. The best estimate of evaporation, the OAFlux dataset, shows strongly increasing trends in evaporation from the oceans since 1978, with peaks during the hemispheric winter and the strongest signals along the paths of the global western boundary currents and over inner seas. The transport of moisture from oceanic sources to the continents is the connection between evaporation from the ocean and precipitation over the continents. A key question is to relate the evaporative source regions over the oceans where trends have occurred in recent decades to their sinks over the continents, to check whether there have also been trends in the amount or characteristics of precipitation. A Lagrangian approach based on FLEXPART and ERA-Interim data is used to establish this connection over the period 1980 to 2012. Results show no general pattern, but significant agreement was found in important areas of climatic interest.
Keywords: ocean evaporation, Lagrangian approaches, continental precipitation, Europe
Procedia PDF Downloads 256
28879 A Structure-Based Approach for Adaptable Building System
Authors: Alireza Taghdiri, Sara Ghanbarzade Ghomi
Abstract:
Existing buildings are permanently subject to change, being continuously renovated and repaired over their long service lives. Old buildings are demolished and their materials and components recycled or reused in constructing new ones. In this process, the importance of sustainability principles for building construction is well known, and great significance must be attached to the consumption of resources, the resulting effects on the environment, and economic costs. Strategies that extend a building's service life and delay demolition have a positive effect on environmental protection. In addition, easier alteration or expansion of building structures and reduced consumption of energy and natural resources benefit users, producers, and the environment. To address these problems, the structural components of some conventional building systems were analyzed by applying theories of open building, and a new geometry-adaptive building system was developed that can transform and support different imposed loads. To achieve this goal, various research methods and tools were applied, including review of the professional and scientific literature, comparative analysis, case study, and computer simulation, with data interpretation by descriptive statistics and logical argument. The hypothesis and proposed strategies were thereby evaluated, and an adaptable and reusable two-dimensional building system was presented that can respond appropriately to the needs of dwellers and end-users and allows the reuse of its structural components in new construction or new functions.
Investigations showed that this incremental building system can be successfully applied to achieve the architectural design objectives, and that with small modifications to components and joints it is easy to obtain different, adaptable, load-optimized component alternatives for flexible spaces.
Keywords: adaptability, durability, open building, service life, structural building system
Procedia PDF Downloads 580
28878 An Era of Arts: Examining Intersection of Technology and Museums
Authors: Vivian Li
Abstract:
With the rapid development of technology, virtual reality (VR) and augmented reality (AR) are becoming increasingly prominent in our lives. Museums have led the way in digitization, offering their collections to the wider public through the open internet, which is dramatically changing our experience of art. Technology is also being implemented into our physical art-viewing experience, enabling museums to capture historical sites while creating a more immersive experience for patrons. This study takes a qualitative approach, examining secondary sources and synthesizing information from interviews with field professionals to answer the question: to what extent is the contemporary perception of art transformed by the digitization of art museums? The findings establish that museums are becoming increasingly open with their collections, utilizing digitization to spread their intellectual content to people worldwide and to diversify their audiences. The use of VR and AR is also enabling museums to preserve and showcase historical artifacts and sites in a more interactive and user-focused way. Technology is also crafting new forms of art and art museums. Ultimately, the intersection of technology and museums is not changing the definition of art but rather offering new modes for the public to experience and learn about arts and history.
Keywords: art, augmented reality, digitization, museums, technology, virtual reality
Procedia PDF Downloads 127
28877 Machine Learning in Agriculture: A Brief Review
Authors: Aishi Kundu, Elhan Raza
Abstract:
"Necessity is the mother of invention": the rapid increase in the global human population has directed the agricultural domain toward machine learning. Food, the basic need of human beings, is satisfied through farming, and farming is one of the major revenue generators for the Indian economy. Agriculture is thus both a source of employment and a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in applying machine learning in the agricultural sector. Accurate and timely predictions are necessary to boost production and to aid the systematic distribution of agricultural commodities, making them available in the market faster and more effectively. The paper includes a thorough analysis of machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.). Because climate change affects crop production, machine learning can analyze the changing patterns and suggest approaches that minimize loss and maximize yield. Machine learning algorithms and models (regression, support vector machines, Bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze and predict specific outcomes, which can be vital in increasing the productivity of the agricultural food industry. The review also illustrates how machine learning is applied to sensor data in agricultural work. Machine learning is an evolving technology that helps farmers improve gains in agriculture and minimize losses, and this paper discusses how irrigation and farm management systems can operate efficiently in real time.
Artificial intelligence (AI)-enabled programs are emerging that support farmers through extensive examination of data.
Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting
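As a toy instance of the regression approach the review mentions, the sketch below fits rainfall to crop yield by ordinary least squares; the data points are invented for demonstration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for a single predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented sample: seasonal rainfall (mm) vs. crop yield (t/ha).
rainfall_mm = [400, 500, 600, 700]
yield_t_ha = [2.0, 2.5, 3.0, 3.5]
m, b = fit_line(rainfall_mm, yield_t_ha)
predicted = m * 550 + b  # predicted yield at 550 mm of rainfall
```

Real yield models would use many more features (soil, temperature, crop variety) and nonlinear models such as the decision trees and neural networks listed above.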
Procedia PDF Downloads 105
28876 Investigating the Effects of Hydrogen on Wet Cement for Underground Hydrogen Storage Applications in Oil and Gas Wells
Authors: Hamoud Al-Hadrami, Hossein Emadi, Athar Hussain
Abstract:
Green hydrogen is quickly emerging as a new source of renewable energy for the world. Hydrogen production by water electrolysis is deemed an environmentally friendly and safe source of energy for transportation and other industries. However, storing large volumes of hydrogen is a significant challenge. Abandoned hydrocarbon reservoirs are considered viable hydrogen storage options because the required infrastructure, such as wells and surface facilities, is already available; however, long-term wellbore integrity in these wells could be a serious challenge. Hydrogen reduces the compressive strength of set cement if it comes into contact with the cement slurry. Mixing hydrogen with cement slurry also slightly increases the slurry's density and rheological properties, which must be accounted for to achieve a successful primary cementing operation.
Keywords: hydrogen, wellbore integrity, clean energy, cementing
Procedia PDF Downloads 214
28875 Promoting Couple HIV Testing among Migrants for HIV Prevention: Learnings from Integrated Counselling and Testing Centre (ICTC) in Odisha, India
Authors: Sunil Mekale, Debasish Chowdhury, Sanchita Patnaik, Amitav Das, Ashok Agarwal
Abstract:
Background: Odisha is a low-HIV-prevalence state in India (ANC-HIV positivity of 0.42% as per HIV sentinel surveillance 2010-2011); however, it is an important source-migration state, with 3.2% of male migrants reported to be PLHIV. The USAID-Public Health Foundation of India PIPPSE project is piloting a source-destination corridor programme between Odisha and Gujarat. In Odisha, the focus has been on developing a comprehensive strategy to reach out-migrants and their spouses in their places of origin, based on their availability. The project has made concerted attempts to identify vulnerable districts with high out-migration and high positivity rates. Description: 48 of 97 ICTCs were selected through multistage sampling from the nine districts with the highest out-migration. A retrospective descriptive analysis of HIV-positive male migrants and their spouses over two years (April 2013-March 2015) was conducted on a total of 3,645 HIV-positive records. Findings: Among the 34.2% detected HIV positive in the ICTCs, 23.3% were male migrants and 11% were spouses of male migrants; together they made up almost 50% of total ICTC attendees. More than 70% of the PLHIV male migrants and their spouses were under 45 years old. Conclusions: A couple HIV-testing approach may be considered for male migrants and their spouses, and ICTC data analysis could help identify locations with high HIV positivity among them.
Keywords: HIV testing, migrants, spouses of migrants, Integrated Counselling and Testing Centre (ICTC)
Procedia PDF Downloads 379
28874 A Study on 5-11 Year-Old Children's Level of Knowledge about Personal Safety and Protection from Social Dangers
Authors: Özden Kuşcu, Yağmur Kuşcu, Zeynep Çetintaş, S. Sunay Yildirim Doğru
Abstract:
The purpose of this work is to evaluate the effect of the subjects "personal safety" and "protection from dangers", included in the primary school curriculum, on students' levels of knowledge about safety and protection from social dangers. The study group included 469 students between 5 and 11 years old (231 preschoolers and 238 primary school students), together with their parents and teachers. Data were collected with a "Personal Safety Interview Form" for children, a "Parent Interview Form", and a "Teacher Interview Form", each containing 15 open-ended questions about personal safety. The researchers collected the research data through one-on-one interviews with children. Results revealed that preschoolers and 1st, 2nd, and 3rd graders did not know their home addresses and telephone numbers, and their families were unaware of this. The study also showed that those who did have this information were unsure with whom it should be shared. Accordingly, more should be done to increase the knowledge of preschoolers and 1st, 2nd, and 3rd graders about personal safety and protection from dangers.
Keywords: security, social danger, elementary school, preschool
Procedia PDF Downloads 457
28873 Effect of Perioperative Protocol of Care on Clinical Outcomes among Patients Undergoing Coronary Artery Bypass Graft
Authors: Manal Ahmed, Amal Shehata, Shereen Deeb
Abstract:
The study's purpose was to determine the effect of a perioperative protocol of care on clinical outcomes among patients undergoing coronary artery bypass graft (CABG). Subjects: A sample of 100 adult patients scheduled for CABG was selected and divided alternately and randomly into two equal groups (50 study, 50 control). The study was carried out at the National Heart Institute in Cairo and the open-heart surgical intensive care unit at Shebin El-Kom Teaching Hospital. Instruments: Four instruments were used for data collection: an interviewing questionnaire, a dyspnea analogue scale, a biophysiological measurement instrument, and a compliance assessment sheet. Results: There were statistically significant differences between the groups in most respiratory assessment findings at discharge. More than two-thirds of the study group showed continuous and regular commitment to the diet regimen, which ranked first, followed by compliance with daily living activities and then quitting smoking. Conclusions: The perioperative protocol of care significantly improved respiratory findings, dyspnea degree, duration of mechanical ventilation, length of hospital stay, and compliance with diet, the therapeutic regimen, daily living activities, and smoking cessation in the study group undergoing CABG. Recommendations: The perioperative protocol of care should be carried out for CABG patients in open-heart surgical units, and an illustrated colored booklet about CAD, CABG, and perioperative care should be available and distributed to all CABG patients.
Keywords: perioperative, effect, clinical outcomes, coronary artery bypass graft, protocol of care
Procedia PDF Downloads 139
28872 Recovery of the Demolition and Construction Waste, Casablanca (Morocco)
Authors: Morsli Mourad, Tahiri Mohamed, Samdi Azzeddine
Abstract:
Casablanca is the biggest city in Morocco, concentrating more than 60% of the economic and industrial activity of the kingdom. Its building and public works (BTP) sector is the leading source of inert waste scattered in open areas. This inert waste is a major challenge for the city of Casablanca, as it is not properly managed and causes significant nuisance to the environment and the health of the population. Hence, the vision of our project is to recycle and valorize concrete waste. In this work, we present concrete results in the exploitation of this abundant and permanent deposit. Typical wastes are concrete, clay and concrete bricks, ceramic tiles, marble panels, gypsum, scrap metal, and wood. The work performed included geolocation combining artificial intelligence and Google Earth, estimation of the amount of waste per site, sorting, crushing, grinding, and physicochemical characterization of the samples. We then developed several types of substrates: lightweight cement, coatings, and ceramic adhesive. These products were tested and characterized by X-ray fluorescence, specific surface area, and resistance to bending and crushing, among other methods. We present the main results of our research in detail and describe the specific properties of each material developed.
Keywords: demolition and construction site waste, GIS combination software, inert waste valorization, coatings, lightweight cement, Casablanca
Procedia PDF Downloads 112
28871 Diversifying from Petroleum Products to Arable Farming as Source of Revenue Generation in Nigeria: A Case Study of Ondo West Local Government
Authors: A. S. Akinbani
Abstract:
Overdependence on petroleum is causing setbacks in the Nigerian economy. A field survey was carried out to assess the profitability and production of selected arable crops in six selected towns and villages of Ondo, southwestern Nigeria. Data were collected from 240 arable crop farmers, forty randomly selected in each location, with the aid of both primary and secondary sources, using oral interviews and structured questionnaires. The data were analyzed using both descriptive and inferential statistics. Of the farmers interviewed, 84 had no formal education, 72 had primary education, 50 had secondary education, and 38 had education beyond secondary level. The majority of the farmers hold less than 10 acres of land. The data showed that 192 farmers practiced mixed cropping, including mixtures of yam, cowpea, cocoyam, vegetables, cassava, and maize, while only 48 practiced monocropping. Among the sampled farmers, 93% agreed that arable production is profitable, while 7% disagreed. The findings show that managerial practices that conserve soil fertility and reduce labor costs, such as planting leguminous crops and applying herbicide instead of weeding with a hand-held hoe, should be encouraged. All respondents agreed that yam, cowpea, cocoyam, sweet potato, rice, maize, and vegetable production would solve the problem of hunger and raise the standard of living, compared with the petroleum products on which Nigeria has relied as a means of livelihood.
Keywords: farmers, arable crop, cocoyam, respondents, maize
Procedia PDF Downloads 251
28870 Awareness, Use and Searching Behavior of 'Virtua' Online Public Access Catalog Users
Authors: Saira Soroya, Khalid Mahmood
Abstract:
Library catalogs open the door to the library collection, and the OPAC (Online Public Access Catalog) is one of the services offered by automated libraries. The present study explores users' awareness, level of use, and searching behavior with the OPAC, with a view to suggesting ways to improve its user-friendly features. The population consisted of OPAC users at the Lahore University of Management Sciences (LUMS); a convenience sampling technique was used, with a total sample of 100 OPAC users. A quantitative research design based on the survey method was employed, the data collection instrument was adopted, and the data were analyzed using SPSS. Results revealed that a considerable number of users (30%) were not aware of the OPAC, and those who were aware used only its basic features. Lack of knowledge was the most frequent reason given for not using all the features of the OPAC. In this regard, it is strongly recommended that a compulsory information literacy programme be established.
Keywords: catalog, OPAC, library automation, usability study, university library
Procedia PDF Downloads 336
28869 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score
Authors: Jianfeng Hu
Abstract:
Personal authentication based on electroencephalography (EEG) signals is an important field in biometric technology, and more and more researchers have used EEG signals as a data source for biometrics, although EEG-based biometrics also have some disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures, sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were deployed as the feature set. In a silhouette calculation, the distances from each data point in a cluster to all other points within the same cluster, and to all data points in the closest cluster, are determined. Silhouettes thus provide a measure of how well a data point was classified when assigned to a cluster and of the separation between clusters, which renders them potentially well suited to assessing cluster quality in personal authentication methods. In this study, silhouette scores were used to assess the cluster quality of the k-means clustering algorithm and to compare the performance of each EEG dataset. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. Results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between electrodes for personal authentication (p<0.01); and (3) there was no significant difference in authentication performance among the feature sets (except for PE).
Conclusion: The combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.
Keywords: personal authentication, k-means clustering, electroencephalogram, EEG, silhouettes
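As background to the silhouette criterion described above, a minimal sketch of the per-point silhouette computation on invented two-dimensional points (the paper's EEG entropy features and 22-subject dataset are not reproduced here):

```python
# Illustrative silhouette computation on toy data; the EEG feature vectors
# from the study are replaced by invented 2-D points.
from math import dist

def silhouette_scores(points, labels):
    """Per-point silhouette s(i) = (b - a) / max(a, b)."""
    scores = []
    clusters = set(labels)
    for i, p in enumerate(points):
        same = [q for j, q in enumerate(points)
                if labels[j] == labels[i] and j != i]
        if not same:                 # singleton cluster: silhouette is 0
            scores.append(0.0)
            continue
        # a: mean distance to the other members of the same cluster
        a = sum(dist(p, q) for q in same) / len(same)
        # b: smallest mean distance to any other cluster
        b = min(
            sum(dist(p, q) for j, q in enumerate(points) if labels[j] == c)
            / labels.count(c)
            for c in clusters if c != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return scores

# Two well-separated "feature clusters" should score close to 1.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
lbl = [0, 0, 0, 1, 1, 1]
mean_s = sum(silhouette_scores(pts, lbl)) / len(pts)
```

The mean silhouette over a dataset is what allows the cluster quality of different electrode or feature-set choices to be compared.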
Procedia PDF Downloads 285
28868 Adaptability of Steel-Framed Industrialized Building System
Authors: Alireza Taghdiri, Sara Ghanbarzade Ghomi
Abstract:
Existing buildings are subject to constant change, continuously renovated and repaired over their long service lives. Old buildings are demolished and their materials and components are recycled or reused in new construction. In this process, sustainability principles are clearly important for building construction, and great significance must be attached to the consumption of resources, the resulting environmental effects and economic costs. Strategies that extend a building's service life and delay demolition have a positive effect on environmental protection. In addition, structures that are easier to alter or expand, and that reduce energy and natural resource consumption, benefit users, producers and the environment. To address these problems, the structural components of several conventional building systems were analyzed by applying open building theory, and a new geometry-adaptive building system was developed that can transform and support different imposed loads. To achieve this goal, various research methods and tools such as a review of the professional and scientific literature, comparative analysis, case study and computer simulation were applied, and the data were interpreted using descriptive statistics and logical argument. The hypothesis and proposed strategies were thus evaluated, and an adaptable, reusable two-dimensional building system was presented that can respond appropriately to the needs of dwellers and end users and allows the structural components of the building system to be reused in new construction or for new functions.
Investigations showed that this incremental building system can successfully meet the architectural design objectives, and that with small modifications to components and joints it is easy to obtain different, adaptable, load-optimized component alternatives for flexible spaces.
Keywords: adaptability, durability, open building, service life, structural building system
Procedia PDF Downloads 364
28867 55 dB High Gain L-Band EDFA Utilizing Single Pump Source
Authors: M. H. Al-Mansoori, W. S. Al-Ghaithi, F. N. Hasoon
Abstract:
In this paper, we experimentally investigate the performance of an efficient, high-gain, triple-pass L-band erbium-doped fiber (EDF) amplifier structure with a single pump source. The variation of amplifier gain and noise figure with EDF pump power, input signal power and wavelength was investigated. The backward amplified spontaneous emission (ASE) noise generated by the first amplifier stage is suppressed using a tunable band-pass filter. The amplifier achieves a signal gain of 55 dB with a low noise figure of 3.8 dB at -50 dBm input signal power. The gain shows a significant improvement of 12.8 dB compared with an amplifier structure without ASE suppression.
Keywords: optical amplifiers, EDFA, L-band, optical networks
Procedia PDF Downloads 348
28866 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova
Abstract:
The contamination of food by microbial agents is a common problem in the industry, especially in the processing of animal source products. Incorrect handling of the machinery or of the raw materials can reduce production or cause an epidemiological outbreak due to intoxication. To improve food product quality, different methods have been used to reduce, or at least slow, the growth of pathogens, especially spoilage, infectious or toxigenic bacteria. These methods usually rely on low temperatures and short processing times (abiotic agents), together with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that achieves bacterial control without damaging the final product. The objective of the present study is therefore to design a secondary mathematical model that predicts the impact of both the biotic and abiotic factors associated with animal source food processing. To accomplish this objective, the authors propose a three-dimensional differential equation model whose components are: bacterial growth; release, production and artificial incorporation of bacteriocins; and changes in the pH of the medium. All three dimensions are continuously influenced by the temperature of the medium. The model is then adapted to an idealized cross-contamination scenario in animal source food processing, with the animal product and the contact surface as the agents under study. Finally, stochastic simulations and a parametric sensitivity analysis are compared with reference data. The main finding from the analysis and simulation of the mathematical model is that, although bacterial growth can be halted at low temperatures, even lower temperatures are needed to eradicate it.
However, this can be not only expensive but also counterproductive for the quality of the raw materials, while higher temperatures accelerate bacterial growth. In other respects, bacteriocins are an effective alternative in the short and medium term. Moreover, a low pH is an indicator of bacterial growth, since many spoilage bacteria are lactic acid bacteria. Lastly, processing time is a secondary concern once the other agents mentioned above are under control. Our main conclusion is that adapting a mathematical model to the context of an industrial process can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing times. In addition, the proposed mathematical model provides a broadly applicable framework that can be replicated for non-meat food products, for other pathogens, or even for contamination by cross-contact with allergenic foods.
Keywords: bacteriocins, cross-contamination, mathematical model, temperature
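The abstract does not publish the model equations, but the three-dimensional structure it describes (bacterial growth, bacteriocin dynamics, and pH, all modulated by temperature) can be illustrated with a toy system; every functional form and every parameter value below is an invented stand-in, not the authors' model:

```python
# Hypothetical illustration only: the paper's equations are not given in the
# abstract, so the growth, kill and decay terms and all parameters here are
# invented. Forward-Euler integration of a toy (N, B, pH) system.
def simulate(T=10.0, bacteriocin_dose=0.0, hours=48, dt=0.1):
    """Integrate bacteria N, bacteriocin B and pH at temperature T (deg C)."""
    mu_max, K, kill = 0.5, 1e6, 0.8      # max growth rate, capacity, kill rate
    N, B, pH = 1e2, bacteriocin_dose, 6.5
    for _ in range(int(hours / dt)):
        mu = mu_max * max(T, 0.0) / 30.0            # crude temperature scaling
        dN = mu * N * (1 - N / K) - kill * B * N    # logistic growth - kill
        dB = -0.01 * B                              # bacteriocin decay
        dpH = -1e-8 * N                             # acidification by growth
        N = max(N + dN * dt, 0.0)
        B = max(B + dB * dt, 0.0)
        pH += dpH * dt
    return N, B, pH
```

Run side by side, the sketch reproduces the qualitative conclusions: lower temperature slows growth without eradicating the population, an initial bacteriocin dose suppresses it, and growth is accompanied by a pH drop.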
Procedia PDF Downloads 144
28865 Numerical Simulations of the Transition Flow of Model Propellers for Predicting Open Water Performance
Authors: Huilan Yao, Huaixin Zhang
Abstract:
Simulations of the transition flow over model propellers are important for predicting hydrodynamic performance and studying scale effects. In this paper, the transition flow of a model propeller under different loadings is simulated using the transition model provided by STAR-CCM+, and the influence of turbulence intensity (TI) on the transition, especially on the friction and pressure components of propeller performance, is studied. Beforehand, the transition model was applied to simulate the transition flow over a flat plate and an airfoil; the predicted transitions agree well with experimental results. The transition model was then applied to open-water propeller simulations and the influence of TI was studied. Under heavy and moderate loadings, the thrust and torque predicted by the transition model (at different TI) and by two turbulence models are very close and agree well with measurements. Under light loading, however, only the transition model with low TI predicts accurate results. Across these cases, the friction components of propeller performance predicted by the transition model differ noticeably with TI.
Keywords: transition flow, model propellers, hydrodynamic performance, numerical simulation
Procedia PDF Downloads 263
28864 Minimally Invasive Open Lumbar Discectomy with Nucleoplasty and Annuloplasty as a Technique for Effective Reduction of Both Axial and Radicular Pain
Authors: Wael Elkholy, Ashraf Sakr, Mahmoud Qandeel, Adam Elkholy
Abstract:
Lumbar disc herniation is a common pathology that may cause significant low back pain and radicular pain that can profoundly impair the daily activities of individuals. Patients who undergo surgical treatment for lumbar disc herniation usually present with radiculopathy along with low back pain (LBP), rather than radiculopathy alone. When discectomy is performed, radiating leg pain improves as the spinal nerve irritation is relieved. However, long-term LBP due to degenerative changes in the disc may occur postoperatively, and limited research has been reported on short-term (within 1 year) improvement in LBP after discectomy. In this study, we share our minimally invasive open technique for lumbar discectomy with annuloplasty and nucleoplasty as a means of effectively reducing both axial and radicular pain.
Keywords: nucleoplasty, sinuvertebral nerve cauterization, annuloplasty, discogenic low back pain, axial pain, radicular pain, minimally invasive lumbar discectomy
Procedia PDF Downloads 68
28863 The Critical Velocity and Heat of Smoke Outflow in Z-shaped Passage Fires Under Weak Stack Effect
Authors: Zekun Li, Bart Merci, Miaocheng Weng, Fang Liu
Abstract:
The Z-shaped passage, widely used in metro entrance/exit passageways, inclined mining laneways, and other applications, features steep slopes and a combination of horizontal and inclined sections. These characteristics lead to notable differences in airflow patterns and temperature distributions compared with conventional confined passages. In fires occurring within Z-shaped passages under natural ventilation with a weak stack effect, the induced airflow may be insufficient to fully confine smoke downstream of the fire source. This can cause smoke back-layering upstream, with the possibility of smoke escaping from the lower entrance located upstream of the fire; consequently, not all of the heat from the fire source contributes to the stack effect. This study combines theoretical analysis and fire simulations to examine the influence of various heat release rates (HRR), passage structures, and fire source locations on the induced airflow velocity driven by the stack effect. An empirical equation is proposed to quantify the strength of the stack effect under different conditions. In addition, predictive models are developed to determine the critical induced airflow and to estimate the heat carried by smoke escaping from the lower entrance of the passage.
Keywords: stack effect, critical velocity, heat outflow, numerical simulation
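The paper's own predictive model for Z-shaped passages is not reproduced in the abstract. As background, the classical Kennedy-type critical-velocity relation used for conventional tunnel ventilation can be sketched as a fixed-point iteration; the geometry and parameter values below are illustrative, not taken from the study:

```python
# Classical tunnel critical-velocity relation (Kennedy-type, as used in
# NFPA 502-style calculations) -- background only, NOT the paper's model
# for Z-shaped passages. H, A and the other parameters are illustrative.
def critical_velocity(Q_conv, H=2.5, A=10.0, T0=293.0,
                      rho=1.2, cp=1.0e3, g=9.81, Kg=1.0):
    """Solve Vc = Kg*(g*H*Q/(rho*cp*A*Tf))**(1/3) with
    Tf = Q/(rho*cp*A*Vc) + T0 by fixed-point iteration."""
    Vc = 1.0                                   # initial guess, m/s
    for _ in range(100):
        Tf = Q_conv / (rho * cp * A * Vc) + T0   # hot-layer temperature, K
        Vc_new = Kg * (g * H * Q_conv / (rho * cp * A * Tf)) ** (1.0 / 3.0)
        if abs(Vc_new - Vc) < 1e-9:
            break
        Vc = Vc_new
    return Vc
```

For a 1 MW convective heat release in this illustrative geometry the iteration converges to roughly 1.8 m/s, and the critical velocity grows (weakly, about as the cube root of HRR) with fire size, which is the behavior the paper's empirical equation refines for the Z-shaped case.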
Procedia PDF Downloads 8
28862 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analysing DNA microarray data sets is a great challenge for bioinformaticians because of the complications of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing values, which happens regularly, because these techniques cannot handle missing data. One of the most important analysis steps on a microarray data set is feature selection, which finds the genes most relevant to a given disease. In this paper, we introduce a technique for imputing the missing values in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics
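The abstract does not specify the imputation technique, so the sketch below uses column-mean imputation followed by a crude difference-of-class-means gene ranking as a stand-in; the expression matrix is invented:

```python
# Minimal sketch: impute missing microarray entries, then rank genes.
# Column-mean imputation and the difference-of-means filter are stand-ins
# for the (unspecified) technique in the paper; data are invented.
def impute_column_means(matrix):
    """Replace None entries with the mean of the observed values per column."""
    n_cols = len(matrix[0])
    means = []
    for j in range(n_cols):
        observed = [row[j] for row in matrix if row[j] is not None]
        means.append(sum(observed) / len(observed))
    return [[row[j] if row[j] is not None else means[j]
             for j in range(n_cols)] for row in matrix]

def rank_genes(matrix, labels):
    """Rank genes by |difference of class means| (a crude filter score)."""
    scores = []
    for j in range(len(matrix[0])):
        g0 = [row[j] for row, y in zip(matrix, labels) if y == 0]
        g1 = [row[j] for row, y in zip(matrix, labels) if y == 1]
        scores.append(abs(sum(g1) / len(g1) - sum(g0) / len(g0)))
    return sorted(range(len(scores)), key=lambda j: -scores[j])

# Gene 0 is informative (shifted between classes); gene 1 is noise.
X = [[1.0, 5.0], [1.2, None], [0.9, 5.1],     # class 0
     [4.0, 5.0], [None, 4.9], [4.1, 5.2]]     # class 1
y = [0, 0, 0, 1, 1, 1]
X_imp = impute_column_means(X)
ranking = rank_genes(X_imp, y)
```

After imputation the matrix has no missing entries and the planted informative gene ranks first, which is the behavior a joint imputation-plus-selection pipeline should preserve.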
Procedia PDF Downloads 574
28861 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework
Authors: Lutful Karim, Mohammed S. Al-kahtani
Abstract:
Sensors are used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data, so aggregating and filtering sensor data is essential to the design of an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. This paper therefore introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
Keywords: big data, clustering, tree topology, data aggregation, sensor networks
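The abstract does not detail the PDDA forwarding rule; one plausible sketch of a priority-based, dynamic aggregation policy at the sensor layer, in which urgent readings are forwarded immediately while routine readings are averaged per window, is:

```python
# Illustrative sketch of a priority-based aggregation policy at the sensor
# layer. The forwarding rule (urgent readings pass through, routine readings
# are averaged over a window) is an assumption, not the published PDDA scheme.
class PriorityAggregator:
    def __init__(self, window=4):
        self.window = window
        self.buffer = []                 # buffered routine readings

    def ingest(self, value, high_priority=False):
        """Return a packet to transmit now, or None if the value is buffered."""
        if high_priority:
            return ("urgent", value)     # forward immediately, no buffering
        self.buffer.append(value)
        if len(self.buffer) == self.window:   # window full: aggregate
            avg = sum(self.buffer) / self.window
            self.buffer.clear()
            return ("aggregate", avg)
        return None                      # suppressed for now, saving energy
```

Suppressing routine transmissions this way is what reduces radio traffic, and hence network energy consumption, at the lowest layer of the framework.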
Procedia PDF Downloads 346
28860 A PHREEQC Reactive Transport Simulation for Simply Determining Scaling during Desalination
Authors: Andrew Freiburger, Sergi Molins
Abstract:
Freshwater is a vital resource, yet the supply of clean freshwater is diminishing as a consequence of melting snow and ice from global warming, pollution from industry, and increasing demand from human population growth. This unsustainable trajectory is projected to jeopardize water security for billions of people in the 21st century. Membrane desalination technologies may resolve the growing discrepancy between supply and demand by filtering arbitrary feed water into a fraction of renewable, clean water and a fraction of highly concentrated brine. The leading hindrance to membrane desalination is fouling, whereby the highly concentrated brine encourages microbial colonization and/or the precipitation of occlusive minerals (i.e., scale) on the membrane surface. An understanding of brine formation is therefore necessary to mitigate membrane fouling and to develop efficacious desalination technologies that can bolster the supply of available freshwater. This study presents a reactive transport simulation of brine formation and scale deposition during reverse osmosis (RO) desalination. The simulation conceptually represents the RO module as a one-dimensional domain, where feed water enters the domain with a prescribed fluid velocity and is iteratively concentrated in the immobile layer of a dual-porosity model. The geochemical code PHREEQC numerically evaluated the conceptual model with parameters for the BW30-400 RO module and for real feed water sources, e.g., the Red and Mediterranean seas and produced waters from American oil wells, based on peer-reviewed data. The presented simulation is computationally simpler, and hence less resource intensive, than existing, more rigorous simulations of desalination phenomena such as TOUGHREACT. The end user may readily prepare input files and execute simulations on a personal computer with open source software.
The graphical results of fouling potential and brine characteristics may therefore be particularly useful as an initial tool for screening candidate feed water sources and/or informing the selection of an RO module.
Keywords: desalination, PHREEQC, reactive transport, scaling
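A back-of-envelope version of the fouling-potential idea, not the paper's PHREEQC calculation: gypsum saturation index versus RO recovery, assuming simple 1:1 concentration of the brine and a Davies activity model, with rough literature values for a seawater-like feed. The trend with recovery, rather than the absolute SI, is the point:

```python
# Rough scaling screen (NOT the paper's PHREEQC simulation): gypsum
# saturation index SI = log10(IAP/Ksp) as the brine concentrates with
# recovery. Feed concentrations, ionic strength and pKsp are approximate
# literature values; activity corrections use the Davies equation.
from math import log10, sqrt

def davies_log_gamma(z, ionic_strength, A=0.509):
    """Davies approximation for log10 of an ion's activity coefficient."""
    s = sqrt(ionic_strength)
    return -A * z * z * (s / (1.0 + s) - 0.3 * ionic_strength)

def gypsum_si(recovery, ca=0.0103, so4=0.0282, ionic_strength=0.7, pksp=4.58):
    """Gypsum (CaSO4.2H2O) saturation index in the concentrated brine."""
    cf = 1.0 / (1.0 - recovery)            # brine concentration factor
    I = ionic_strength * cf                # crude ionic-strength scaling
    lg = davies_log_gamma(2, I)            # both ions are divalent
    iap = (ca * cf * 10 ** lg) * (so4 * cf * 10 ** lg)
    return log10(iap) + pksp               # SI > 0 suggests scaling risk
```

Under these assumptions SI rises steadily with recovery, which is exactly the fouling-potential trend the reactive transport simulation resolves in far more detail along the module.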
Procedia PDF Downloads 136
28859 Application of Single Subject Experimental Designs in Adapted Physical Activity Research: A Descriptive Analysis
Authors: Jiabei Zhang, Ying Qi
Abstract:
The purpose of this study was to develop a descriptive profile of adapted physical activity research that uses single-subject experimental designs. All research articles using single-subject experimental designs published in the journal Adapted Physical Activity Quarterly from 1984 to 2013 served as the data source. Each article was coded into a subcategory of seven categories: (a) sample size; (b) age of participants; (c) type of disability; (d) type of data analysis; (e) type of design; (f) independent variable; and (g) dependent variable. Frequencies, percentages, and trend inspection were used to analyze the data and develop the profile. The resulting profile shows that a small portion of the research articles used single-subject designs and that, with a decreasing trend over the years, most researchers used a small sample, recruited children as participants, emphasized learning and behavior impairments, selected visual inspection with descriptive statistics, preferred a multiple-baseline design, focused on the effects of therapy, inclusion, and strategy, and most often measured desired behaviors.
Keywords: adapted physical activity research, single subject experimental designs, physical education, sport science
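The coding-and-tabulation workflow described above can be sketched as follows; the coded records are invented for illustration, not drawn from the study's actual article sample:

```python
# Sketch of the descriptive coding workflow: each article is coded into
# subcategories, then frequencies and percentages are tabulated per category.
# The records below are invented for illustration.
from collections import Counter

articles = [
    {"design": "multiple baseline", "analysis": "visual inspection"},
    {"design": "multiple baseline", "analysis": "visual inspection"},
    {"design": "reversal (ABAB)",  "analysis": "statistical test"},
]

def tabulate(records, category):
    """Return {subcategory: (frequency, percentage)} for one coding category."""
    counts = Counter(r[category] for r in records)
    total = sum(counts.values())
    return {k: (n, 100.0 * n / total) for k, n in counts.items()}

design_profile = tabulate(articles, "design")
```

Repeating `tabulate` over all seven categories, year by year, yields the frequencies, percentages and trends that make up the descriptive profile.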
Procedia PDF Downloads 466
28858 Improved Production, Purification and Characterization of Invertase from Penicillium lilacinum by Shaken Flask Technique of Submerged Fermentation
Authors: Kashif Ahmed
Abstract:
In recent years, researchers have been motivated to explore living organisms that can be utilized effectively under intense industrial conditions. The present study reports enhanced production, purification and characterization of an industrial enzyme, invertase (beta-D-fructofuranosidase), from Penicillium lilacinum. Various agriculture-based by-products (cotton stalk, sunflower waste, rice husk, molasses and date syrup) were used as the energy source. The highest amount of enzyme (13.05 U/mL) was produced when the strain was cultured on a growth medium containing date syrup as the energy source and yeast extract as the nitrogen source, after 96 h of incubation at 40 °C, with an initial medium pH of 8.0, an inoculum size of 6x10⁶ conidia, and an agitation rate of 200 rev/min. The enzyme was also purified (7-fold relative to the crude extract) and characterized. The molecular mass of the purified enzyme (65 kDa) was determined by 10% SDS-PAGE. A Lineweaver-Burk plot was used to determine the kinetic constants (Vmax 178.6 U/mL/min and Km 2.76 mM). The temperature and pH optima were 55 °C and 5.5, respectively. MnCl₂ (52.9%), MgSO₄ (48.9%), BaCl₂ (24.6%), MgCl₂ (9.6%), CoCl₂ (5.7%) and NaCl (4.2%) enhanced the relative activity of the enzyme, whereas HgCl₂ (-92.8%), CuSO₄ (-80.2%) and CuCl₂ (-76.6%) proved to be inhibitors. The strain showed enzyme activity even under extreme conditions of temperature (up to 60 °C) and pH (up to 9), so it can be used in industry.
Keywords: invertase, Penicillium lilacinum, submerged fermentation, industrial enzyme
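The Lineweaver-Burk determination of the kinetic constants can be sketched as follows, using synthetic noiseless rates generated from the reported values (Vmax = 178.6 U/mL/min, Km = 2.76 mM); the substrate concentrations are invented for illustration:

```python
# Recovering Km and Vmax from a Lineweaver-Burk (double-reciprocal) plot:
# 1/v = (Km/Vmax)*(1/S) + 1/Vmax. Rates are synthesized (noise-free) from
# the paper's reported constants; the substrate levels are invented.
VMAX_TRUE, KM_TRUE = 178.6, 2.76

S = [0.5, 1.0, 2.0, 4.0, 8.0]                     # substrate conc., mM
v = [VMAX_TRUE * s / (KM_TRUE + s) for s in S]    # Michaelis-Menten rates

x = [1.0 / s for s in S]                          # 1/[S]
y = [1.0 / vi for vi in v]                        # 1/v

# Ordinary least-squares line through the (1/S, 1/v) points
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx

vmax_est = 1.0 / intercept        # y-intercept = 1/Vmax
km_est = slope * vmax_est         # slope = Km/Vmax
```

With noise-free data the fit recovers the constants exactly; with real assay data the same two regression coefficients give the experimental Km and Vmax.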
Procedia PDF Downloads 150
28857 i2kit: A Tool for Immutable Infrastructure Deployments
Authors: Pablo Chico De Guzman, Cesar Sanchez
Abstract:
Microservice architectures are increasingly common in distributed cloud applications because of their advantages for software composition, development speed, release cycle frequency and business-logic time to market. On the other hand, these architectures also introduce challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing and data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos and Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, affecting running applications, specific expertise is required for ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input to i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers.
Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is passed to other microservices through an environment variable, providing service discovery. The i2kit toolkit reuses the best ideas from container technology to solve problems such as reproducible environments, process isolation and software distribution, while relying on mature, proven cloud vendor technology for networking, load balancing and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools offer more flexibility for resource allocation optimization, we argue that adding a new control layer brings more important disadvantages. Resource allocation is greatly improved by using linuxkit, which has a very small footprint (around 35 MB), and the system is more secure, since linuxkit installs only the minimum set of dependencies needed to run containers. The i2kit toolkit is currently under development at the IMDEA Software Institute.
Keywords: container, deployment, immutable infrastructure, microservice
Procedia PDF Downloads 179