Search results for: data reduction
27336 Effects of Spent Dyebath Recycling on Pollution and Cost of Production in a Cotton Textile Industry
Authors: Dinesh Kumar Sharma, Sanjay Sharma
Abstract:
The textile manufacturing industry uses a substantial amount of chemicals, not only in the production processes but also in manufacturing the raw materials. Dyes are the most significant raw material, providing colour to the fabric and yarn. Dyes are produced using large amounts of chemicals, both organic and inorganic in nature. Dyes are further classified as reactive or vat dyes, which are mostly used in cotton textiles. In the process of applying dyes to the cotton fiber, yarn or fabric, several auxiliary chemicals are also used in the solution, called the dyebath, to improve the absorption of the dyes. Only a small fraction of the dyes and auxiliary chemicals is absorbed, and a residual amount of all these substances is released as the spent dyebath effluent. Because of the wide variety of chemicals used in cotton textile dyes, there is always a risk of harmful effects which may not be apparent immediately but may have an irreversible impact in the long term. The colour imparted by the dyes to the water also has an adverse effect on its public acceptability and its potability. This study has been conducted with the objective of assessing the feasibility of reuse of the spent dyebath. Studies have been conducted in two independent industries manufacturing dyed cotton yarn and dyed cotton fabric, respectively, referred to as Unit-I and Unit-II. The studies included an assessment of the reduction in pollution levels and the economic benefits of such reuse. The study conclusively establishes that the reuse of the spent dyebath results in the prevention of pollution, a reduction in pollution loads and a reduction in the cost of effluent treatment and production. This pollution prevention technique presents a good proposition for pollution prevention in the cotton textile industry.
Keywords: dyes, dyebath, reuse, toxic, pollution, costs
Procedia PDF Downloads 396
27335 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network
Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi
Abstract:
Energy, delay and bandwidth are the prime issues of a wireless sensor network (WSN). Energy usage optimization and efficient bandwidth utilization are important issues in WSN. Event-triggered data aggregation facilitates such optimal tasks for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge of WSN. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSN that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from the sensor nodes within the cluster. (3) The cluster head node identifies and classifies the events out of the collected data using a Bayesian classifier. (4) Aggregation of the data is done using a statistical method. (5) The cluster head discovers the paths to the sink node using residual energy, path distance and bandwidth. (6) If the aggregated data is critical, the cluster head sends the aggregated data over multiple paths for reliable data communication. (7) Otherwise, the aggregated data is transmitted towards the sink node over the single path that has the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time and energy consumed for aggregation.
Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication
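The core of the scheme, cluster-head selection, statistical aggregation and path choice by residual energy and bandwidth, can be illustrated with a minimal sketch. The node fields, the scoring weights and the mean-based aggregation below are illustrative assumptions, not the authors' exact formulation:

```python
import random
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    residual_energy: float   # joules (assumed scale)
    bandwidth: float         # kbps (assumed scale)
    reading: float           # sensed value near the event

def select_cluster_head(nodes):
    # Assumed rule: the node with the highest residual energy leads the cluster.
    return max(nodes, key=lambda n: n.residual_energy)

def aggregate(nodes):
    # Statistical aggregation: report the mean of readings instead of every raw value.
    readings = [n.reading for n in nodes]
    return sum(readings) / len(readings)

def choose_paths(paths, critical):
    # Each path is (residual_energy, bandwidth, hop_distance); the weights are assumptions.
    def score(p):
        energy, bw, dist = p
        return 0.4 * energy + 0.4 * bw - 0.2 * dist
    ranked = sorted(paths, key=score, reverse=True)
    # Critical data goes over multiple paths, otherwise over the single best path.
    return ranked[:2] if critical else ranked[:1]

cluster = [Node(i, random.uniform(1, 5), random.uniform(50, 250), random.gauss(30, 2))
           for i in range(8)]
head = select_cluster_head(cluster)
value = aggregate(cluster)
routes = choose_paths([(4.2, 180, 3), (3.1, 240, 2), (2.5, 90, 4)], critical=value > 35)
print(f"cluster head {head.node_id}, aggregated value {value:.2f}, routes used {routes}")
```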
Procedia PDF Downloads 453
27334 Investigation of Leptospira Infection in Stray Animals in Thailand: Leptospirosis Risk Reduction in Human
Authors: Ruttayaporn Ngasaman, Saowakon Indouang, Usa Chethanond
Abstract:
Leptospirosis is a zoonosis of public health concern in Thailand. Humans and animals are often infected by contact with contaminated water. Infected animals play an important role in leptospira infection for both humans and other hosts via their urine. In humans, it can cause a wide range of symptoms, some of which may present as mild flu-like symptoms including fever, vomiting, and jaundice. Without treatment, leptospirosis can lead to kidney damage, meningitis, liver failure, respiratory distress, and even death. The prevalence of leptospirosis in stray animals in Thailand is unknown. The aim of this study was to investigate leptospira infection in stray animals, including dogs and cats, in Songkhla province, Thailand. A total of 434 blood samples were collected from 370 stray dogs and 64 stray cats during the population control program from 2014 to 2018. A screening test using latex agglutination for the detection of antibodies against Leptospira interrogans in serum samples showed 29.26% (127/434) positive: 120 positive samples from stray dogs and 7 positive samples from stray cats. Detection by polymerase chain reaction specific to the LipL32 gene of Leptospira interrogans showed 1.61% (7/434) positive, with stray cats (5/64) showing a higher prevalence than stray dogs (2/370). Although active infection was detected at a low rate, seroprevalence was high. This result indicates that the stray animals were not actively infected at the time of sample collection but had been infected previously or were in a latent period of infection. They may act as a reservoir for domestic animals and humans that share the same environment. In order to prevent and reduce the risk of leptospira infection in humans, stray animals should receive health checks, vaccination, and disease treatment.
Keywords: leptospirosis, stray animals, risk reduction, Thailand
Procedia PDF Downloads 135
27333 Offshore Outsourcing: Global Data Privacy Controls and International Compliance Issues
Authors: Michelle J. Miller
Abstract:
In recent years, there has been a rise of two emerging issues that impact the global employment and business market and that the legal community must review more closely: offshore outsourcing and data privacy. These two issues intersect because employment opportunities are shifting due to offshore outsourcing, and in some states, like the United States, anti-outsourcing legislation has been passed or presented to retain jobs within the country. In addition, the legal requirement to retain the privacy of data as a global employer extends to employees and third-party service providers, including services outsourced to offshore locations. For this reason, this paper will review the intersection of these two issues with a specific focus on data privacy.
Keywords: outsourcing, data privacy, international compliance, multinational corporations
Procedia PDF Downloads 412
27332 Weighted Data Replication Strategy for Data Grid Considering Economic Approach
Authors: N. Mansouri, A. Asadi
Abstract:
A data grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which is determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue and the distance between nodes. Simulation results utilizing OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage and storage resource usage.
Keywords: data grid, data replication, simulation, replica selection, replica placement
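The replica-selection step described above can be sketched as a simple cost model: each candidate site is scored by an estimated response time built from transfer time, storage latency, queue delay and distance. The field names and the additive cost formula below are assumptions made for illustration; ELALW's exact weighting is defined in the paper itself:

```python
from dataclasses import dataclass

@dataclass
class ReplicaSite:
    name: str
    bandwidth_mbps: float      # link bandwidth to the requesting site
    storage_latency_s: float   # storage access latency
    queue_length: int          # replica requests waiting in the storage queue
    hops: int                  # network distance between nodes

def estimated_response_time(site, file_size_mb, per_request_s=0.5, per_hop_s=0.05):
    # Assumed additive model: transfer time + storage latency + queue delay + distance penalty.
    transfer = file_size_mb * 8.0 / site.bandwidth_mbps
    queue_delay = site.queue_length * per_request_s
    return transfer + site.storage_latency_s + queue_delay + site.hops * per_hop_s

def select_best_replica(sites, file_size_mb):
    return min(sites, key=lambda s: estimated_response_time(s, file_size_mb))

sites = [
    ReplicaSite("SiteA", bandwidth_mbps=100, storage_latency_s=0.2, queue_length=4, hops=2),
    ReplicaSite("SiteB", bandwidth_mbps=622, storage_latency_s=0.4, queue_length=9, hops=5),
    ReplicaSite("SiteC", bandwidth_mbps=155, storage_latency_s=0.1, queue_length=1, hops=3),
]
best = select_best_replica(sites, file_size_mb=500)
print("best replica location:", best.name)
```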
Procedia PDF Downloads 261
27331 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain
Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz
Abstract:
Rainfall is a crucial data source for very different disciplines such as agriculture, hydrology and climate. Therefore, the rain rate should be well known, both spatially and temporally, for any area. Rainfall has traditionally been measured using rain gauges at meteorological ground stations for many years. At the present time, rainfall products are acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data with respect to rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products. We used the data for rainy days from four stations located within range of the radar from October 2013 to November 2015. In the study, we examined the two rainfall products over the Seyhan plain, and the correlation between the rain-gauge data and the two raster rainfall products was observed to be low.
Keywords: meteosat, radar, rainfall, rain-gauge, Turkey
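The evaluation step, accumulating hourly RN1/CRR values into daily totals and correlating them with gauge observations, can be sketched as below. The column names and the CSV layout are assumptions for illustration; only the aggregate-then-correlate logic follows the abstract:

```python
import pandas as pd

# Assumed layout: one row per hour with station id, timestamp and the three rainfall estimates (mm).
hourly = pd.read_csv("seyhan_hourly_rainfall.csv", parse_dates=["timestamp"])

# Accumulate hourly products into daily totals per station.
daily = (hourly
         .assign(date=hourly["timestamp"].dt.date)
         .groupby(["station", "date"])[["gauge_mm", "rn1_mm", "crr_mm"]]
         .sum()
         .reset_index())

# Keep only rainy days as observed by the gauge.
rainy = daily[daily["gauge_mm"] > 0]

# Pearson correlation of each product against the rain-gauge reference.
print("RN1 vs gauge:", rainy["gauge_mm"].corr(rainy["rn1_mm"]).round(3))
print("CRR vs gauge:", rainy["gauge_mm"].corr(rainy["crr_mm"]).round(3))
```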
Procedia PDF Downloads 329
27330 Disaster Preparedness for People with Disabilities through EPPO's Educational Awareness Initiative
Authors: A. Kourou, A. Ioakeimidou, E. Pelli, M. Panoutsopoulou, V. Abramea
Abstract:
Worldwide, there is a growing recognition that education is a critical component of any disaster impact reduction effort, and a great challenge too. Given this challenge, a broad range of awareness-raising projects at all levels are implemented and continuously evaluated by the Earthquake Planning and Protection Organization (EPPO). This paper presents an overview of the EPPO educational initiative (seminars, lectures, workshops, campaigns and educational material) and its evaluation results. The abovementioned initiative is focused on raising public awareness, training teachers and civil protection staff, informing students and educating people with disabilities on subjects related to earthquake risk reduction. A better understanding of how human activity can be linked to disaster, and of what can be done at the individual, family or workplace level to contribute to seismic risk reduction, are the main issues of EPPO projects. Survey results revealed that a high percentage of teachers (including those of special schools) from all over the country have taken the appropriate preparedness measures at their schools. On the other hand, the implementation of earthquake preparedness measures at various workplaces (kindergartens, banks, utilities, etc.) still has significant room for improvement. Results show that employees in banks and public utilities have substantially higher rates of preventive and preparedness actions in their workplaces than workers in kindergartens and other workplaces. One of the EPPO educational priorities is to enhance the earthquake preparedness of people with disabilities. Booklets, posters and applications have been created with the financial support of the Council of Europe, addressed to people who have mobility impairments, learning difficulties or cognitive (or intellectual) disabilities. Part of the educational material was developed using the 'easy-to-read' method and the Makaton language program with the collaboration of experts in special needs education and teams of people with cognitive disabilities. Furthermore, earthquake safety seminars and earthquake drills have been implemented in order to develop children's, parents' and teachers' abilities and skills in earthquake impact reduction. To enhance the abovementioned efforts, EPPO is a partner in prevention and preparedness projects supported by the EU Civil Protection Financial Instrument. One of them is the E-PreS project (Monitoring and Evaluation of Natural Hazard Preparedness at School Environment). The main objectives of the E-PreS project are: 1) to create smart tools which define, simulate and evaluate drill procedures at schools, centers of vocational training of people with disabilities, or other workplaces, and 2) to involve students or adults with disabilities in the E-PreS system evacuation procedure in case of earthquake, flood, or volcanic occurrence. Two other EU projects (the RACCE educational kit and the EVANDE educational platform) also aim at contributing to raising awareness among people with disabilities, students, teachers, volunteers, etc. It is worth mentioning that even though many efforts have been made in Greece so far to build awareness of earthquakes and establish preparedness for prospective earthquakes, there are still actions to be taken.
Keywords: earthquake, emergency plans, E-PreS project, people with disabilities, special needs education
Procedia PDF Downloads 267
27329 A Phytochemical and Biological Study of Viscum schemperi Engl. Growing in Saudi Arabia
Authors: Manea A. I. Alqrad, Alaa Sirwi, Sabrin R. M. Ibrahim, Hossam M. Abdallah, Gamal A. Mohamed
Abstract:
Phytochemical study of the methanolic extract of the air-dried powdered parts of Viscum schemperi Engl. (family: Viscaceae) using different chromatographic techniques led to the isolation of five compounds: -amyrenone (1), betulinic acid (2), (3β)-olean-12-ene-3,23-diol (3), -oleanolic acid (4), and α-oleanolic acid (5). Their structures were established based on physical, chemical, and spectral data. The anti-inflammatory and anti-apoptotic activities of oleanolic acid in a mouse model of acute hepatorenal damage were assessed. This study showed the efficacy of oleanolic acid in counteracting thioacetamide-induced hepatic and kidney injury in mice through the reduction of hepatocyte oxidative damage and the suppression of inflammation and apoptosis. More importantly, oleanolic acid suppressed thioacetamide-induced hepatic and kidney injury by inhibiting NF-κB/TNF-α-mediated inflammation/apoptosis and enhancing the SIRT1/Nrf2/heme-oxygenase signalling pathway. These promising pharmacological activities suggest the potential use of oleanolic acid against hepatorenal damage.
Keywords: oleanolic acid, Viscum schimperi, thioacetamide, SIRT1/Nrf2/NF-κB, hepatorenal damage
Procedia PDF Downloads 100
27328 Spatial Data Mining by Decision Trees
Authors: Sihem Oujdi, Hafida Belbachir
Abstract:
Existing data mining methods cannot be applied directly to spatial data because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, which are one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar works have been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, favors memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial index join that contains the possible spatial relationships between the objects in the target table and those in the neighbor table. The proposed algorithms are then applied to a spatial data set in the accidentology domain. A comparative study of our approach with other works on classification by spatial decision trees is also detailed.
Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining
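The join-materialization variant can be illustrated as follows: spatial relationships from the index are materialized as extra attributes of the target table, and an ordinary decision tree is then trained on the joined table. The table/column names and the use of scikit-learn's DecisionTreeClassifier (a CART-style learner rather than C4.5 itself) are assumptions for illustration:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Assumed inputs: a target table, a neighbor table and a spatial index join that lists
# which neighbor objects are related (e.g. 'on', 'near') to each target object.
target = pd.DataFrame({"accident_id": [1, 2, 3, 4],
                       "severity": ["high", "low", "low", "high"]})
neighbor = pd.DataFrame({"road_id": [10, 11, 12],
                         "road_type": ["highway", "urban", "rural"]})
spatial_join = pd.DataFrame({"accident_id": [1, 2, 3, 4],
                             "road_id": [10, 11, 12, 10],
                             "relation": ["on", "near", "on", "on"]})

# Join materialization: fold the spatial relationships into one flat training table.
materialized = (spatial_join.merge(target, on="accident_id")
                            .merge(neighbor, on="road_id"))

X = pd.get_dummies(materialized[["relation", "road_type"]])
y = materialized["severity"]

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(tree.predict(X))
```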
Procedia PDF Downloads 617
27327 Investigation of Extreme Gradient Boosting Model Prediction of Soil Strain-Shear Modulus
Authors: Ehsan Mehryaar, Reza Bushehri
Abstract:
One of the principal parameters defining the dynamic response of clay soil is the strain-shear modulus relation. Predicting the strain and, subsequently, the shear modulus reduction of the soil is essential for the performance analysis of structures exposed to earthquake and dynamic loadings. Many soil properties affect the soil's dynamic behavior. In order to capture those effects, in this study, a database containing 1193 data points, consisting of maximum shear modulus, strain, moisture content, initial void ratio, plastic limit, liquid limit and initial confining pressure, resulting from dynamic laboratory testing of 21 clays, is collected for predicting the shear modulus vs. strain curve of soil. A model based on the extreme gradient boosting technique is proposed. A tree-structured Parzen estimator hyper-parameter tuning algorithm is utilized simultaneously to find the best hyper-parameters for the model. The performance of the model is compared to existing empirical equations using the coefficient of correlation and the root mean square error.
Keywords: XGBoost, hyper-parameter tuning, soil shear modulus, dynamic response
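A minimal sketch of this workflow, gradient boosting regression tuned with a tree-structured Parzen estimator, is shown below using xgboost and hyperopt. The synthetic data, feature layout, search space and evaluation split are assumptions; the actual database and tuning ranges are those described in the paper:

```python
import numpy as np
import xgboost as xgb
from hyperopt import fmin, tpe, hp, Trials
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Assumed feature matrix: [max shear modulus, strain, moisture, void ratio, PL, LL, confining pressure]
rng = np.random.default_rng(0)
X = rng.random((1193, 7))
y = 1.0 / (1.0 + 20.0 * X[:, 1]) + 0.05 * rng.standard_normal(1193)  # synthetic G/Gmax-like target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

def objective(params):
    model = xgb.XGBRegressor(n_estimators=int(params["n_estimators"]),
                             max_depth=int(params["max_depth"]),
                             learning_rate=params["learning_rate"])
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te))

space = {"n_estimators": hp.quniform("n_estimators", 100, 800, 50),
         "max_depth": hp.quniform("max_depth", 2, 8, 1),
         "learning_rate": hp.loguniform("learning_rate", np.log(0.01), np.log(0.3))}

# Tree-structured Parzen estimator search over the hyper-parameter space.
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=30, trials=Trials())
print("best hyper-parameters:", best)
```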
Procedia PDF Downloads 205
27326 Data-Driven Dynamic Overbooking Model for Tour Operators
Authors: Kannapha Amaruchkul
Abstract:
We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In an economic-based policy, we want to minimize the expected oversold and underused cost, whereas in a service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy in 2019. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator
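A service-based policy of this kind can be sketched with a simple Monte Carlo search: simulate group-level cancellations for a candidate booking limit and accept the largest limit whose oversold probability stays below the threshold. The cancellation probabilities by group size, the capacity and the threshold below are illustrative assumptions rather than the paper's estimates:

```python
import random

CAPACITY = 40                 # seats on the tour (assumed)
THRESHOLD = 0.05              # maximum allowed probability of an oversold situation (assumed)
CANCEL_PROB = {1: 0.20, 2: 0.12, 3: 0.08, 4: 0.05}   # assumed cancellation rate by group size

def oversold_probability(booking_limit, n_sims=5000):
    oversold = 0
    for _ in range(n_sims):
        seats_booked, show_ups = 0, 0
        while seats_booked < booking_limit:
            group = random.choices([1, 2, 3, 4], weights=[0.2, 0.5, 0.2, 0.1])[0]
            seats_booked += group
            if random.random() > CANCEL_PROB[group]:   # the whole group either shows up or cancels
                show_ups += group
        if show_ups > CAPACITY:
            oversold += 1
    return oversold / n_sims

# Service-based policy: largest booking limit whose oversold probability stays under the threshold.
limit = CAPACITY
while oversold_probability(limit + 1) <= THRESHOLD:
    limit += 1
print("recommended booking limit:", limit)
```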
Procedia PDF Downloads 135
27325 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria
Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu
Abstract:
The research work is based on the statistical analysis of processing data. The essence is to analyze the data statistically and to generate a design model for the production mix of soap manufacturing products in the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. The statistical analysis shows the correlation of the data. A T-test, partial correlation and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the correlation of the variables shows that the R² is 98.7%. The results confirm that the data are fit for further analysis and modeling, as proved by the correlation and the R-squared.
Keywords: General Linear Model, correlation, variables, Pearson, significance, T-test, soap, production mix and statistics
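The kind of model described, a linear production-mix model with correlation diagnostics and R², can be sketched with statsmodels. The variable names and the synthetic data are assumptions standing in for the company's processing records:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Assumed processing records: raw-material inputs (kg) per batch and the soap yield obtained.
rng = np.random.default_rng(1)
data = pd.DataFrame({
    "oil": rng.uniform(100, 200, 60),
    "caustic_soda": rng.uniform(20, 40, 60),
    "additives": rng.uniform(5, 15, 60),
})
data["yield_kg"] = 0.7 * data["oil"] + 1.5 * data["caustic_soda"] + rng.normal(0, 5, 60)

# Pearson (bivariate) correlations between the production-mix variables and the yield.
print(data.corr(method="pearson").round(3))

# General linear model of yield on the production mix; the summary reports t-tests and R-squared.
X = sm.add_constant(data[["oil", "caustic_soda", "additives"]])
model = sm.OLS(data["yield_kg"], X).fit()
print(model.summary())
```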
Procedia PDF Downloads 448
27324 Environmental Protection by Optimum Utilization of Car Air Conditioners
Authors: Sanchita Abrol, Kunal Rana, Ankit Dhir, S. K. Gupta
Abstract:
According to N.R.E.L.'s findings, 700 crore gallons of petrol is used annually to run the air conditioners of passenger vehicles (nearly 6% of total fuel consumption in the USA). Beyond fuel use, the Environmental Protection Agency reported that refrigerant leaks from auto air conditioning units add an additional 5 crore metric tons of carbon emissions to the atmosphere each year. The objective of our project is to deal with this vital issue by carefully modifying the interiors of a car, thereby increasing its mileage and the efficiency of its engine. This would consequently result in a decrease in tail emissions and generated pollution, along with improved car performance. An automatic mechanism, deployed between the front and the rear seats and consisting of a transparent thermal insulating sheet/curtain, would roll down as per the requirement of the driver in order to optimize the volume for effective air conditioning when travelling alone or with one passenger. The reduction in effective volume will yield favourable results. Even on a mild sunny day, the temperature inside a parked car can quickly spike to life-threatening levels. For a stationary parked car, insulation would be provided beneath its metal body so as to reduce the rate of heat transfer and increase the transmissivity. As a result, the car would not require a large amount of air conditioning to maintain a lower temperature, which would provide similar benefits. The authors established feasibility studies, system engineering, and primarily theoretical and experimental results confirming the idea, together with the motivation to fabricate and test the actual product.
Keywords: automation, car, cooling insulating curtains, heat optimization, insulation, reduction in tail emission, mileage
Procedia PDF Downloads 280
27323 Design and Development of Tandem Dynamometer for Testing and Validation of Motor Performance Parameters
Authors: Vedansh More, Lalatendu Bal, Ronak Panchal, Atharva Kulkarni
Abstract:
The project aims at developing a cost-effective test bench capable of testing and validating the complete powertrain package of an electric vehicle. The Emrax 228 high-voltage synchronous motor was selected as the prime mover for the study. A tandem-type dynamometer comprising two loading methods was developed: inertial, using standard inertia rollers, and absorptive, using a separately excited DC generator with resistive coils. The absorptive loading of the prime mover was achieved by implementing a converter circuit through which the duty cycle of the input field voltage was controlled. This control was efficacious in changing the magnetic flux and hence the generated voltage, which was ultimately dropped across resistive coils assembled in a load bank in an all-parallel configuration. The prime mover and loading elements were connected via a chain drive with a 2:1 reduction ratio, which allows flexibility in the placement of components and a relaxed rating of the DC generator. The development will aid in the determination of essential characteristics such as torque-RPM, power-RPM, torque factor, RPM factor, heat loads of devices and battery pack state-of-charge efficiency, and also provides a significant financial advantage over existing versions of dynamometers as a cost-effective solution.
Keywords: absorptive load, chain drive, chordal action, DC generator, dynamometer, electric vehicle, inertia rollers, load bank, powertrain, pulse width modulation, reduction ratio, road load, testbench
Procedia PDF Downloads 235
27322 Inter-Generational Benefits of Improving Access to Justice for Women: Evidence from Peru
Authors: Iva Trako, Maris Micaela Sviatschi, Guadalupe Kavanaugh
Abstract:
Domestic violence is a major concern in developing countries, with important social, economic and health consequences. However, institutions do not usually address the problems facing women or ethnic and religious minorities. For example, the police do very little to stop domestic violence in rural areas of developing countries. This paper exploits the introduction of women's justice centers (WJCs) in Peru to provide causal estimates of the effects of improving access to justice for women and children. These centers offer a new integrated public service model for women by including medical, psychological and legal support in cases of violence against women. Our empirical approach uses a difference-in-differences estimation exploiting variation over time and space in the opening of WJCs, together with province-by-year fixed effects. Exploiting administrative data from health providers and district attorney offices, we find that after the opening of these centers, there are important improvements in women's welfare: a large reduction in femicides and in female hospitalizations for assault. Moreover, using geo-coded household surveys, we find evidence that the existence of these services reduces domestic violence, improves women's health, increases women's threat points and, therefore, leads to household decisions that are more aligned with their interests. Using administrative data on the universe of schools, we find large gains in human capital for their children: affected children are more likely to enroll, attend school and have better grades in national exams, instead of working for the family. In sum, the evidence in this paper shows that providing access to justice for women can be a powerful tool to reduce domestic violence and increase the education of children, suggesting a positive inter-generational benefit.
Keywords: access to justice, domestic violence, education, household bargaining
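The empirical strategy, a difference-in-differences regression with district fixed effects and province-by-year fixed effects, can be sketched with statsmodels formula estimation. The variable names, the synthetic panel and the clustering choice below are assumptions; the actual specification and data are those of the paper:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Assumed district-by-year panel: an outcome (e.g. hospitalizations for assault per 1,000 women),
# an indicator for whether a women's justice center (WJC) has opened, and geography/time identifiers.
rng = np.random.default_rng(2)
n = 600
panel = pd.DataFrame({
    "district": rng.integers(0, 60, n),
    "province": rng.integers(0, 10, n),
    "year": rng.integers(2006, 2014, n),
    "wjc_open": rng.integers(0, 2, n),
})
panel["outcome"] = 5 - 0.8 * panel["wjc_open"] + rng.normal(0, 1, n)

# Difference-in-differences: treatment indicator plus district and province-by-year fixed effects,
# with standard errors clustered by district.
model = smf.ols("outcome ~ wjc_open + C(district) + C(province):C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["district"]})
print("WJC effect:", round(model.params["wjc_open"], 3),
      "SE:", round(model.bse["wjc_open"], 3))
```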
Procedia PDF Downloads 186
27321 Helping the Development of Public Policies with Knowledge of Criminal Data
Authors: Diego De Castro Rodrigues, Marcelo B. Nery, Sergio Adorno
Abstract:
The project aims to develop a framework for social data analysis, particularly by mobilizing criminal records and applying descriptive computational techniques, such as association algorithms and the extraction of decision tree rules, among others. The methods and instruments discussed in this work will enable the discovery of patterns, providing a guided means to identify similarities between recurring situations in the social sphere using descriptive techniques and data visualization. The study area has been defined as the city of São Paulo, with the structuring of social data as the central idea and a particular focus on the quality of the information. Given this, a set of tools will be validated, including the use of a database and tools for visualizing the results. Among the main deliverables related to products and the development of articles are the discoveries made during the research phase. The effectiveness and utility of the results will depend on studies involving real data, validated both by domain experts and by identifying and comparing the patterns found in this study with other phenomena described in the literature. The intention is to contribute to evidence-based understanding and decision-making in the social field.
Keywords: social data analysis, criminal records, computational techniques, data mining, big data
Procedia PDF Downloads 86
27320 Evaluation of Simulated Noise Levels through the Analysis of Temperature and Rainfall: A Case Study of Nairobi Central Business District
Authors: Emmanuel Yussuf, John Muthama, John Ng'ang'A
Abstract:
Noise levels have been increasing all over the world in the last decade. Many factors contribute to this increase, which is causing health-related effects in humans. Developing countries are not left out of the whole picture, as they are still growing and advancing their development. Motor vehicles are increasing on urban roads; there is an increase in infrastructure due to the rising population, an increasing number of industries providing goods, and many other activities. All these activities lead to high noise levels in cities. This study was conducted in Nairobi's Central Business District (CBD) with the main objective of simulating noise levels in order to understand the noise to which people within the urban area are exposed, in relation to weather parameters, namely temperature, rainfall and wind field. The study was carried out using the Neighbourhood Proximity Model and time series analysis, with data obtained from proxies/remotely sensed from satellites, in order to establish the noise levels to which people in the Nairobi CBD are exposed. The findings showed an increase in temperature (0.1°C per year) and a decrease in precipitation (40 mm per year), alongside noise levels in the area that are increasing. The study also found that the noise levels to which people in the Nairobi CBD are exposed were roughly between 61 and 63 decibels and have been increasing, a level which is high and likely to cause adverse physical and psychological effects on the human body, with air temperature, precipitation and wind contributing greatly to the spread of noise. As noise reduction measures, the use of soundproof materials in buildings close to busy roads, the implementation of strict laws on the most emitting sources, as well as further research on the study, were recommended. The data used for this study ranged from the year 2000 to 2015, rainfall being in millimeters (mm), temperature in degrees Celsius (°C) and the urban form characteristics being in meters (m).
Keywords: simulation, noise exposure, weather, proxy
Procedia PDF Downloads 380
27319 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization
Authors: Angad Arora
Abstract:
In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing or special purpose machines). For qualifying a typical manufacturing line/cell/new process, ideally we need a sample of parts that can be flown through the process, after which we make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, such as in the mobile phone industry, the actual cells or lines can run into the thousands, and qualifying each one of them with statistical confidence means utilizing samples that are very large, which eventually adds to the product/manufacturing cost and creates huge waste if the parts are not intended to be shipped to customers. To solve this, we come up with a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we saw those samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and came up with a two-step binomial testing approach. Further, we also prove through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
Keywords: statistics, data science, manufacturing process qualification, production planning
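A two-step binomial scheme of this kind can be sketched as follows: flow a small first sample, stop early if the observed defects already make the target yield statistically implausible, and otherwise add a second sample and test the combined result. The sample sizes, target yield and significance level below are illustrative assumptions, not the paper's values:

```python
from scipy.stats import binom

TARGET_YIELD = 0.98      # yield the cell must demonstrate (assumed)
ALPHA = 0.05             # significance level (assumed)

def stage_fails(defects, n):
    # Reject the cell if observing this many (or more) defects is very unlikely at the target yield.
    p_value = binom.sf(defects - 1, n, 1 - TARGET_YIELD)   # P(X >= defects | p = 1 - yield)
    return p_value < ALPHA

def qualify_cell(defects_stage1, n1=30, defects_stage2=None, n2=50):
    if stage_fails(defects_stage1, n1):
        return "reject early - no need for additional samples"
    if defects_stage2 is None:
        return "inconclusive - flow the second, larger sample"
    # Combine both stages and test the pooled evidence.
    if stage_fails(defects_stage1 + defects_stage2, n1 + n2):
        return "reject after stage 2"
    return "qualify the cell/line"

print(qualify_cell(defects_stage1=4))                     # bad parts seen early -> stop
print(qualify_cell(defects_stage1=0, defects_stage2=1))   # pooled evidence supports qualification
```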
Procedia PDF Downloads 99
27318 Optimization of Real Time Measured Data Transmission, Given the Amount of Data Transmitted
Authors: Michal Kopcek, Tomas Skulavik, Michal Kebisek, Gabriela Krizanova
Abstract:
The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g., temperature, humidity, dose rate, etc. This paper describes a proposal for and optimization of the communication that takes place in a teledosimetric system between the central control server, responsible for data processing and storage, and the decentralized measuring stations, which measure the physical variables. Analyses of the ongoing communication were performed, and consequently the system architecture and communication were optimized.
Keywords: communication protocol, transmission optimization, data acquisition, system architecture
Procedia PDF Downloads 522
27317 Age and Population Structure of the Goby Parapocryptes Serperaster in the Mekong Delta, Vietnam, Based on Length-Frequency and Otolith Analyses
Authors: Quang Minh Dinh, Jian Guang Qin, Sabine Dittmann, Dinh Dac Tran
Abstract:
The age and population structure of the goby Parapocryptes serperaster were studied using length distributions, otoliths and the von Bertalanffy growth model in the Mekong Delta over a whole year of monthly sampling. The sex ratio of P. serperaster was near 1:1, and the von Bertalanffy growth parameters were L∞ = 25.2 cm, K = 0.74 yr-1, and t0 = -0.22 yr-1. Fish size at first entry to the fishery was 14.6 cm, and fishing mortality (1.57 yr-1) and natural mortality (1.51 yr-1) accounted for 51% and 49% of the total mortality (3.07 yr-1), respectively. Relative yield-per-recruit and biomass-per-recruit analyses revealed the levels of maximum exploitation yield (Emax = 0.83), maximum economic yield (E0.1 = 0.71) and the yield at 50% reduction of exploitation (E0.5 = 0.37). Otoliths from 164 female and 196 male gobies were readable, and the otolith morphometry data were used for age identification. The mean ages estimated by reading otolith annual rings and by analysing the length-frequency distribution were consistent. This study shows that otolith morphometry is a reliable method for ageing this goby and is possibly also applicable to other tropical gobies. The fishery analysis indicates that this goby stock has not been overexploited in the Mekong Delta.
Keywords: Parapocryptes serperaster, otolith, age, population structure, Vietnam
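With the reported von Bertalanffy parameters, the predicted length-at-age and the mortality budget can be reproduced directly. The sketch below only reuses the values quoted in the abstract (L∞ = 25.2 cm, K = 0.74 yr-1, t0 = -0.22 yr-1, F = 1.57 yr-1, M = 1.51 yr-1); the ages chosen for printing are an arbitrary illustration:

```python
import math

L_INF, K, T0 = 25.2, 0.74, -0.22     # von Bertalanffy parameters from the study
F, M = 1.57, 1.51                    # fishing and natural mortality (per year)

def vbgf_length(age_years):
    """Von Bertalanffy growth: L(t) = L_inf * (1 - exp(-K * (t - t0)))."""
    return L_INF * (1.0 - math.exp(-K * (age_years - T0)))

Z = F + M
print(f"total mortality Z = {Z:.2f} /yr, exploitation rate E = F/Z = {F / Z:.2f}")

for age in [0.5, 1.0, 2.0, 3.0]:     # illustrative ages
    print(f"age {age:.1f} yr -> predicted length {vbgf_length(age):.1f} cm")
```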
Procedia PDF Downloads 656
27316 Evaluation of the Impact of Reducing the Traffic Light Cycle for Cars to Improve Non-Vehicular Transportation: A Case of Study in Lima
Authors: Gheyder Concha Bendezu, Rodrigo Lescano Loli, Aldo Bravo Lizano
Abstract:
In big urbanized cities of Latin America, motor vehicles have priority over non-motorized vehicles and pedestrians. This is an important problem that affects people's health and quality of life; the lack of inclusion of pedestrians makes it difficult for them to move smoothly and safely, since the city has been planned for the transit of motor vehicles. Faced with the new trend towards sustainable and economical transport, the city is forced to develop infrastructure in order to incorporate pedestrians and users of non-motorized vehicles into the transport system. The present research aims to study the influence of non-motorized vehicles on an avenue and the optimization of the traffic-light cycle, based on simulation in Synchro software, to improve the flow of non-motorized vehicles. The evaluation is of the microscopic type; for this reason, field data were collected, such as vehicular, pedestrian, and non-motorized vehicle user demand. The values of speed and travel time are used to represent the current scenario, which contains the existing problem. These data allow the creation of a microsimulation model in Vissim software, which is later calibrated and validated so that its behavior is similar to reality. The results of this model are compared with the efficiency parameters of the proposed model; these parameters are the queue length, the travel speed, and mainly the travel times of the users at this intersection. The results reflect a 27% reduction in travel time, that is, an improvement of the proposed model over the current one for this major avenue. The queue length of motor vehicles is also reduced by 12.5%, a considerable improvement. All this represents an improvement in the level of service and in the quality of life of users.
Keywords: bikeway, microsimulation, pedestrians, queue length, traffic light cycle, travel time
Procedia PDF Downloads 177
27315 Hybrid Materials on the Basis of Magnetite and Magnetite-Gold Nanoparticles for Biomedical Application
Authors: Mariia V. Efremova, Iana O. Tcareva, Anastasia D. Blokhina, Ivan S. Grebennikov, Anastasia S. Garanina, Maxim A. Abakumov, Yury I. Golovin, Alexander G. Savchenko, Alexander G. Majouga, Natalya L. Klyachko
Abstract:
During the last decades, magnetite nanoparticles (NPs) have attracted deep interest from scientists due to their potential application in therapy and diagnostics. However, magnetite nanoparticles are toxic and non-stable under physiological conditions. To solve these problems, we decided to create two types of hybrid systems based on magnetite and gold, which is inert and biocompatible: gold as a shell material (first type) and gold as separate NPs interfacially bonded to magnetite NPs (second type). The synthesis of the first-type hybrid nanoparticles was carried out as follows: magnetite nanoparticles with an average diameter of 9±2 nm were obtained by co-precipitation of iron (II, III) chlorides, and then they were covered with a gold shell by iterative reduction of hydrogen tetrachloroaurate with hydroxylamine hydrochloride. According to the TEM, ICP-MS and EDX data, the final nanoparticles had an average diameter of 31±4 nm and contained iron even after hydrochloric acid treatment. However, iron signals (K-line, 7.1 keV) were not localized, so we cannot speak of one single magnetic core. The described nanoparticles, covered with mercapto-PEG acid, were non-toxic for human prostate cancer PC-3/LNCaP cell lines (more than 90% of cells survived compared to the control) and had high R2-relaxivity rates (>190 mM-1s-1) that exceed the transverse relaxation rate of commercial MRI contrast agents. These nanoparticles were also used for chymotrypsin enzyme immobilization. The effect of an alternating magnetic field on the catalytic properties of chymotrypsin immobilized on magnetite nanoparticles, notably a slowdown of the catalyzed reaction at the level of 35-40%, was found. The synthesis of the second-type hybrid nanoparticles also involved two steps. Firstly, spherical gold nanoparticles with an average diameter of 9±2 nm were synthesized by the reduction of hydrogen tetrachloroaurate with oleylamine; secondly, they were used as seeds during magnetite synthesis by thermal decomposition of iron pentacarbonyl in octadecene. As a result, so-called dumbbell-like structures were obtained, in which magnetite (cubes with a 25±6 nm diagonal) and gold nanoparticles were connected together pairwise. By the HRTEM method (for the first time for this type of structure), epitaxial growth of magnetite nanoparticles on the gold surface with co-orientation of (111) planes was discovered. These nanoparticles were transferred into water by means of the block copolymer Pluronic F127, then loaded with the anti-cancer drug doxorubicin and also a PSMA vector specific for the LNCaP cell line. The obtained nanoparticles were found to have moderate toxicity for human prostate cancer cells and entered the intracellular space after 45 minutes of incubation (according to fluorescence microscopy data). These materials are also promising from the MRI point of view (R2-relaxivity rates >70 mM-1s-1). Thereby, in this work, magnetite-gold hybrid nanoparticles, which have a strong potential for biomedical application, particularly in targeted drug delivery and magnetic resonance imaging, were synthesized and characterized. That paves the way to the development of special medicine types: theranostics. The authors acknowledge financial support from the Ministry of Education and Science of the Russian Federation (14.607.21.0132, RFMEFI60715X0132). This work was also supported by Grant of the Ministry of Education and Science of the Russian Federation К1-2014-022, Grant of the Russian Scientific Foundation 14-13-00731 and MSU development program 5.13.
Keywords: drug delivery, magnetite-gold, MRI contrast agents, nanoparticles, toxicity
Procedia PDF Downloads 384
27314 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is one of the most important statistical methods, with a considerable contribution to the reduction of the sum of the standard deviations of the independent variables' coefficients in a quantile regression model. This model estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products in real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is the reason we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we have applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and the results that we have presented here identify the best approximating model of our study.
Keywords: bootstrap, Edgeworth approximation, IID, quantile
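The modelling setup described, a median (quantile) regression of profit on realized sales with bootstrap-resampled coefficient standard errors, can be sketched with statsmodels. The synthetic data, the number of bootstrap replications and the single-regressor specification are assumptions made for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the business data: realized sales and the profit obtained.
rng = np.random.default_rng(3)
sales = rng.uniform(100, 1000, 200)
profit = 0.3 * sales + rng.standard_t(df=4, size=200) * 20   # heavy-tailed noise
data = pd.DataFrame({"sales": sales, "profit": profit})

# Quantile regression at the conditional median (q = 0.5).
fit = smf.quantreg("profit ~ sales", data).fit(q=0.5)
print("median-regression slope:", round(fit.params["sales"], 4))

# Bootstrap the slope to estimate its standard error.
boot_slopes = []
for _ in range(500):
    sample = data.sample(n=len(data), replace=True)
    boot_slopes.append(smf.quantreg("profit ~ sales", sample).fit(q=0.5).params["sales"])
print("bootstrap standard error of the slope:", round(float(np.std(boot_slopes)), 4))
```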
Procedia PDF Downloads 160
27313 The Duty of Application and Connection Providers Regarding the Supply of Internet Protocol by Court Order in Brazil to Determine Authorship of Acts Practiced on the Internet
Authors: João Pedro Albino, Ana Cláudia Pires Ferreira de Lima
Abstract:
Humanity has undergone a transformation from the physical to the virtual world, generating an enormous amount of data on the world wide web, known as big data. Many facts that occur in the physical world or in the digital world are proven through records made on the internet, such as digital photographs, posts on social media, contract acceptances by digital platforms, email, banking, and messaging applications, among others. These data recorded on the internet have been used as evidence in judicial proceedings. The identification of internet users is essential for the security of legal relationships. This research was carried out on scientific articles and materials from courses and lectures, with an analysis of Brazilian legislation and some judicial decisions on the request of static data from logs and Internet Protocols (IPs) from application and connection providers. In this article, we will address the determination of authorship of data processing on the internet by obtaining the IP address, and the appropriate judicial procedure for this purpose under Brazilian law.
Keywords: IP address, digital forensics, big data, data analytics, information and communication technology
Procedia PDF Downloads 125
27312 The Survey of Relationship between Health Literacy and Knowledge of Heart Failure with Rehospitalization in Patients with Heart Failure Admitted to Heart Failure Clinic
Authors: Jaleh Mohammad Aliha, Rezvan Razazi, Nasim Naderi
Abstract:
Introduction: Despite the progress in new effective drugs for the treatment of heart failure, the disease is still accompanied by frequent hospitalization, impaired quality of life, early mortality and a significant economic burden. Patients with chronic disease, and consequently patients with heart failure, need knowledge and optimal health literacy to improve their quality of life and minimize the rate of rehospitalization. So, considering the importance of knowledge and health literacy in these patients, as well as the contradictory literature, this study was conducted to investigate the relationship between health literacy and knowledge of heart failure with rehospitalization in patients with heart failure admitted to the heart failure clinic of the Rajai Heart Center in 1394. Methods: A cross-sectional design with convenience sampling was used in this study. After obtaining the necessary permissions from the ethics committee and the Shahid Rajai Heart Center, 238 patients who were older than 18 years, had an ejection fraction of 35% or less, were able to read and write, had no psychiatric, neurological or cognitive disorders, and signed the informed consent were recruited. Data collection was performed through a demographic data questionnaire, the short standard health literacy questionnaire 'Short-TOFHLA-16' and the Vanderwall (2005) knowledge of heart failure questionnaire. Reliability was assessed by the internal consistency method, and Cronbach's alpha for both questionnaires was more than 0.7. The data were then analysed with SPSS-20 using descriptive and analytical statistics such as the T-test, Chi-square and ANOVA. Results: The majority of patients were male (66%), married (80%) and aged between 50 and 70 years (42%). The majority of the studied men and women had good health literacy, and about half of them had adequate knowledge about heart failure. Fisher's exact test showed that there was a statistically significant correlation between health literacy and knowledge about heart failure. In other words, higher health literacy was associated with more knowledge about their condition. The findings also showed that there was no statistically significant correlation between health literacy or knowledge about heart failure and the frequency of CCU and emergency admissions. Conclusion: The study results showed that higher health literacy is associated with greater knowledge about heart failure and better patient perception of care recommendations and disease outcomes. Therefore, knowledge about heart failure and the factors related to the severity of the disease is important for problem identification, treatment and the reduction of rehospitalization.
Keywords: health literacy, heart failure, knowledge, rehospitalization
Procedia PDF Downloads 402
27311 Protective Effect of Levetiracetam on Aggravation of Memory Impairment in Temporal Lobe Epilepsy by Phenytoin
Authors: Asher John Mohan, Krishna K. L.
Abstract:
Objectives: (1) To assess the extent of memory impairment induced by Phenytoin (PHT) at normal and reduced doses in temporal lobe epileptic mice. (2) To evaluate the protective effect of Levetiracetam (LEV) on the aggravation of memory impairment by PHT in temporal lobe epileptic mice. Materials and Methods: Albino mice of either sex (n=36) were used for the study for a period of 64 days. Convulsions were induced by intraperitoneal administration of pilocarpine 280 mg/kg every 6th day. A radial arm maze (RAM) was employed to evaluate memory impairment every 7th day. The anticonvulsant and memory impairment activities were assessed at normal and reduced PHT doses, both alone and in combination with LEV. RAM error scores and convulsive scores were the parameters considered for this study. Brain acetylcholinesterase and glutamate were determined, along with histopathological studies of the frontal cortex. Results: Administration of PHT for 64 days showed aggravation of memory impairment in temporal lobe epileptic mice. Although the reduction in PHT dose was found to decrease the degree of memory impairment, it also decreased the anticonvulsant potency. The combination with LEV not only brought about the correction of impaired memory but also compensated for the loss of potency due to the reduction of the dose of the antiepileptic drug employed. These findings were confirmed with enzyme and neurotransmitter levels in addition to histopathological studies. Conclusion: This study thus builds a foundation for combining a nootropic anticonvulsant with an antiepileptic drug to curb the adverse effect of memory impairment associated with temporal lobe epilepsy. However, further extensive research is a must for the practical incorporation of this approach into disease therapy.
Keywords: anti-epileptic drug, Phenytoin, memory impairment, Pilocarpine
Procedia PDF Downloads 317
27310 Sourcing and Compiling a Maltese Traffic Dataset MalTra
Authors: Gabriele Borg, Alexei De Bono, Charlie Abela
Abstract:
There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is constantly experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data which is continuously being sourced and converts it into useful information related to the traffic problem on Maltese roads. The scope of this paper is to provide a methodology to create a custom dataset (MalTra - Malta Traffic) compiled from multiple participants at various locations across the island, to identify the most common routes taken and expose the main areas of activity. This use of big data is seen in various technologies referred to as Intelligent Transportation Systems (ITSs), and it has been concluded that there is significant potential in utilising such sources of data on a nationwide scale.
Keywords: Big Data, vehicular traffic, traffic management, mobile data patterns
Procedia PDF Downloads 110
27309 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study
Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar
Abstract:
The classification of satellite imagery and the assessment of its accuracy are very important. In order to determine the accuracy of the classified image, the assumed-true data are usually derived from ground truth data using the Global Positioning System. The data collected from the satellite imagery and the ground truth data are then compared to find out the accuracy of the data, and error matrices are prepared. Overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification of the satellite imagery. The satellite image was classified using the software into fourteen classes, namely water bodies, agricultural fields, forest land, urban settlement, barren land, unclassified area, etc. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software to find out the best method. This study is based on the data collected for the Bhopal city boundaries of Madhya Pradesh State, India.
Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices
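The accuracy-assessment step, building an error (confusion) matrix from reference and classified labels and deriving overall and per-class accuracies, can be sketched as below. The class names and sample labels are illustrative assumptions; the study's own matrix comes from the GPS-referenced ground truth:

```python
import numpy as np

classes = ["water", "agriculture", "forest", "urban", "barren"]   # subset for illustration
# Assumed reference (ground truth) and classified labels for a handful of checkpoints.
reference  = ["water", "forest", "forest", "urban", "agriculture", "barren", "urban", "forest"]
classified = ["water", "forest", "urban",  "urban", "agriculture", "barren", "urban", "forest"]

idx = {c: i for i, c in enumerate(classes)}
error_matrix = np.zeros((len(classes), len(classes)), dtype=int)
for ref, cls in zip(reference, classified):
    error_matrix[idx[ref], idx[cls]] += 1       # rows: reference, columns: classified

overall_accuracy = np.trace(error_matrix) / error_matrix.sum()
producers = np.diag(error_matrix) / np.maximum(error_matrix.sum(axis=1), 1)  # omission view
users     = np.diag(error_matrix) / np.maximum(error_matrix.sum(axis=0), 1)  # commission view

print("error matrix:\n", error_matrix)
print("overall accuracy:", round(overall_accuracy, 3))
for c in classes:
    print(f"{c}: producer's accuracy={producers[idx[c]]:.2f}, user's accuracy={users[idx[c]]:.2f}")
```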
Procedia PDF Downloads 509
27308 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence
Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno
Abstract:
Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to deal with missing values in the analysis of survey data, imputation being the most commonly used among them. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we have identified different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and applied rough set imputation only to the GMD portion of the missing data. We have used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we have used p-values from the Wald test. To evaluate the accuracy of the prediction, we have considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed compared to the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index
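The evaluation described, fitting the same logistic regression to imputed and non-imputed versions of a dataset and comparing the Wald statistic and the mean width of the 95% confidence interval for the predicted probability, can be sketched as below. The synthetic data, the single predictor and the simple mean imputation used as a stand-in for rough set imputation are assumptions for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * x - 0.5))))
x_missing = x.copy()
x_missing[rng.random(n) < 0.2] = np.nan              # introduce "genuine" missingness

def fit_and_report(label, predictor, outcome):
    frame = pd.DataFrame({"x": predictor, "y": outcome}).dropna()
    exog = sm.add_constant(frame["x"])
    model = sm.Logit(frame["y"], exog).fit(disp=0)
    wald = (model.params["x"] / model.bse["x"]) ** 2          # Wald statistic for the predictor
    # 95% CI for the predicted probability via the linear predictor and its standard error.
    X = np.asarray(exog)
    lin = X @ model.params.values
    se = np.sqrt(np.einsum("ij,jk,ik->i", X, model.cov_params().values, X))
    lower = 1 / (1 + np.exp(-(lin - 1.96 * se)))
    upper = 1 / (1 + np.exp(-(lin + 1.96 * se)))
    print(f"{label}: Wald={wald:.1f}, mean 95% CI width={np.mean(upper - lower):.3f}")

# Non-imputed: complete-case analysis. Imputed: missing values filled (mean imputation here,
# standing in for the rough set imputation used in the paper).
fit_and_report("non-imputed", x_missing, y)
fit_and_report("imputed", np.where(np.isnan(x_missing), np.nanmean(x_missing), x_missing), y)
```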
Procedia PDF Downloads 169
27307 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach
Authors: Utkarsh A. Mishra, Ankit Bansal
Abstract:
At high temperature, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integrodifferential equation of radiative transfer is a complex process, even more so when the effects of the participating medium and wavelength properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and the accuracy of the problem. Recently, solutions of complicated mathematical problems with statistical methods based on the randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique for solving radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation, and on the other hand, increases the computational cost. The participating media, generally gases such as CO₂, CO, and H₂O, present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences. Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling performance than the uniform random number generator and give rise to low-variance, stable quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computation costs of the PMC simulation. A one-dimensional plane-parallel slab problem with participating media was formulated. The histories of some randomly sampled photon bundles are recorded to train an artificial neural network (ANN) back-propagation model. The flux was calculated using the standard quasi-PMC and was considered to be the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and the PMC model with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence were observed for the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) as compared to the other cases. There is great scope for machine learning models to help further reduce the computation cost once trained successfully. Multiple ways of selecting the input data as well as various architectures will be tried so that the environment concerned can be fully represented to the ANN model. Better results can be achieved in this unexplored domain.
Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks
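The contrast between pseudo-random and low-discrepancy sampling can be illustrated with a small sketch that estimates a band-integrated spectral quantity with both a uniform pseudo-random generator and a Sobol sequence. The toy spectral profile and the problem constants are assumptions for illustration; the paper's actual estimators operate on photon bundle histories in the slab:

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(5)

def spectral_weight(x):
    # Toy spectral emission profile over a normalized wavelength band [0, 1]:
    # a few absorption/emission "bands" riding on a smooth background (assumed shape).
    return 1.0 + 0.8 * np.exp(-((x - 0.3) / 0.05) ** 2) + 0.5 * np.exp(-((x - 0.7) / 0.08) ** 2)

def estimate(samples):
    # Monte Carlo estimate of the band-integrated quantity with a uniform proposal density.
    return float(np.mean(spectral_weight(samples)))

n = 4096  # power of two, as preferred by Sobol sequences
pseudo = rng.random(n)                                    # pseudo-random photon wavelengths
sobol = qmc.Sobol(d=1, scramble=True, seed=5).random(n)   # quasi-random (low-discrepancy) wavelengths

print("pseudo-random estimate:", round(estimate(pseudo), 5))
print("Sobol (QMC) estimate:  ", round(estimate(sobol.ravel()), 5))

# Repeat the experiment to compare estimator spread: QMC typically shows lower variance.
pm = [estimate(rng.random(n)) for _ in range(50)]
qm = [estimate(qmc.Sobol(d=1, scramble=True, seed=s).random(n).ravel()) for s in range(50)]
print("std pseudo:", round(float(np.std(pm)), 6), " std Sobol:", round(float(np.std(qm)), 6))
```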
Procedia PDF Downloads 225