Search results for: waste processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6177

717 Eco-Friendly Softener Extracted from Ricinus communis (Castor) Seeds for Organic Cotton Fabric

Authors: Fisaha Asmelash

Abstract:

The processing of textiles to achieve a desired handle is a crucial aspect of finishing technology. Softeners can enhance the properties of textiles, such as softness, smoothness, elasticity, hydrophilicity, antistatic properties, and soil release, depending on the chemical nature of the agent used. Because human skin is sensitive to rough textiles, softeners are increasingly important. Although synthetic softeners are available, they are often expensive and can cause allergic reactions on human skin. This paper aims to extract a natural softener from Ricinus communis and produce an eco-friendly and user-friendly alternative owing to its 100% herbal and organic nature. Crushed Ricinus communis seeds were soaked in a mechanical oil extractor for one hour with a 100 g cotton fabric sample. The defatted cake or residue remaining after the extraction of oil from the seeds, also known as Ricinus communis meal, was obtained by filtering the raffinate, dried at 103 °C for four hours, and then stored under laboratory conditions for the softening process. The softener was applied directly to 100% cotton fabric using the padding process, and the fabric was tested for stiffness, crease recovery, and drapability. The effect of different concentrations of the finishing agent on fabric stiffness, crease recovery, and drapability was also analyzed. The results showed that the change in fabric softness depends on the concentration of the finish used: as the concentration of the finish was increased, bending length and drape coefficient decreased. Fabrics treated with a high concentration of softener showed the greatest decrease in drape coefficient and stiffness, and the highest decrease in drape coefficient was found to be comparable with commercial softeners such as silicone. The maximum increase in crease recovery was seen in fabrics treated with the Ricinus communis softener at a concentration of 30 g/L. From the results, the extracted softener proved to be effective in the treatment of 100% cotton fabric.
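
The drape results above are reported as drape coefficients. For readers unfamiliar with the metric, the sketch below shows how a Cusick-style drape coefficient is computed from projected areas; the formula is the conventional one, and the area values are made-up examples, not measurements from this study.

```python
# Illustrative Cusick-type drape coefficient calculation (hypothetical numbers).
def drape_coefficient(shadow_area, specimen_area, disk_area):
    """Drape coefficient (%) = (As - Ad) / (AD - Ad) * 100, where As is the
    projected (shadow) area of the draped specimen, AD the area of the flat
    specimen, and Ad the area of the supporting disk (all in cm^2)."""
    return (shadow_area - disk_area) / (specimen_area - disk_area) * 100.0

untreated = drape_coefficient(shadow_area=520.0, specimen_area=707.0, disk_area=177.0)
treated = drape_coefficient(shadow_area=430.0, specimen_area=707.0, disk_area=177.0)
# A lower coefficient indicates a softer, more drapeable fabric.
print(f"untreated: {untreated:.1f}%, treated: {treated:.1f}%")
```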

Keywords: ricinus communis, crease recovery, drapability, softeners, stiffness

Procedia PDF Downloads 91
716 Study of the Montmorillonite Effect on PET/Clay and PEN/Clay Nanocomposites

Authors: F. Zouai, F. Z. Benabid, S. Bouhelal, D. Benachour

Abstract:

Polymer/clay nanocomposites are a relatively important area of research. These reinforced plastics have attracted considerable attention in scientific and industrial fields because a very small amount of clay can significantly improve the properties of the polymer. The polymeric matrices used in this work are two saturated polyesters, i.e., poly(ethylene terephthalate) (PET) and poly(ethylene naphthalate) (PEN). The success of processing compatible blends, based on poly(ethylene terephthalate) (PET)/poly(ethylene naphthalate) (PEN)/clay nanocomposites, in one step by reactive melt extrusion is described. Untreated clay was first purified and functionalized 'in situ' with a compound based on an organic peroxide/sulfur mixture and (tetramethylthiuram disulfide) as the activator for sulfur. The PET and PEN materials were first separately mixed in the molten state with functionalized clay. The PET/4 wt% clay and PEN/7.5 wt% clay compositions showed total exfoliation. These compositions, denoted nPET and nPEN, respectively, were used to prepare new n(PET/PEN) nanoblends in the same mixing batch. The n(PET/PEN) nanoblends were compared to neat PET/PEN blends. The blends and nanocomposites were characterized using various techniques, and their microstructural and nanostructural properties were investigated. Fourier transform infrared spectroscopy (FTIR) results showed that the exfoliation of tetrahedral clay nanolayers is complete and the octahedral structure totally disappears. It was shown that total exfoliation, confirmed by wide angle X-ray scattering (WAXS) measurements, contributes to the enhancement of impact strength and tensile modulus. In addition, WAXS results indicated that all samples are amorphous. The differential scanning calorimetry (DSC) study indicated the occurrence of one glass transition temperature Tg, one crystallization temperature Tc and one melting temperature Tm for every composition. This was evidence that both PET/PEN and nPET/nPEN blends are compatible in the entire range of compositions. In addition, the nPET/nPEN blends showed lower Tc and higher Tm values than the corresponding neat PET/PEN blends. In conclusion, the results obtained indicate that n(PET/PEN) blends differ from the pure ones in nanostructure and physical behavior.
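
Exfoliation of montmorillonite is usually judged in WAXS/XRD data from the shift or disappearance of the clay basal (001) reflection, whose spacing follows from Bragg's law. The snippet below is a minimal illustration of that conversion; the 2-theta positions are hypothetical, not the values measured in this work.

```python
import math

def basal_spacing_nm(two_theta_deg, wavelength_nm=0.15406):
    """Bragg's law d = lambda / (2 sin(theta)); default wavelength is Cu K-alpha."""
    theta = math.radians(two_theta_deg / 2.0)
    return wavelength_nm / (2.0 * math.sin(theta))

# Hypothetical positions of the montmorillonite (001) reflection:
for label, two_theta in [("pristine clay", 7.0), ("intercalated clay", 4.5)]:
    print(f"{label}: d(001) = {basal_spacing_nm(two_theta):.2f} nm")
# In a fully exfoliated nanocomposite the (001) peak is absent, so no basal spacing can be assigned.
```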

Keywords: blends, exfoliation, DRX, DSC, montmorillonite, nanocomposites, PEN, PET, plastograph, reactive melt-mixing

Procedia PDF Downloads 298
715 Smart Help at the Workplace for Persons with Disabilities (SHW-PWD)

Authors: Ghassan Kbar, Shady Aly, Ibrahim Alsharawy, Akshay Bhatia, Nur Alhasan, Ronaldo Enriquez

Abstract:

Smart Help for persons with disabilities (PWD) is part of the SMARTDISABLE project, which aims to develop solutions that provide an adequate workplace environment for PWD. It supports PWD needs smartly through a smart help that allows them to access relevant information and communicate with others effectively and flexibly, and a smart editor that assists them in their daily work. It will assist PWD in knowledge processing and creation and enable them to be productive at the workplace. The technical work of the project involves the design of a technological scenario for Ambient Intelligence (AmI)-based assistive technologies at the workplace, consisting of an integrated universal smart solution that suits many different impairment conditions and is designed to empower physically disabled persons (PDP) with the capability to access and effectively utilize ICTs in order to execute knowledge-rich working tasks with minimum effort and a sufficient comfort level. The proposed technology solution for PWD will support voice recognition along with a normal keyboard and mouse to control the smart help and smart editor, with a dynamic auto-display interface that satisfies the requirements of different PWD groups. In addition, the smart help will provide intelligent intervention based on the behavior of PWD to guide them and warn them about possible misbehavior. PWD can communicate with others using Voice over IP controlled by voice recognition. Moreover, an Auto Emergency Help Response would be supported to assist PWD in case of emergency. The proposed technology solution is intended to make PWD very effective and flexible in the work environment, using voice to conduct their tasks. The proposed solution aims to provide favorable outcomes that assist PWD at the workplace, with the opportunity to participate in the PWD assistive technology innovation market, which is still small but rapidly growing, as well as upgrading their quality of life to become similar to that of other people at the workplace. Finally, the proposed smart help solution is applicable in all workplace settings, including offices, manufacturing, hospitals, etc.

Keywords: ambient intelligence, ICT, persons with disability PWD, smart application, SHW

Procedia PDF Downloads 423
714 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (of the order of tens of microns) on time scales spanning from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped onto the azimuth-range directions with notable resolution in both dimensions and with an outstanding repetition rate. The back-scattered energy, which is distributed in the 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit allows the observed scene to be sensed with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing for easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimation of each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
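
In repeat-pass interferometry, the displacement of each pixel along the line of sight follows directly from the interferometric phase difference between acquisitions. The sketch below shows that standard conversion; the wavelength and phase values are assumed purely for illustration and do not describe the sensor used in the paper.

```python
import numpy as np

def los_displacement(delta_phase_rad, wavelength_m):
    """Repeat-pass (two-way) interferometry: a 2*pi phase change corresponds to
    half a wavelength of line-of-sight motion, so d = wavelength * dphi / (4*pi)."""
    return wavelength_m * delta_phase_rad / (4.0 * np.pi)

wavelength = 0.0176          # m, an assumed Ku-band wavelength for illustration only
dphi = np.deg2rad(2.0)       # assumed interferometric phase difference between two acquisitions
print(f"LOS displacement = {los_displacement(dphi, wavelength) * 1e6:.1f} micrometres")
```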

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 195
713 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies

Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan

Abstract:

The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, specifically if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive, and multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges in developing countries for tourists trying to make the best decision in terms of time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search for places of interest based on their requested time of travel. To design the service, a three-tier architecture consisting of data, logical processing, and presentation tiers has been utilized. For implementing the service, open-source software programs, client- and server-side programming languages (such as OpenLayers2, AJAX, and PHP), GeoServer as a map server, and Web Feature Service (WFS) standards have been used. The result is two distinct browser-based services, one for submitting spatial, descriptive, and multimedia volunteer data and another for tourists and local officials. Local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place based on province, city, and location at a specific time of interest. Implementing the tourist-guide service with this methodology has the following effects: current tourists participate in a free data collection and sharing process for future tourists; data are shared and accessed in real time by all; blind selection of a travel destination is avoided; and, significantly, the cost of providing such services decreases.
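
A typical way for the tourist-facing client to retrieve volunteer-submitted features from a GeoServer-backed service is a WFS GetFeature request with a spatial and attribute filter. The sketch below issues such a request from Python; the endpoint URL, layer name, geometry column, and attribute names are placeholders assumed for illustration, not the actual service described here.

```python
import requests

WFS_URL = "http://example.org/geoserver/wfs"   # placeholder GeoServer endpoint

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "tourism:places",              # assumed layer name
    "outputFormat": "application/json",
    # CQL filter combining a bounding box with an assumed seasonal attribute:
    "cql_filter": "BBOX(the_geom, 51.0, 35.5, 51.6, 35.9) AND best_season = 'spring'",
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
for feature in response.json().get("features", []):
    print(feature["properties"])   # descriptive attributes submitted by volunteers
```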

Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping

Procedia PDF Downloads 98
712 The Gender Digital Divide in Education: The Case of Students from Rural Area from Republic of Moldova

Authors: Bărbuță Alina

Abstract:

The inter-causal relationship between social inequalities and the digital divide raises the issue of the relation between gender and information and communication technologies (ICT), a key element in achieving sustainable development. In preparing generations as future digital citizens and for active socio-economic participation, ICT plays a key role in respecting gender equality. Although several studies over the years have shown that gender plays an important role in digital exclusion, in recent years many studies focusing on economically developed or developing countries have identified an improvement in these aspects and a narrowing of the gap. By measuring students' level of digital competencies, this paper aims to identify and analyse the existing gender digital inequalities among students. Our analyses are based on a sample of 1526 middle school students residing in rural areas of the Republic of Moldova (54.2% girls, mean age 14.00, SD = 1.02). During the online survey, they filled in a questionnaire adapted from the Youth Digital Skills Indicator (yDSI). The instrument measures the level of five digital competence areas indicated in the European Digital Competence Framework (DigiCom 2.3). Our results, based on t-tests, indicate that there are no statistically significant gender differences in the levels of digital skills in three areas: information navigation and processing; communication and interaction; problem solving. However, significant differences were identified in the level of digital skills in the areas of "digital content creation" [t(1425) = 4.20, p = .000] and "safety" [t(1421) = 2.49, p = .000], with higher scores recorded by girls. Our results contradict the general stereotype regarding the low level of digital competence among girls: in our sample, girls' scores are on a par with boys' and even higher in knowledge related to digital content creation and online safety skills. Additional investigation of boys' competence in digital safety is necessary, as their lower scores on this dimension may suggest greater exposure to digital threats.
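
The reported comparisons are independent-samples t-tests of skill scores by gender. The minimal sketch below shows how such a comparison is typically run; the simulated scores and group sizes are illustrative and are not the survey data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated digital-skill scores (e.g., on a 1-5 scale) for illustration only.
girls = rng.normal(loc=3.4, scale=0.8, size=780)
boys = rng.normal(loc=3.2, scale=0.8, size=660)

t, p = stats.ttest_ind(girls, boys)            # independent-samples t-test
df = len(girls) + len(boys) - 2                # degrees of freedom
print(f"t({df}) = {t:.2f}, p = {p:.3f}")
```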

Keywords: digital divide, education, gender digital divide, digital literacy, remote learning

Procedia PDF Downloads 101
711 Algae for Wastewater Treatment and CO₂ Sequestration along with Recovery of Bio-Oil and Value Added Products

Authors: P. Kiran Kumar, S. Vijaya Krishna, Kavita Verma, V. Himabindu

Abstract:

Concern about global warming and energy security has led to increased biomass utilization as an alternative feedstock to fossil fuels. Biomass is a promising feedstock since it is abundant and cheap and can be transformed into fuels and chemical products. Microalgae biofuels are likely to have a much lower impact on the environment. Microalgae cultivation using sewage with industrial flue gases is a promising concept for integrated biodiesel production, CO₂ sequestration, and nutrient recovery. Autotrophic, mixotrophic, and heterotrophic are the three modes of cultivation for microalgae biomass. Several mechanical and chemical processes are available for the extraction of lipids/oily components from microalgae biomass. In organic solvent extraction methods, prior drying of biomass and recovery of the solvent are required, which are energy-intensive. The hydrothermal process overcomes these drawbacks of conventional solvent extraction methods. In the hydrothermal process, the biomass is converted into oily components by processing in a hot, pressurized water environment. In this process, in addition to the lipid fraction of microalgae, other value-added products such as proteins, carbohydrates, and nutrients can also be recovered. In the present study, Scenedesmus quadricauda was isolated and cultivated autotrophically, heterotrophically, and mixotrophically using sewage wastewater and industrial flue gas in batch and continuous mode. The harvested algal biomass of S. quadricauda was used for the recovery of lipids and bio-oil. The lipids were extracted from the algal biomass using sonication as a cell disruption method followed by solvent (hexane) extraction, and the lipid yield obtained was 8.3 wt%, with palmitic acid, oleic acid, and octadecanoic acid as the fatty acids. The hydrothermal process was also carried out for the extraction of bio-oil, and the yield obtained was 18 wt%. Bio-oil compounds such as nitrogenous compounds, organic acids and esters, phenolics, hydrocarbons, and alkanes were obtained by the hydrothermal processing of algal biomass. Nutrients such as NO₃⁻ (68%) and PO₄³⁻ (15%) were also recovered along with the bio-oil in the hydrothermal process.

Keywords: flue gas, hydrothermal process, microalgae, sewage wastewater, sonication

Procedia PDF Downloads 140
710 Thickness-Tunable Optical, Magnetic, and Dielectric Response of Lithium Ferrite Thin Film Synthesized by Pulsed Laser Deposition

Authors: Prajna Paramita Mohapatra, Pamu Dobbidi

Abstract:

Lithium ferrite (LiFe₅O₈) has potential applications as a component of microwave magnetic devices such as circulators and monolithic integrated circuits. For efficient device applications, spinel ferrites in the form of thin films are highly desirable, and it is necessary to improve their magnetic and dielectric behavior by optimizing the processing parameters during deposition. The lithium ferrite thin films were deposited on Pt/Si substrates using the pulsed laser deposition (PLD) technique. As controlling the film thickness is the easiest parameter with which to tailor the strain, we deposited thin films having different thicknesses (160 nm, 200 nm, 240 nm) at an oxygen partial pressure of 0.001 mbar. The formation of a single phase with spinel structure (space group P4₁32) is confirmed by the XRD pattern and Rietveld analysis. The optical bandgap decreases with the increase in thickness. FESEM confirmed the formation of uniform grains with well-separated grain boundaries. Further, the film growth and the roughness were analyzed by AFM. The root-mean-square (RMS) surface roughness decreased from 13.52 nm (160 nm) to 9.34 nm (240 nm). The room temperature magnetization was measured with a maximum field of 10 kOe. The saturation magnetization is enhanced monotonically with an increase in thickness. The magnetic resonance linewidth is obtained in the range of 450 – 780 Oe. The dielectric response was measured in the frequency range of 10⁴ – 10⁶ Hz and in the temperature range of 303 – 473 K. With an increase in frequency, the dielectric constant and the loss tangent of all the samples decreased continuously, which is typical behavior of a conventional dielectric material. The real part of the dielectric constant and the dielectric loss increased with an increase in thickness. The contributions of grains and grain boundaries were also analyzed by employing the equivalent circuit model. The highest dielectric constant was obtained for the film having a thickness of 240 nm at 10⁴ Hz. The obtained results demonstrate that the desired response can be obtained by tailoring the film thickness for microwave magnetic devices.

Keywords: PLD, optical response, thin films, magnetic response, dielectric response

Procedia PDF Downloads 98
709 Constructing a Semi-Supervised Model for Network Intrusion Detection

Authors: Tigabu Dagne Akal

Abstract:

While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or Intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases (KDD) process model, which starts from the selection of the datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. After acquisition, the data were pre-processed. The major pre-processing activities carried out for this study included filling in missing values, removing outliers, resolving inconsistencies, integrating data that contain both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation activities such as discretization. A total of 21,533 intrusion records were used for training the models. For validating the performance of the selected model, a separate set of 3,397 records was used for testing. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and the default parameter values showed the best classification accuracy. The model has a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset in classifying new instances as normal, DOS, U2R, R2L and probe classes. The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are put forward towards an applicable system in the area of the study.
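
The winning configuration, a J48 (C4.5) decision tree evaluated with 10-fold cross-validation, can be approximated outside Weka with scikit-learn's CART decision tree, which is an analogous but not identical algorithm. The sketch below uses synthetic five-class data in place of the labelled connection records; the sample size echoes the study, but the data themselves are fabricated for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the labelled records (normal, DOS, U2R, R2L, probe).
X, y = make_classification(n_samples=21533, n_features=20, n_informative=10,
                           n_classes=5, n_clusters_per_class=1, random_state=42)

# CART decision tree used here as an analogue of Weka's J48 (C4.5), default parameters.
clf = DecisionTreeClassifier(random_state=42)
scores = cross_val_score(clf, X, y, cv=10)        # 10-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```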

Keywords: intrusion detection, data mining, computer science

Procedia PDF Downloads 296
708 Alternative Approach to the Machine Vision System Operating for Solving Industrial Control Issue

Authors: M. S. Nikitenko, S. A. Kizilov, D. Y. Khudonogov

Abstract:

The paper considers an approach to a machine vision operating system combined with a grid of light markers. This approach is used to solve several scientific and technical problems, such as measuring the capability of an apron feeder delivering coal from a lining return port to a conveyor in the technology of mining high coal with release to a conveyor, and prototyping an autonomous vehicle obstacle detection system. Primary verification of a method for calculating bulk material volume using three-dimensional modeling, and its validation in laboratory conditions with calculation of relative errors, were carried out. A method for calculating the capability of an apron feeder based on a machine vision system, together with a simplified technology for three-dimensional modeling of the examined measuring area with machine vision, was offered. The proposed method allows the volume of rock mass moved by an apron feeder to be measured using machine vision. This approach solves the issue of controlling the volume of coal produced by a feeder while working off high coal by longwall complexes with release to a conveyor, with an accuracy suitable for practical application. The developed mathematical apparatus for measuring feeder productivity in kg/s uses only basic mathematical functions such as addition, subtraction, multiplication, and division. This simplifies software development and expands the variety of microcontrollers and microcomputers suitable for performing the task of calculating feeder capability. A feature of the obstacle detection task is that obstacles distort the laser grid, which simplifies their detection. The paper presents algorithms for video camera image processing and autonomous vehicle model control based on obstacle detection machine vision systems. A sample fragment of obstacle detection at the moment of distortion of the laser grid is demonstrated.
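
Because the stated goal is that feeder capability can be computed with nothing more than the four basic arithmetic operations, the calculation essentially reduces to summing cell volumes of a reconstructed height grid and scaling by bulk density and frame interval. The sketch below illustrates that arithmetic; every number in it (grid heights, cell size, density, timing) is an assumed placeholder, not data from the paper.

```python
# Hypothetical height grid (metres above the empty pan) reconstructed from the
# light-marker / machine-vision step; values are illustrative only.
heights = [
    [0.10, 0.12, 0.11],
    [0.15, 0.18, 0.16],
    [0.09, 0.11, 0.10],
]
cell_dx = 0.05        # metres per grid cell along the feeder (assumed)
cell_dy = 0.05        # metres per grid cell across the feeder (assumed)
bulk_density = 900.0  # kg/m^3, assumed loose coal density
frame_interval = 0.5  # seconds between processed frames (assumed)

# Volume = sum of (height * cell area); only +, -, * and / are needed.
volume_m3 = sum(h * cell_dx * cell_dy for row in heights for h in row)
capability_kg_s = volume_m3 * bulk_density / frame_interval
print(f"volume per frame: {volume_m3 * 1000:.2f} L, feeder capability: {capability_kg_s:.1f} kg/s")
```

This simple sketch assumes the whole measured volume is cleared between frames; in practice the per-frame difference of successive volumes would be used.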

Keywords: machine vision, machine vision operating system, light markers, measuring capability, obstacle detection system, autonomous transport

Procedia PDF Downloads 114
707 The Interplay between Consumer Knowledge, Cognitive Effort, Financial Healthiness and Trust in the Financial Marketplace

Authors: Torben Hansen

Abstract:

While trust has long been regarded as one of the most critical variables for developing and maintaining well-functioning financial customer-seller relationships, it can be suggested that trust not only relates to customer trust in individual companies (narrow-scope trust). Trust also relates to the broader business context in which consumers may carry out their financial behaviour (broad-scope trust). However, despite the well-recognized significance of trust in marketing research, only a few studies have investigated the role of broad-scope trust in consumer financial behaviour. Moreover, as one of its many serious outcomes, the global financial crisis has elevated the need for an improved understanding of the role of broad-scope trust in consumer financial services markets. Only a minority of US and European consumers are currently confident in financial companies, and 'financial stability' and 'trust' are now among the top reasons for choosing a bank. This research seeks to address this shortcoming in the marketing literature by investigating direct and moderating effects of broad-scope trust on consumer financial behaviour. Specifically, we take an ability-effort approach to consumer financial behaviour. The ability-effort approach holds the basic premise that the quality of consumer actions is influenced by ability factors, for example consumer knowledge and cognitive effort. Our study is based on two surveys. Survey 1 comprises 1,155 bank consumers, whereas survey 2 comprises 764 pension consumers. The results indicate that broad-scope trust negatively moderates the relationships between knowledge and financial healthiness and between cognitive effort and financial healthiness. In addition, it is demonstrated that broad-scope trust negatively influences cognitive effort. Specifically, the results suggest that broad-scope trust contributes to the financial well-being of consumers with limited financial knowledge and processing capabilities. Since financial companies are dependent on customers to pay their loans and bills, they have a greater interest in developing relations with consumers with healthy financial behaviour than with the opposite. Hence, financial managers should be engaged in monitoring and influencing broad-scope trust. To conclude, by taking into account the contextual effect of broad-scope trust, the present study adds to our understanding of the knowledge-effort-behaviour relationship in consumer financial markets.
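
Moderation effects of this kind are conventionally tested by adding an interaction term between the predictor and the moderator to a regression of the outcome. The sketch below shows such a moderated regression on simulated data; the variable names and the negative knowledge-by-trust interaction merely mirror the reported pattern, and none of it is the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1155  # same size as survey 1, but the values are simulated
df = pd.DataFrame({
    "knowledge": rng.normal(size=n),
    "bst": rng.normal(size=n),   # broad-scope trust (simulated)
})
# Simulated outcome with a negative knowledge x trust interaction for illustration.
df["healthiness"] = (0.4 * df["knowledge"] + 0.2 * df["bst"]
                     - 0.15 * df["knowledge"] * df["bst"]
                     + rng.normal(scale=1.0, size=n))

model = smf.ols("healthiness ~ knowledge * bst", data=df).fit()
print(model.summary().tables[1])   # the knowledge:bst coefficient captures the moderation
```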

Keywords: cognitive effort, customer-seller relationships, financial healthiness, knowledge, trust

Procedia PDF Downloads 441
706 Study of the Kinetics of Formation of Carboxylic Acids Using Ion Chromatography during Oxidation Induced by Rancimat of the Oleic Acid, Linoleic Acid, Linolenic Acid, and Biodiesel

Authors: Patrícia T. Souza, Marina Ansolin, Eduardo A. C. Batista, Antonio J. A. Meirelles, Matthieu Tubino

Abstract:

Lipid oxidation is a major cause of the deterioration of biodiesel quality, because the waste products generated damage engines. Among the main undesirable effects are the increase of viscosity and acidity, leading to the formation of insoluble gums and sediments which cause the blockage of fuel filters. Auto-oxidation is defined as the spontaneous reaction of atmospheric oxygen with lipids. Unsaturated fatty acids are usually the components affected by such reactions; they are present as free fatty acids, fatty esters and glycerides. To determine the oxidative stability of biodiesels through the induction period, IP, the Rancimat method is used, which allows continuous monitoring of the induced oxidation process of the samples. During the oxidation of lipids, volatile organic acids are produced as byproducts; in addition, other byproducts, including alcohols and carbonyl compounds, may be further oxidized to carboxylic acids. By the methodology developed in this work using ion chromatography, IC, analyzing the water contained in the conductimetric vessel, organic anions of carboxylic acids were quantified in samples subjected to oxidation induced by Rancimat. The optimized chromatographic conditions were: eluent water:acetone (80:20 v/v) with 0.5 mM sulfuric acid; flow rate 0.4 mL min⁻¹; injection volume 20 µL; eluent suppressor 20 mM LiCl; analytical curve from 1 to 400 ppm. The samples studied were methyl biodiesel from soybean oil and unsaturated fatty acid standards: oleic, linoleic and linolenic. The induced oxidation kinetics curves were constructed by analyzing the water contained in the conductimetric vessels, which were each removed from the Rancimat apparatus at prefixed intervals of time. About 3 g of sample were used under the conditions of 110 °C and an air flow rate of 10 L h⁻¹. The water of each conductimetric Rancimat measuring vessel, where the volatile compounds were collected, was filtered through a 0.45 µm filter and analyzed by IC. From the kinetic data of the formation of the organic anions of carboxylic acids, their formation rates were calculated. The observed order of the rates of formation of the anions was: formate >>> acetate > hexanoate > valerate for oleic acid; formate > hexanoate > acetate > valerate for linoleic acid; formate >>> valerate > acetate > propionate > butyrate for linolenic acid. It is possible to suppose that propionate and butyrate are obtained mainly from linolenic acid and that hexanoate originates from oleic and linoleic acid. For the methyl biodiesel, the order of formation of anions was: formate >>> acetate > valerate > hexanoate > propionate. According to the total rates of formation of these anions produced during the induced degradation, the fatty acids can be assigned the following order of reactivity: linolenic acid > linoleic acid >>> oleic acid.
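
In the simplest treatment, each anion's rate of formation is the slope of its concentration versus oxidation time curve, obtainable from a linear least-squares fit. The sketch below illustrates this with made-up formate concentrations; the numbers are not from the study.

```python
import numpy as np

# Hypothetical formate concentrations (ppm) measured by IC in the conductimetric
# vessel after fixed oxidation times (hours); illustrative values only.
time_h = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
formate_ppm = np.array([0.0, 45.0, 98.0, 150.0, 205.0])

slope, intercept = np.polyfit(time_h, formate_ppm, 1)   # linear least-squares fit
print(f"formation rate of formate: {slope:.1f} ppm per hour")
```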

Keywords: anions of carboxylic acids, biodiesel, ion chromatography, oxidation

Procedia PDF Downloads 475
705 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia

Authors: Nguyen-Thanh Son

Abstract:

Flood is among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs to reduce possible impacts on the country’s economy and people’s livelihood. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time-series vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The results of flood mapping were verified with the ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by close agreement between the flood-mapping area and ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by impacts of climate change. Eventually, although several sources potentially lowered the mapping accuracy of flood-prone areas, including image cloud contamination, mixed-pixel issues, and low-resolution bias between the mapping results and ground reference data, our methods yielded satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could be beneficial to policymakers in evaluating their management strategies for mitigating the negative effects of floods on agriculture and people’s livelihood in the country.
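
The accuracy assessment combines overall accuracy with Cohen's Kappa computed against the ground reference data. The sketch below shows the standard calculation from a confusion matrix; the counts are made up for illustration and are not the study's validation data.

```python
import numpy as np

# Hypothetical confusion matrix (rows = reference class, columns = mapped class):
#                  mapped flood   mapped non-flood
cm = np.array([[420,  55],      # reference flood
               [ 48, 390]])     # reference non-flood

n = cm.sum()
overall_accuracy = np.trace(cm) / n
# Chance agreement: sum over classes of (row total * column total) / n^2.
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
kappa = (overall_accuracy - expected) / (1 - expected)
print(f"overall accuracy = {overall_accuracy:.3f}, kappa = {kappa:.3f}")
```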

Keywords: MODIS, flood, mapping, Cambodia

Procedia PDF Downloads 126
704 Sexual Health And Male Fertility: Improving Sperm Health With Focus On Technology

Authors: Diana Peninger

Abstract:

Over 10% of couples in the U.S. have infertility problems, with roughly 40% traceable to the male partner. Yet, little attention has been given to improving men's contribution to the conception process. One solution that is showing promise in increasing conception rates for IVF and other assisted reproductive technology treatments is a first-of-its-kind semen collection device that has been engineered to mitigate the sperm damage caused by traditional collection methods. Patients are able to collect semen at home and deliver it to clinics within 48 hours for use in fertility analysis and treatment, with less stress and improved specimen viability. This abstract shares these findings along with expert insight and tips to help attendees understand the key role sperm collection plays in addressing and treating reproductive issues, while helping to improve patient outcomes and success. Our research aimed to determine whether male reproductive outcomes can be improved by improving sperm specimen health, with a focus on technology. We utilized a redesigned semen collection cup (patented as the Device for Improved Semen Collection, DISC, U.S. Patent 6864046, known commercially as ProteX) that met a series of physiological parameters. Previous research demonstrated significant improvement in semen parameters (forward motility, progression, viability, and longevity) and overall sperm biochemistry when the DISC is used for collection. Animal studies have also shown dramatic increases in pregnancy rates. Our current study compares samples collected in the DISC, the next-generation DISC (DISCng), and a standard specimen cup (SSC), collected dry, with a measured 1 mL of media, and with media in excess (5 mL). Both human and animal testing will be included. With sperm counts declining at alarming rates due to environmental, lifestyle, and other health factors, accurate evaluations of sperm health are critical to understanding reproductive health and the origins and treatments of infertility. An increase in the health of the sperm, as measured by extensive semen parameter analysis, and semen parameters that remained stable for 48 hours, expanding the processing time from 1 hour to 48 hours, were also demonstrated.

Keywords: reproductive, sperm, male, infertility

Procedia PDF Downloads 129
703 A Brazilian Study Applied to the Regulatory Environmental Issues of Nanomaterials

Authors: Luciana S. Almeida

Abstract:

Nanotechnology has revolutionized the world of science and technology, bringing great expectations due to its great potential for application in the most varied industrial sectors. However, the same characteristics that make nanoparticles interesting from the point of view of technological application may be undesirable when they are released into the environment. The small size of nanoparticles facilitates their diffusion and transport in the atmosphere, water, and soil and facilitates the entry and accumulation of nanoparticles in living cells. The main objective of this study is to evaluate the environmental regulatory process for nanomaterials in the Brazilian scenario. Three specific objectives were outlined. The first is to carry out a global scientometric study, in a research platform, with the purpose of identifying the main lines of study of nanomaterials in the environmental area. The second is to verify, by means of a bibliographic review, how environmental agencies in other countries have been working on this issue. And the third is to carry out an assessment of the Brazilian Nanotechnology Draft Law 6741/2013 with the state environmental agencies, with the aim of identifying the agencies' knowledge of the subject and the resources available in the country for the implementation of the Policy. A questionnaire will be used as the tool for this evaluation, to identify the operational elements and build indicators through the Environment of Evaluation Application, a computational application developed for building questionnaires. At the end, the need to propose changes to the Draft Law of the National Nanotechnology Policy will be assessed. Initial studies, in relation to the first specific objective, have already identified that Brazil stands out in the production of scientific publications in the area of nanotechnology, although only a minority are focused on environmental impact studies. Regarding the general panorama of other countries, some findings have also been raised. The United States has included the nanoforms of substances in an existing EPA (Environmental Protection Agency) program, the TSCA (Toxic Substances Control Act). The European Union issued a draft document amending Regulation 1907/2006 of the European Parliament and Council to cover the nanoforms of substances. Both programs are based on the study and identification of environmental risks associated with nanomaterials, taking into consideration the product life cycle. In relation to Brazil, regarding the third specific objective, it is notable that the country does not have any regulations applicable to nanostructures, although there is a Draft Law in progress. In this document, it is possible to identify some requirements related to the environment, such as environmental inspection and licensing; industrial waste management; notification of accidents; and application of sanctions. However, it is not known whether these requirements are sufficient for the prevention of environmental impacts and whether national environmental agencies will know how to apply them correctly. This study intends to serve as a basis for future actions regarding environmental management applied to the use of nanotechnology in Brazil.

Keywords: environment, management, nanotechnology, politics

Procedia PDF Downloads 122
702 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography

Authors: Nicole M. Martino

Abstract:

Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; however, this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck, the location where damage originates. In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a total understanding of the deck's deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were first to verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see if combining the results of both methods would provide higher confidence than if the condition assessment was completed using only one method. The results from each method were presented as plan-view color contour plots. The results from one of the decks assessed as part of this research, including these plan-view plots, are presented in this paper. Furthermore, in order to answer the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.
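
Both datasets end up as plan-view color contour plots of a deck attribute on a common grid. The snippet below sketches how such a plot is typically produced with standard plotting tools; the grid, deck dimensions, and the synthetic "deteriorated" patch are assumptions for illustration only, not the collected GPR or thermography data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic plan-view grid standing in for a post-processed deck attribute,
# e.g. rebar-reflection amplitude (GPR) or surface temperature contrast (IR).
x = np.linspace(0, 12, 120)   # assumed deck length, m
y = np.linspace(0, 4, 40)     # assumed deck width, m
X, Y = np.meshgrid(x, y)
Z = np.exp(-((X - 8.0) ** 2 + (Y - 1.5) ** 2))   # fabricated "damaged" patch

plt.contourf(X, Y, Z, levels=20, cmap="RdYlGn_r")
plt.colorbar(label="normalised attribute (synthetic)")
plt.xlabel("distance along deck (m)")
plt.ylabel("distance across deck (m)")
plt.title("Plan-view contour plot (illustrative)")
plt.show()
```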

Keywords: bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks

Procedia PDF Downloads 154
701 The Misuse of Free Cash and Earnings Management: An Analysis of the Extent to Which Board Tenure Mitigates Earnings Management

Authors: Michael McCann

Abstract:

Managerial theories propose that, in joint stock companies, executives may be tempted to waste excess free cash on unprofitable projects to keep control of resources. In order to conceal their projects' poor performance, they may seek to engage in earnings management. On the one hand, managers may manipulate earnings upwards in order to post 'good' performances and safeguard their position. On the other, since managers' pursuit of unrewarding investments is likely to lead to low long-term profitability, managers will use negative accruals to reduce the current year's earnings, smoothing earnings over time in order to conceal the negative effects. Agency models argue that boards of directors are delegated by shareholders to ensure that companies are governed properly. Part of that responsibility is ensuring the reliability of financial information. Analyses of the impact of board characteristics, particularly board independence, on the misuse of free cash flow and earnings management find conflicting evidence. However, existing characterizations of board independence do not account for such directors gaining firm-specific knowledge over time, which influences their monitoring ability. Further, there is little analysis of the influence of the relative experience of independent directors and executives on decisions surrounding the use of free cash. This paper contributes to this literature regarding the heterogeneous characteristics of boards by investigating the influence of independent director tenure on earnings management, as well as the relative tenures of independent directors and Chief Executives. A balanced panel dataset comprising 51 companies across 11 annual periods from 2005 to 2015 is used for the analysis. In each annual period, firms were classified as conducting earnings management if they had discretionary accruals in the bottom quartile (downwards) or top quartile (upwards) of the distributed values for the sample. Logistic regressions were conducted to determine the marginal impact of independent board tenure and a number of control variables on the probability of conducting earnings management. The findings indicate that both absolute and relative measures of board independence and experience do not have a significant impact on the likelihood of earnings management. It is the level of free cash flow which is the major influence on the probability of earnings management: higher free cash flow increases the probability of earnings management significantly. The research also investigates whether board monitoring of earnings management is contingent on the level of free cash flow. However, the results suggest that board monitoring is not amplified when free cash flow is higher. This suggests that the extent of earnings management in companies is determined by a range of company, industry and situation-specific factors.
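
The empirical design flags firm-years with discretionary accruals in the top or bottom sample quartile and then estimates logistic regressions of that flag on tenure and control variables. The sketch below reproduces that two-step logic on simulated panel data; the variable names and values are placeholders and bear no relation to the actual results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 51 * 11   # 51 firms x 11 annual periods, as in the sample (values simulated)
df = pd.DataFrame({
    "disc_accruals": rng.normal(size=n),
    "free_cash_flow": rng.normal(size=n),
    "indep_tenure": rng.uniform(1, 12, size=n),   # assumed tenure range in years
})

# Flag top and bottom quartiles of discretionary accruals as earnings management.
lo, hi = df["disc_accruals"].quantile([0.25, 0.75])
df["em"] = ((df["disc_accruals"] <= lo) | (df["disc_accruals"] >= hi)).astype(int)

# Logistic regression of the earnings-management flag on the explanatory variables.
logit = smf.logit("em ~ free_cash_flow + indep_tenure", data=df).fit(disp=False)
print(logit.params)
```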

Keywords: corporate governance, boards of directors, agency theory, earnings management

Procedia PDF Downloads 233
700 The Effect of Music on Consumer Behavior

Authors: Lara Ann Türeli, Özlem Bozkurt

Abstract:

There is a biochemical component to listening to music. The type of music listened to can lead to different levels of neurotransmitter and biochemical activity within the brain, resulting in brain stimulation and different moods. Therefore, music plays an important role in neuromarketing and consumer behavior. The quality of a commercial can be measured by the effect its music has on its audience. Thus, understanding how music can affect the brain can provide better marketing strategies for all businesses. The type of music used plays an important role in how a person responds to certain experiences. In the context of marketing and consumer behavior, music can determine whether a person will be intrigued to buy something. Depending on the type of music listened to by an individual, the music may trigger the release of pleasurable neurotransmitters such as dopamine. Dopamine is a neurotransmitter that plays an important role in reward pathways in the brain. When an individual experiences a pleasurable activity, increased levels of dopamine are produced, eventually leading to the formation of new reward pathways. Consequently, the increased dopamine activity within the brain triggered by music can result in new reward pathways along the dopamine pathways in the brain. Selecting pleasurable music for commercials can therefore result in long-term brain stimulation, increasing consumerism. The effect of music on consumerism should be considered not only in commercials but also in the atmosphere it creates within stores. The type of music played in a store can affect consumer behavior and intention. Specifically, the rhythm, pitch, and pace of music contribute to the mood of a song. The background music in a store can determine the consumer's emotional presence and consequently affect their intentions. In conclusion, understanding the physiological, psychological, and neurochemical basis of the effect of music on brain stimulation is essential to understanding consumer behavior. The role of dopamine in the formation of reward pathways as a result of music directly contributes to consumer behavior and the tendency of a commercial or store to leave a long-term effect on the consumer. Careful consideration of the pitch, pace, and rhythm of a song in the selection of music can help companies not only predict but also determine the behavior of a consumer.

Keywords: sensory processing, neuropsychology, dopamine, neuromarketing

Procedia PDF Downloads 80
699 Hematological and Biochemical Indices of Starter Broiler Chickens Fed African Black Plum Seed Nut (Vitex Doniana) Meal

Authors: Obadire F. O., Obadire S. O., Adeoti R. F., Pirgozliev V.

Abstract:

An experiment was conducted to determine the efficacy of utilizing African black plum seed nut (ABPNBD) meal, formulated to substitute wheat offal, on the hematological and biochemical indices of broiler chickens. A total of 150 one-day-old male Agrited birds were reared for the 28 days of the experiment. The birds were assigned to five dietary treatments, with ten birds per treatment replicated 3 times. Experimental diets were formulated by supplementing the milled African black plum nut at 0, 5, 10, 12.5, and 15% inclusion levels in the starter broilers' ration; these were designated T1 (control diet containing no ABPNBD) and treatments T2, T3, T4 and T5, containing ABPNBD at 5, 10, 12.5, and 15%, respectively, in a completely randomized design. The hematological and biochemical indices of the birds were determined. The result revealed that all hematological parameters measured were significant (P < 0.05) except for WBC. Increasing inclusion levels of ABPNBD decreased the PCV, HB, and RBC of the birds across the treatment groups. Birds fed the 12.5 and 15% ABPNBD diets recorded the lowest values of these parameters. The result of the serum biochemical indices showed a significant (P < 0.05) influence for all parameters measured except for alanine transaminase (ALT), AST, and creatinine. The total protein (TP), albumin, globulin, and glucose values were reduced across the treatment groups as ABPNBD inclusion increased. Birds fed above 10% ABPNBD recorded the lowest values of TP, albumin, globulin, and glucose when compared with birds on the control diet and other treatments. The uric acid ranged from 3.85 to 2.13 mmol/L, while creatinine ranged from 62.00 to 53.50 mmol/L. AST ranged from 8.50 U/L (5%) to 7.90 U/L (10%). ALT ranged from 7.50 U/L (12.5%) to 5.50 U/L (5 and 10%). In conclusion, dietary inclusion of African black plum up to 10% has no detrimental effect on the health of starter chickens, whereas inclusion above 10% revealed a negative effect on some of the blood parameters measured. Therefore, African black plum should be supplemented with suitable probiotics or subjected to different processing methods if it is to be used at a 15% inclusion level for optimal results.

Keywords: African black plum seed, starter broiler chickens, hematological and serum biochemical indices, (Vitex doniana)

Procedia PDF Downloads 52
698 Studies of Single Nucleotide Polymorphism of Proteosomal Gene Complex and Their Association with HBV Infection Risk in India

Authors: Jasbir Singh, Devender Kumar, Davender Redhu, Surender Kumar, Vandana Bhardwaj

Abstract:

Single nucleotide polymorphisms (SNPs) of the proteosomal gene complex are involved in the pathogenesis of hepatitis B virus (HBV) infection. Such proteosomal genes include the large multifunctional proteins (LMP) and the transporters associated with antigen presentation, which help in antigen presentation. Both are involved in the intracellular processing and presentation of viral antigens in association with Major Histocompatibility Complex (MHC) class I molecules. A total of one hundred hepatitis B virus-infected samples and one hundred control samples from northern India were studied. Genomic DNA was extracted from all studied samples, and the PCR-RFLP method was used for genotyping at different positions of the LMP genes. Genotypes at a given position were inferred from the pattern of bands, and genotype frequencies and haplotype frequencies were also calculated. A homozygous SNP {A>C} was observed at codon 145 of the LMP7 gene and has a protective role against HBV, as there was a statistically significant higher distribution of this SNP among controls than cases. A heterozygous SNP {A>C} observed at codon 145 of the LMP7 gene made individuals more susceptible to HBV infection, as there was a statistically significant higher distribution of this SNP among cases than controls. An SNP {T>C} was observed at codon 60 of the LMP2 gene, but statistically significant differences were not observed between controls and cases. For codon 145 of LMP7 and codon 60 of LMP2, four haplotypes were constructed. Haplotype I (LMP2 'C' and LMP7 'A') made individuals carrying it more susceptible to HBV infection, as there was a statistically significant higher distribution of this haplotype among cases than controls. Haplotype II (LMP2 'C' and LMP7 'C') made individuals carrying it more immune to HBV infection, as there was a statistically significant higher distribution of this haplotype among controls than cases. Thus, it can be concluded that the homozygous SNP {A>C} at codon 145 of LMP7 and Haplotype II (LMP2 'C' and LMP7 'C') have a protective role against HBV infection, whereas the heterozygous SNP {A>C} at codon 145 of LMP7 and Haplotype I (LMP2 'C' and LMP7 'A') make individuals more susceptible to HBV infection.
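
Case-control differences in genotype (or haplotype) distribution of this kind are conventionally assessed with a chi-square test on a contingency table of counts. The sketch below shows that test on hypothetical counts invented purely for illustration; they are not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical genotype counts at LMP7 codon 145 (AA / AC / CC); illustrative only.
#           AA   AC   CC
table = [[ 38,  47,  15],   # HBV-infected cases
         [ 30,  32,  38]]   # healthy controls

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value indicates the genotype distribution differs between cases and controls.
```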

Keywords: Hepatitis B Virus, single nucleotide polymorphism, low molecular weight proteins, transporters associated with antigen presentation

Procedia PDF Downloads 308
697 Language Choice and Language Maintenance of Northeastern Thai Staff in Suan Sunandha Rajabhat University

Authors: Napasri Suwanajote

Abstract:

The purposes of this research were to analyze and evaluate the success factors in the OTOP production process for developing a learning center on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The research was designed as a qualitative study gathering information from 30 OTOP producers in Bangkontee District, Samudsongkram Province. They were all interviewed on 3 main parts. Part 1 was about the production process, including 1) production, 2) product development, 3) community strength, 4) marketing possibility, and 5) product quality. Part 2 evaluated appropriate success factors, including 1) analysis of the success factors, 2) evaluation of the strategy based on the Sufficiency Economic Philosophy, and 3) the model of a learning center on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The results showed that the production did not affect the environment and had the potential to continue standard-quality production, using raw materials from within the country. On the aspect of product and community strength in the past year, it was found that there was no appropriate packaging showing product identity according to global market standards; the producers needed training on packaging, especially for food and drink products. On the aspect of product quality and product specification, it was found that the products were certified by the local OTOP standard. There should be a responsible organization to help uncertified producers pass the standard. However, there was a problem of food contamination, which was hazardous to consumers. The producers should cooperate with the government sector or educational institutes involved with food processing to reach the FDA standard. The results from the small group discussion showed that the community expected higher education and a better standard of living. Some problems reported by the community included informal debt and drugs in the community. Eight steps were identified in developing the model of a learning center on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality.

Keywords: production process, OTOP, sufficiency economic philosophy, language choice

Procedia PDF Downloads 237
696 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data

Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang

Abstract:

Background and Significance of the Study: Congenital heart defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of congenital heart defects. At present, no existing DNA methylation biomarkers are used for early detection of CHDs. The existing CHD diagnostic techniques are time-consuming and costly and can only be used to diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with congenital heart defects and 24 healthy infants without congenital heart defects. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both training and test sample sets. The functionalities of selected genes with high sensitivity and specificity were then assessed in molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%. GO analyses for the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic process, protein-containing complex localization, and the Notch signalling pathway. The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
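
The described workflow (pre-selected probes, a random forest classifier, and ROC-based sensitivity/specificity) can be sketched with scikit-learn as below. The synthetic beta-values only mimic the shape of the EPIC data (48 infants by 600 probes); they are random placeholders, so the printed AUC has no scientific meaning.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, size=(48, 600))   # synthetic methylation beta-values
y = np.array([1] * 24 + [0] * 24)           # 24 CHD cases, 24 healthy controls

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=3)
rf = RandomForestClassifier(n_estimators=500, random_state=3).fit(X_tr, y_tr)

probs = rf.predict_proba(X_te)[:, 1]
fpr, tpr, thresholds = roc_curve(y_te, probs)           # points on the ROC curve
print(f"test AUC = {roc_auc_score(y_te, probs):.2f}")
# Feature importances rank candidate probes/genes for follow-up.
top = np.argsort(rf.feature_importances_)[::-1][:3]
print("top-ranked probe indices:", top)
```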

Keywords: biomarker, congenital heart defects, DNA methylation, random forest

Procedia PDF Downloads 158
695 Rendering Cognition Based Learning in Coherence with Development within the Context of PostgreSQL

Authors: Manuela Nayantara Jeyaraj, Senuri Sucharitharathna, Chathurika Senarath, Yasanthy Kanagaraj, Indraka Udayakumara

Abstract:

PostgreSQL is an object-relational database management system (ORDBMS) that has been in existence for a while. Despite the superior features that it wraps and packages to manage databases and data, the database community has not fully realized the importance and advantages of PostgreSQL. Hence, this research focuses on providing a better development environment for PostgreSQL in order to encourage its utilization and elucidate its importance. PostgreSQL is also known to be the world's most elementary SQL-compliant open-source ORDBMS. However, users have not yet turned to PostgreSQL, because it remains hidden under the layers and because of the complexity of its persistently textual environment for an introductory user. Simply stated, there is a dire need for an easy way of making users comprehend the procedures and standards by which databases are created, tables and the relationships among them are defined, and queries and their conditional flow are manipulated in PostgreSQL, to help the community turn to PostgreSQL at an augmented rate. Hence, this research initially identifies the dominant features provided by PostgreSQL over its competitors. Following the identified merits, an analysis of why the database community hesitates to migrate to PostgreSQL's environment will be carried out. These findings will be modulated and tailored based on the scope and the constraints discovered. The result of the research is a proposed system that will serve as a designing platform as well as a learning tool, providing an interactive method of learning via a visual editor mode and incorporating a textual editor for well-versed users. The study is based on conjuring viable solutions that analyze a user's cognitive perception in comprehending human-computer interfaces and the behavioural processing of design elements. By providing a visually draggable and manipulable environment to work with PostgreSQL databases and table queries, it is expected to highlight the elementary features displayed by PostgreSQL over any other existing system, in order to grasp and disseminate the importance and simplicity offered by it to a hesitant user.
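
Whatever the visual editor renders as draggable tables and relationship lines ultimately corresponds to ordinary PostgreSQL DDL and queries. The sketch below shows the kind of statements such a tool would generate, issued from Python via psycopg2; the connection parameters and table names are placeholders assumed for illustration.

```python
import psycopg2

# Placeholder connection parameters; point these at a real PostgreSQL instance.
conn = psycopg2.connect(dbname="demo", user="postgres", password="postgres", host="localhost")
cur = conn.cursor()

# Two related tables: the foreign key is the "relationship line" a visual editor would draw.
cur.execute("""
    CREATE TABLE IF NOT EXISTS author (
        id   SERIAL PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE IF NOT EXISTS book (
        id        SERIAL PRIMARY KEY,
        title     TEXT NOT NULL,
        author_id INTEGER REFERENCES author(id)
    );
""")

# A join query whose flow a visual or textual editor could let the user build interactively.
cur.execute("SELECT a.name, b.title FROM author a JOIN book b ON b.author_id = a.id;")
print(cur.fetchall())

conn.commit()
cur.close()
conn.close()
```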

Keywords: cognition, database, PostgreSQL, text-editor, visual-editor

Procedia PDF Downloads 283
694 Spatial Analysis of Survival Pattern and Treatment Outcomes of Multi-Drug Resistant Tuberculosis (MDR-TB) Patients in Lagos, Nigeria

Authors: Akinsola Oluwatosin, Udofia Samuel, Odofin Mayowa

Abstract:

The study assesses a Geographic Information System (GIS)-based spatial analysis of the survival pattern and treatment outcomes of Multi-Drug Resistant Tuberculosis (MDR-TB) cases in Lagos, Nigeria, with the objective of informing priority areas for public health planning and resource allocation. MDR-TB develops due to problems such as irregular drug supply, poor drug quality, inappropriate prescription, and poor adherence to treatment. The shapefiles for this study were already georeferenced to the Minna datum. Patient information was collected from various hospitals in MS Excel and later converted to .csv files for processing in ArcMap. To superimpose the patient information on the spatial data, the addresses were geocoded to generate the longitude and latitude of each patient. SQL queries on the database were used to extract the various treatment patterns. Spatial autocorrelation analysis was used to show the pattern of disease spread, and the result was displayed graphically, showing dispersed, random, and clustered distributions of patients in the study area. Hot spot and cold spot analyses were performed to identify high-density areas, and buffer analysis was used to examine the distance between patients and the closest health facility. The results show that 22% of the points were successfully matched, while 15% were tied; a greater percentage remained unmatched, which reflects the fact that most streets within the State are unnamed and that many patients are likely to supply wrong addresses. MDR-TB patients of all age groups are concentrated within the Lagos-Mainland, Shomolu, Mushin, Surulere, Oshodi-Isolo, and Ifelodun LGAs. Patients in the 30-47 year age group were the most numerous, at about 184. The treatment outcomes revealed that a high number of patients (300) were not on ART treatment, while only 45 patients were. The Z-score of the distribution exceeds 2.58, which means that the distribution is highly clustered at a significance level of 0.01.
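
As a rough illustration of the spatial autocorrelation step, the sketch below computes the global Moran's I statistic and its Z-score under the normality assumption for a toy set of patient counts and a hypothetical adjacency matrix. The study itself used ArcMap's tools on the geocoded patient database, so the data and weights here are purely illustrative.

```python
# Minimal sketch (illustrative data, not the study's): global Moran's I for
# patient counts per areal unit, with a Z-score under the normality assumption.
import numpy as np

def morans_i(x, w):
    """x: values per areal unit; w: spatial weights matrix (n x n, zero diagonal)."""
    n = len(x)
    z = x - x.mean()
    W = w.sum()
    I = (n / W) * (z @ w @ z) / (z @ z)

    # Expected value and variance of I under the normality assumption.
    E_I = -1.0 / (n - 1)
    s1 = 0.5 * ((w + w.T) ** 2).sum()
    s2 = ((w.sum(axis=1) + w.sum(axis=0)) ** 2).sum()
    var_I = (n**2 * s1 - n * s2 + 3 * W**2) / ((n**2 - 1) * W**2) - E_I**2
    return I, (I - E_I) / np.sqrt(var_I)

# Toy example: patient counts in 6 areas and a binary adjacency (contiguity) matrix.
counts = np.array([60.0, 55.0, 48.0, 5.0, 3.0, 2.0])
adj = np.array([[0, 1, 1, 0, 0, 0],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 1, 0, 0],
                [0, 0, 1, 0, 1, 0],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 0, 1, 0]], dtype=float)
I, z = morans_i(counts, adj)
print(f"Moran's I = {I:.3f}, Z = {z:.2f}  (Z > 2.58 => clustered at p < 0.01)")
```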

Keywords: tuberculosis, patients, treatment, GIS, MDR-TB

Procedia PDF Downloads 152
693 Tale of Massive Distressed Migration from Rural to Urban Areas: A Study of Mumbai City

Authors: Vidya Yadav

Abstract:

Migration is the demographic process that links rural to urban areas, generating or spurring the growth of cities. Evidence shows the city's role in production processes; the city is viewed as a centre of power and a centre of change. It has been observed that not only professionals want to settle in urban areas; rural labourers also come to cities for employment. These are people compelled to migrate to metropolises because of the lack of employment opportunities in their places of residence. However, the cities also fail to provide adequate employment because of limited job creation and capital-intensive industrialization, so these masses of incoming migrants are forced to take up whatever employment is available to them, particularly in urban informal activities. Ultimately, with such informal jobs they are compelled to stay in slum areas, another form of deprived housing. The paper seeks to examine the evidence of poverty-induced migration from rural to urban areas (particularly to the urban agglomeration). The present paper utilizes the rich census migration data (D-Series) for 1991-2001. The results show that Mumbai remains the most attractive destination for migrants, mainly from the major states of Uttar Pradesh, Bihar, West Bengal, Jharkhand, Odisha, and Rajasthan. Male migration is dominated by the search for employment, while female migration is mostly due to marriage. The occupational absorption of migrants who moved for employment is cross-classified with educational status. The results show that illiterate males are primarily engaged in low-grade production and processing work, while illiterate females are engaged in the service sector; these are in fact very low-grade services in India's urban informal sector, such as maid servants, domestic help, hawkers, vendors, or vegetable sellers. Among those with higher educational levels, a small percentage of males and females were absorbed into professional or clerical work, but this percentage increased over the period 1991-2001.

Keywords: informal, job, migration, urban

Procedia PDF Downloads 283
692 Cultural Heritage, Urban Planning and the Smart City in Indian Context

Authors: Paritosh Goel

Abstract:

In recent years, the conservation of historic buildings and historic centres has become fully encompassed in the planning and management of built-up areas in the face of climate change. The restoration community's approach, in the Indian context, to integrated urban regeneration and its strategic potential for smarter, more sustainable and socially inclusive urban development introduces the theme of sustainability to urban transformations in general (historical centres and otherwise). From this viewpoint, it envisages as a primary objective a genuinely "green, ecological or environmental" requalification of the city through interventions within the main categories of sustainability: mobility, energy efficiency, use of renewable energy sources, urban metabolism (waste, water, territory, etc.) and the natural environment. This also introduces the concept of the "resilient city", one which can adapt through progressive transformations to unpredictable situations of change, a behaviour that the historical city has always been able to express. Urban planning, on the other hand, has increasingly focused on analyses oriented towards the taxonomic description of social, economic and perceptual parameters connected with human behaviour, mobility and the characterization of resource consumption, in terms of quantity even before quality, to inform the city design process; for ancient fabrics, this mainly affects public space, including its social dimension. An exact definition of the term "smart city" is still essentially elusive, since we can attribute three dimensions to the term: a) that of a virtual city, evolved on the basis of digital and web networks; b) that of a physical construction determined by urban planning based on infrastructural innovation, which in the case of historic centres implies regeneration that stimulates and sometimes changes the existing fabric; c) that of a political and socio-economic project guided by a dynamic process that provides for new behaviours and requirements of the city's communities and that orients the future planning of cities, also through participation in their management. This paper is preliminary research into the connections between these three dimensions applied to the specific case of the fabric of ancient cities, with the aim of developing a scientific theory and methodology to apply to the regeneration of Indian historic centres. If contextualized with the heritage of the city, the smart city scheme can be an initiative that provides a transdisciplinary approach between various research networks (natural sciences, socio-economic sciences and humanities, technological disciplines, digital infrastructures), united in order to improve the design, livability and understanding of the urban environment while maintaining high historical and cultural performance levels.

Keywords: historical cities regeneration, sustainable restoration, urban planning, smart cities, cultural heritage development strategies

Procedia PDF Downloads 281
691 Solar Liquid Desiccant Regenerator for Two Stage KCOOH Based Fresh Air Dehumidifier

Authors: M. V. Rane, Tareke Tekia

Abstract:

Liquid desiccant based fresh air dehumidifiers can be gainfully deployed for air conditioning, agro-produce drying and many industrial processes. Regeneration of the liquid desiccant can be done using direct firing, high-temperature waste heat or solar energy. Solar energy is clean and available in abundance; however, it is costly to collect. A two-stage liquid desiccant fresh air dehumidification system can offer a Coefficient of Performance (COP) in the range of 1.6 to 2 for comfort air conditioning applications; a high COP helps reduce the size and cost of the collectors required. Performance tests on the high-temperature regenerator of a two-stage liquid desiccant fresh air dehumidifier coupled with a seasonally tracked, flat-plate-like solar collector will be presented in this paper. The two-stage fresh air dehumidifier has four major components: a High Temperature Regenerator (HTR), a Low Temperature Regenerator (LTR), high- and low-temperature solution heat exchangers, and a Fresh Air Dehumidifier (FAD). This open system can operate at near-atmospheric pressure in all components; such systems can be simple, maintenance-free and scalable. Environmentally benign, non-corrosive, moderately priced potassium formate, KCOOH, is used as the liquid desiccant. The KCOOH concentration in the system is expected to vary between 65 and 75%. Dilute liquid desiccant at 65% concentration exiting the fresh air dehumidifier is pumped and preheated in the solution heat exchangers before entering the high-temperature solar regenerator. In the solar collector, the solution is regenerated to an intermediate concentration of 70%. Steam and saturated solution exiting the solar collector array are then separated. The steam, at near-atmospheric pressure, is then used to regenerate the intermediate-concentration solution up to 75% in the low-temperature regenerator, where the vaporized moisture is released into the atmosphere. The condensed steam can be used as potable water after adding a pinch of salt and some nutrients. The warm concentrated liquid desiccant is routed to the solution heat exchanger to recycle its heat by preheating the weak liquid desiccant solution. An evacuated-glass-tube based, seasonally tracked solar collector is used for regeneration of the liquid desiccant at high temperature; the regeneration temperature for KCOOH is 133°C at 70% concentration. The medium-temperature collector was designed for the temperature range of 100 to 150°C. A double-wall polycarbonate top cover helps reduce top losses. Absorber-integrated heat storage helps stabilize the temperature of the liquid desiccant exiting the collectors during intermittent cloudy conditions and extends the operation of the system by a couple of hours beyond sunshine hours. The solar collector is light in weight: 12 kg/m² without absorber-integrated heat storage material and 27 kg/m² with the heat storage material. The cost of the collector is estimated to be 10,000 INR/m². Theoretical modeling of the collector has shown that its optical efficiency is 62%. Performance tests on the regeneration of KCOOH will be reported.
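
The concentration figures quoted above imply a simple solute mass balance for each regeneration stage. The sketch below is an illustrative back-of-the-envelope calculation, not the authors' model: it estimates the water removed per kilogram of 65% solution as it is concentrated to 70% in the solar high-temperature regenerator and then to 75% in the low-temperature regenerator.

```python
# Illustrative mass balance (not the authors' model): water removed per kg of
# dilute KCOOH solution in the two regeneration stages described above.
def water_removed(m_in, x_in, x_out):
    """Water evaporated when a desiccant stream of mass m_in and salt mass
    fraction x_in is concentrated to mass fraction x_out (salt is conserved)."""
    m_out = m_in * x_in / x_out
    return m_in - m_out, m_out

m_dilute = 1.0                                        # kg of 65% solution from the dehumidifier
dw_htr, m_mid = water_removed(m_dilute, 0.65, 0.70)   # solar high-temperature regenerator
dw_ltr, m_strong = water_removed(m_mid, 0.70, 0.75)   # steam-driven low-temperature regenerator

print(f"HTR: {dw_htr*1000:.0f} g of steam raised per kg of dilute solution")
print(f"LTR: {dw_ltr*1000:.0f} g of additional moisture released to atmosphere")
print(f"Strong solution returned to dehumidifier: {m_strong:.3f} kg at 75%")
```

Under these assumptions, roughly 71 g of steam is raised in the HTR and a further 62 g of moisture is released in the LTR per kilogram of dilute solution.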

Keywords: solar, liquid desiccant, dehumidification, air conditioning, regeneration

Procedia PDF Downloads 348
690 Exploration of Hydrocarbon Unconventional Accumulations in the Argillaceous Formation of the Autochthonous Miocene Succession in the Carpathian Foredeep

Authors: Wojciech Górecki, Anna Sowiżdżał, Grzegorz Machowski, Tomasz Maćkowski, Bartosz Papiernik, Michał Stefaniuk

Abstract:

The article presents results of a project that aims at evaluating the possibilities for effective development and exploitation of natural gas from the argillaceous series of the Autochthonous Miocene in the Carpathian Foredeep. To achieve this objective, the research team developed a unique, world-trend-based methodology of processing and interpretation, adjusted to the data, local variations and petroleum characteristics of the area. A number of tasks were accomplished in order to determine the zones in which the maximum volumes of hydrocarbons might have been generated and preserved as shale gas reservoirs, as well as to identify the most favourable well sites where the largest gas accumulations are anticipated. The evaluation of the petrophysical properties and hydrocarbon saturation of the Miocene complex is based on laboratory measurements as well as the interpretation of well logs and archival data. The studies apply mercury porosimetry (MICP), micro-CT and nuclear magnetic resonance imaging (using the Rock Core Analyzer). For a prospective location (e.g., the central part of the Carpathian Foredeep, the Brzesko-Wojnicz area), reprocessing and reinterpretation of detailed seismic survey data, with the use of integrated geophysical investigations, has been carried out. Quantitative, structural and parametric models for selected areas of the Carpathian Foredeep are constructed on the basis of integrated, detailed 3D computer models; the modeling is carried out with Schlumberger's Petrel software. Finally, prospective zones are spatially contoured in the form of a regional 3D grid, which will be the framework for generation modelling and comprehensive parametric mapping, allowing spatial identification of the most prospective zones of unconventional gas accumulation in the Carpathian Foredeep. Preliminary results of the research indicate a potentially prospective area for the occurrence of unconventional gas accumulations in the Polish part of the Carpathian Foredeep.

Keywords: autochthonous Miocene, Carpathian foredeep, Poland, shale gas

Procedia PDF Downloads 228
689 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually from geological drill logs, well structures, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system; it reflects the system response, combining the hydrogeological structure with groundwater injection and extraction. This study applies analytical tools to the observation database to develop a methodology for identifying confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to the groundwater-layer identification of two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to the time-series groundwater-level observations and analyzes the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviours. The decision tree uses the information obtained from the above analytical tools and produces the best estimate of the hydrogeological structure. The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is a useful tool for identifying hydrogeological structures.
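
The frequency-analysis and cross-correlation steps lend themselves to a compact sketch. The code below uses synthetic hourly data (not the observation-network records) to show how the amplitude at one cycle per day is read off the FFT of a groundwater-level series and how the rainfall-to-peak replenishment lag is taken as the lag of maximum cross-correlation; the storm timing, response lag and amplitudes are assumed values for illustration only.

```python
# Minimal sketch (synthetic data): the two signal-processing steps described above.
import numpy as np

rng = np.random.default_rng(1)
hours = np.arange(24 * 365)                       # one year of hourly records

# Synthetic rainfall: a single storm; synthetic groundwater level: a daily
# pumping cycle, a slow seasonal trend, noise, and a recharge response ~72 h later.
rain = np.zeros(hours.size)
rain[1000] = 30.0                                 # storm depth in mm
level = (0.05 * np.sin(2 * np.pi * hours / 24)
         + 0.5 * np.sin(2 * np.pi * hours / (24 * 365))
         + 0.01 * rng.standard_normal(hours.size)
         + np.roll(rain, 72) * 0.01)

# 1) Frequency analysis: amplitude at 1 cycle/day, a signature of artificial
#    extraction used to distinguish pumped layers.
spec = np.fft.rfft(level - level.mean())
freqs = np.fft.rfftfreq(level.size, d=1.0)        # cycles per hour
daily = np.argmin(np.abs(freqs - 1 / 24))
print(f"daily-frequency amplitude ~ {2 * np.abs(spec[daily]) / level.size:.3f} m")

# 2) Rainfall / groundwater-level cross-correlation: the lag of the maximum
#    correlation approximates the replenishment time from infiltration to peak level.
lags = np.arange(1, 24 * 10)
xcorr = [np.corrcoef(rain[:-lag], level[lag:])[0, 1] for lag in lags]
print(f"replenishment lag ~ {lags[int(np.argmax(xcorr))]} hours")
```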

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 157
688 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method

Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter

Abstract:

This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Previous research has suggested that this learning mechanism is robust throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study comprised three groups of participants: 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with a different spatial configuration and orientation (horizontal, vertical, or oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 minutes. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the more familiar pair in each of 48 trials, each consisting of a base pair and a non-base pair. Behavioural analysis using t-tests revealed notable findings. The mean score of the first group was significantly above chance, indicating the presence of visual statistical learning. The second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase across groups. The main difference was that older participants fixated more often on empty cells of the grid than younger participants, likely reflecting a decline in the ability to ignore irrelevant information, which results in a decrease in statistical learning performance.
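
The above-chance comparisons are one-sample t-tests of two-alternative forced-choice accuracy against the 0.5 chance level. The sketch below uses made-up scores (not the study's data, and with far fewer participants per group) only to show the form of that test with SciPy.

```python
# Minimal sketch (made-up scores): testing 2AFC accuracy against the 0.5 chance
# level for each age group, as in the behavioural analysis described above.
import numpy as np
from scipy import stats

groups = {
    "young adults":     np.array([0.62, 0.71, 0.58, 0.66, 0.60, 0.69]),
    "young-old adults": np.array([0.58, 0.63, 0.55, 0.61, 0.57, 0.64]),
    "old-old adults":   np.array([0.49, 0.54, 0.47, 0.52, 0.50, 0.51]),
}

for name, scores in groups.items():
    # One-sample t-test: is mean accuracy greater than the 0.5 chance level?
    t, p = stats.ttest_1samp(scores, popmean=0.5, alternative="greater")
    print(f"{name}: mean = {scores.mean():.2f}, t = {t:.2f}, p = {p:.3f}")
```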

Keywords: aging, eye tracking, implicit learning, visual statistical learning

Procedia PDF Downloads 77