Search results for: automatic mapping

292 Altering Surface Properties of Magnetic Nanoparticles with Single-Step Surface Modification with Various Surface Active Agents

Authors: Krupali Mehta, Sandip Bhatt, Umesh Trivedi, Bhavesh Bharatiya, Mukesh Ranjan, Atindra D. Shukla

Abstract:

Owing to the dominating surface forces and large-scale surface interactions, nano-scale particles face difficulties in staying suspended in various media. Magnetic nanoparticles of iron oxide offer a great deal of promise due to their ease of preparation, reasonable magnetic properties, low cost and environmental compatibility. We intend to modify the surface of magnetic Fe₂O₃ nanoparticles with selected surface modifying agents using simple and effective single-step chemical reactions in order to enhance the dispersibility of magnetic nanoparticles in non-polar media. Magnetic particles were prepared by hydrolysis of Fe²⁺/Fe³⁺ chlorides and their subsequent oxidation in aqueous medium. The dried particles were then treated separately with octadecyl quaternary ammonium silane (Terrasil™), stearic acid and the gallic acid ester of stearyl alcohol in ethanol to yield S-2 to S-4, respectively. The untreated Fe₂O₃ was designated as S-1. The surface-modified nanoparticles were then analysed with Dynamic Light Scattering (DLS), Fourier Transform Infrared spectroscopy (FTIR), X-Ray Diffraction (XRD), Thermogravimetric Analysis (TGA) and Scanning Electron Microscopy with Energy Dispersive X-Ray analysis (SEM-EDAX). Characterization reveals particle sizes averaging 20-50 nm with and without modification. However, the crystallite size in all cases remained ~7.0 nm, with the diffractogram matching the Fe₂O₃ crystal structure. FTIR suggested the presence of surfactants on the nanoparticles' surface, which was also confirmed by SEM-EDAX, where elemental mapping proved their presence. TGA indicated weight losses in S-2 to S-4 from 300°C onwards, suggesting the presence of an organic moiety. The hydrophobic character of the modified surfaces was confirmed by contact angle analysis; all modified nanoparticles showed superhydrophobic behaviour, with average contact angles of ~129° for S-2, ~139.5° for S-3 and ~151° for S-4. This indicates that the surface-modified particles are superhydrophobic and easily dispersible in non-polar media. These modified particles could be ideal candidates for suspension in oil-based fluids, polymer matrices, etc. We are pursuing elaborate suspension/sedimentation studies of these particles in various oils to establish this conjecture.

Keywords: iron nanoparticles, modification, hydrophobic, dispersion

Procedia PDF Downloads 141
291 Prediction of SOC Stock using ROTH-C Model and Mapping in Different Agroclimatic Zones of Tamil Nadu

Authors: R. Rajeswari

Abstract:

An investigation was carried out to assess the SOC stock and its change over time in benchmark soils of different agroclimatic zones of Tamil Nadu. The RothC model was used to assess SOC stock under existing and alternate cropping patterns. A soil map prepared at 1:50,000 scale from the Natural Resources Information System (NRIS), derived from satellite data (IRS-1C/1D PAN-sharpened LISS-III imagery), was used to estimate SOC stock in the different agroclimatic zones of Tamil Nadu. Fifteen benchmark soils were selected across the agroclimatic zones of Tamil Nadu based on their land use and areal extent to assess SOC levels and their change over time. This revealed that, over the eleven-year period 1997-2007, SOC buildup was higher in soils under horticulture systems, followed by soils under rice cultivation. Among the agroclimatic zones of Tamil Nadu, the hilly zone had the highest SOC stock, followed by the north eastern, southern, western, Cauvery delta, north western, and high rainfall zones. Although the organic carbon content in the soils of the north eastern, southern, western, north western, and Cauvery delta zones was lower than in the high rainfall zone, their SOC stock was high. SOC density was higher in the high rainfall and hilly zones than in the other agroclimatic zones of Tamil Nadu, and among the low rainfall regions of Tamil Nadu, the Cauvery delta zone recorded the highest SOC density. The RothC model was used to assess SOC stock under existing and alternate cropping patterns in the Periyanaickenpalayam series (western zone), Peelamedu series (southern zone), Vallam series (north eastern zone), Vannappatti series (north western zone) and Padugai series (Cauvery delta zone). The Padugai series recorded the highest TOC, BIO, and HUM, followed by the Periyanaickenpalayam, Peelamedu, Vallam, and Vannappatti series. The Vannappatti and Padugai series develop high TOC, BIO, and HUM under the existing cropping pattern, whereas the Periyanaickenpalayam, Peelamedu, and Vallam series develop high TOC, BIO, and HUM under the alternate cropping pattern. Among the five selected soil series, the Periyanaickenpalayam, Peelamedu, and Padugai series are projected to reach 0.75 per cent TOC in 2025 and 2018, 2100 and 2035, and 2013 and 2014 under the existing and alternate cropping patterns, respectively.
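As a minimal sketch of the kind of pool-based turnover calculation RothC performs, the snippet below advances the active carbon pools by monthly first-order decay. The annual rate constants are the commonly published RothC-26.3 values, and the pool stocks, rate modifier and the omission of decomposed-carbon partitioning are simplifying assumptions for illustration, not the study's calibration.

```python
import numpy as np

# Minimal sketch of RothC-style first-order pool decomposition (monthly step).
# Standard RothC-26.3 annual rate constants (1/yr) are assumed; the study's own
# parameterisation and rate-modifying factors may differ.
K = {"DPM": 10.0, "RPM": 0.3, "BIO": 0.66, "HUM": 0.02}  # IOM is treated as inert

def step_month(pools, rate_modifier=1.0):
    """Advance the active carbon pools by one month of first-order decay.

    pools: dict of pool name -> carbon stock (t C/ha)
    rate_modifier: combined temperature/moisture/soil-cover factor
    Returns the new pools and the carbon decomposed this month (the partitioning
    of decomposed C back into BIO/HUM and CO2 is omitted for brevity).
    """
    new_pools, decomposed_total = {}, 0.0
    for name, c in pools.items():
        k_month = K[name] * rate_modifier / 12.0
        decomposed = c * (1.0 - np.exp(-k_month))
        new_pools[name] = c - decomposed
        decomposed_total += decomposed
    return new_pools, decomposed_total

pools = {"DPM": 1.2, "RPM": 8.5, "BIO": 1.0, "HUM": 30.0}  # hypothetical stocks
for _ in range(12):
    pools, _ = step_month(pools, rate_modifier=0.8)
print({k: round(v, 2) for k, v in pools.items()})
```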

Keywords: agro climatic zones, benchmark soil, land use, soil organic carbon

Procedia PDF Downloads 96
290 Community Observatory for Territorial Information Control and Management

Authors: A. Olivi, P. Reyes Cabrera

Abstract:

Ageing and urbanization are two of the main trends that characterize the twenty-first century, and they are especially accelerated in the emerging countries of Asia and Latin America. Chile is one of the countries in the Latin American region where the demographic transition to ageing is becoming increasingly visible. The challenges that the new demographic scenario poses to urban administrators call for searching for innovative solutions to maximize the functional and psycho-social benefits derived from the relationship between older people and the environment in which they live. Although mobility is central to people's everyday practices and social relationships, it is not distributed equitably. On the contrary, it can be considered another factor of inequality in our cities, and older people are a group particularly sensitive and vulnerable with respect to mobility. In this context, based on the ageing-in-place strategy and following a social innovation approach within a spatial context, the "Community Observatory of Territorial Information Control and Management" project aims at the collective search for and validation of solutions that satisfy the specific mobility and accessibility needs of older urban people. Specifically, the Observatory intends to: i) promote the direct participation of the aged population in order to generate relevant information on the territorial situation and the satisfaction of the mobility needs of this group; ii) co-create dynamic and efficient mechanisms for the reporting and updating of territorial information; iii) increase the capacity of the local administration to plan and manage solutions to environmental problems at the neighborhood scale. Based on a participatory mapping methodology and the application of digital technology, the Observatory designed and developed, together with aged people, a crowdsourcing platform for smartphones, called DIMEapp, for reporting environmental problems affecting mobility and accessibility. DIMEapp has been tested at a prototype level in two neighborhoods of the city of Valparaiso. The results achieved in the testing phase have shown high potential to i) contribute to establishing coordination mechanisms between the local government and the local community, and ii) improve a local governance system that guides and regulates the allocation of goods and services destined to solve those problems.

Keywords: accessibility, ageing, city, digital technology, local governance

Procedia PDF Downloads 133
289 Control for Fluid Flow Behaviours of Viscous Fluids and Heat Transfer in Mini-Channel: A Case Study Using Numerical Simulation Method

Authors: Emmanuel Ophel Gilbert, Williams Speret

Abstract:

The control of the fluid flow behaviour of viscous fluids and of heat transfer occurring within a heated mini-channel is considered. The heat transfer and flow characteristics of different viscous liquids, such as engine oil, automatic transmission fluid, one-half ethylene glycol, and deionized water, were numerically analyzed. Mathematical tools such as Fourier series and Laplace Z-transforms were employed to ascertain the wave-like behaviour of each of these viscous fluids. The steady, laminar flow and heat transfer equations are solved with the aid of a numerical simulation technique. Furthermore, this numerical simulation technique is validated by comparing the accessible practical values with the predicted local thermal resistances. The roughness of this mini-channel, which is one of its physical limitations, was also predicted in this study; it affects the friction factor. When an additive such as tetracycline was introduced into the fluid, the heat input was lowered, and this caused a pro rata effect on the minor and major frictional losses, mostly at very low Reynolds numbers of circa 60-80. At these low Reynolds numbers, the viscosity and the minor frictional losses decrease as the temperature of the viscous liquids increases. Three equations and models were identified which support the numerical simulation; via interpolation and integration of the variables extended to the walls of the mini-channel, they yield highly reliable engineering and technology calculations for turbulence-impacting jets in the near future. In the search for a governing equation that could support this control of the fluid flow, the Navier-Stokes equations were found to be tangential to this finding; however, other physical factors with respect to these Navier-Stokes equations need to be accounted for to avoid uncertain turbulence of the fluid flow. This paradox is resolved within the framework of continuum mechanics using the classical slip condition and an iteration scheme via a numerical simulation method that takes into account certain terms in the full Navier-Stokes equations, although this results in dropping certain assumptions from the approximation. Concrete questions raised in the main body of the work are examined further in the appendices.
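To make the reasoning about low Reynolds numbers and frictional losses concrete, a minimal sketch follows using the standard definitions: Re = ρvD/μ, the Darcy friction factor f = 64/Re for fully developed laminar flow in a circular duct, and the Darcy-Weisbach major loss. The property values are illustrative placeholders (not the paper's data), and the 64/Re relation is a circular-cross-section approximation; rectangular mini-channels use a geometry-dependent constant.

```python
# Minimal sketch: laminar friction loss in a circular mini-channel.
# All property values below are illustrative placeholders.

def reynolds(rho, v, d_h, mu):
    """Reynolds number based on hydraulic diameter d_h."""
    return rho * v * d_h / mu

def darcy_friction_laminar(re):
    """Darcy friction factor for fully developed laminar flow in a circular duct."""
    return 64.0 / re

def pressure_drop(f, length, d_h, rho, v):
    """Darcy-Weisbach major loss (Pa)."""
    return f * (length / d_h) * 0.5 * rho * v**2

rho, v, d_h, length = 850.0, 4.0, 1e-3, 0.1   # oil-like density, 1 mm channel, 0.1 m run
for mu in (0.05, 0.03):                       # viscosity drops as temperature rises
    re = reynolds(rho, v, d_h, mu)
    dp = pressure_drop(darcy_friction_laminar(re), length, d_h, rho, v)
    print(f"mu={mu} Pa.s -> Re={re:.0f}, dP={dp:.0f} Pa")
```

The printed pair shows the trend the abstract describes: as viscosity falls with temperature, Re rises and the laminar major loss drops proportionally.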

Keywords: frictional losses, heat transfer, laminar flow, mini-channel, numerical simulation, Reynolds number, turbulence, viscous fluids

Procedia PDF Downloads 177
288 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements

Authors: Agnieszka A. Malinowska, R. Hejmanowski

Abstract:

A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis, mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. These activities lead to large- and small-scale ground displacements and ground ruptures. The ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructures. The complexity of the ground deformation phenomenon, in relation to the vulnerability of structures and infrastructures, leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods, based on fuzzy logic and expert methods, for assessing the damage risk to buildings and infrastructure can be integrated into OS GIS. Those methods were verified based on back analysis, proving their accuracy. Moreover, those methods can be supported by ground displacement observations: based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. These approaches are reinforced by an analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models and, moreover, enables the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in one of the urban areas threatened by underground mining activity. Vulnerability maps supported by satellite ground movement observations would mitigate the hazards of land displacement in urban areas close to mines.
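The fuzzy rating step can be illustrated with a minimal sketch: ground-movement indicators are mapped onto fuzzy membership grades and aggregated into a damage-risk score. The indicator names, breakpoints and weights below are hypothetical and do not reproduce the paper's expert model.

```python
import numpy as np

# Minimal sketch of a fuzzy rating step: map ground-movement indicators onto
# fuzzy membership grades and aggregate them into a building damage-risk score.
# Indicator names, breakpoints and weights are hypothetical, not the paper's.
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising over [a, b] and falling over [c, d]."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def risk_score(tilt_mm_per_m, strain_mm_per_m):
    high_tilt = trapezoid(tilt_mm_per_m, 2.0, 5.0, 50.0, 60.0)
    high_strain = trapezoid(strain_mm_per_m, 1.0, 3.0, 50.0, 60.0)
    # Weighted aggregation of the two expert criteria into one risk grade.
    return 0.6 * high_tilt + 0.4 * high_strain

for tilt, strain in [(1.0, 0.5), (4.0, 2.0), (8.0, 6.0)]:
    print(tilt, strain, round(float(risk_score(tilt, strain)), 2))
```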

Keywords: fuzzy logic, open source geographic information systems (OS GIS), risk assessment of urbanized areas, satellite interferometry (InSAR)

Procedia PDF Downloads 160
287 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing

Authors: Tolulope Aremu

Abstract:

This paper is based on the idea of using deep learning methodology for optimizing production yield by tuning a few key process parameters in a manufacturing environment. The study explicitly addresses how to maximize production yield and minimize operational costs by utilizing advanced neural network models, specifically Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNN). These models were implemented using the Python-based frameworks TensorFlow and Keras. The targets of the research are precision molding processes in which temperature ranges between 150°C and 220°C, pressure ranges between 5 and 15 bar, and material flow rate ranges between 10 and 50 kg/h; these are critical parameters that have a great effect on yield. A dataset of 1 million production cycles over five continuous years was considered, with detailed logs showing the exact parameter settings and yield output. The LSTM model captures time-dependent trends in the production data, while the CNN analyzes the spatial correlations between parameters. The models were trained in a supervised manner with an MSE loss function optimized through the Adam optimizer. After a total of 100 training epochs, the models achieved 95% accuracy in recommending optimal parameter configurations. Results indicated an increase in production yield of 12% relative to traditional RSM and DOE methods. Besides, the error margin was reduced by 8%, yielding consistently high-quality products from the deep learning models. The monetary value was around $2.5 million annually, the cost saved from material waste, energy consumption, and equipment wear as a result of implementing optimized process parameters. The system was deployed in an industrial production environment with the help of a hybrid cloud setup: Microsoft Azure for data storage, and Google Cloud AI for the training and deployment of the models. Real-time monitoring of the process and automatic tuning of parameters depend on this cloud infrastructure. To put it into perspective, deep learning models, especially those employing LSTM and CNN, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning with a view to achieving further enhancement of system autonomy and scalability across various manufacturing sectors.
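A minimal sketch of the kind of LSTM yield-regression model described (TensorFlow/Keras, supervised training, MSE loss, Adam optimizer) is given below. The window length, layer sizes, feature ordering and synthetic data are assumptions for illustration, not the paper's actual architecture or dataset.

```python
import numpy as np
import tensorflow as tf

# Minimal sketch: LSTM regression of production yield from parameter time series
# (e.g. temperature, pressure, material flow rate), trained with MSE + Adam.
TIMESTEPS, N_FEATURES = 24, 3   # illustrative window over logged parameter settings

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, N_FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),   # predicted yield for the production cycle
])
model.compile(optimizer="adam", loss="mse")

# Synthetic stand-in for the logged production cycles (settings -> yield).
X = np.random.rand(256, TIMESTEPS, N_FEATURES).astype("float32")
y = X.mean(axis=(1, 2)).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```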

Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving

Procedia PDF Downloads 34
286 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging

Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen

Abstract:

Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests where the protein value has been determined by the FOSS Infratec NOVA, which is the gold industry standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. The problem to solve in the first dataset is protein regression, while in the second dataset it is variety classification. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required for classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted, and these results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested, including centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
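As a minimal sketch of the chemometric preprocessing mentioned (SNV, Savitzky-Golay filtering, detrending), the snippet below applies them to per-pixel NIR spectra assumed to be stored as rows of a 2-D array. The array shape, window length and polynomial order are illustrative choices, not those used in the study.

```python
import numpy as np
from scipy.signal import savgol_filter, detrend

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum (row) individually."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def preprocess(spectra, window=11, polyorder=2):
    """SNV followed by Savitzky-Golay smoothing and linear detrending."""
    x = snv(spectra)
    x = savgol_filter(x, window_length=window, polyorder=polyorder, axis=1)
    return detrend(x, axis=1)

# Hypothetical block of NIR spectra: 100 pixels x 224 wavelength channels (900-1700 nm).
spectra = np.random.rand(100, 224)
print(preprocess(spectra).shape)
```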

Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques

Procedia PDF Downloads 100
285 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones

Authors: Mohamed Abdelkareem

Abstract:

Remote sensing data contribute to predicting the prospective areas of water resources. An integration of microwave and multispectral data along with climatic, hydrologic, and geological data has been used here. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic and structural features of Wadi Asyuti, which represents a defunct tributary of the Nile basin in the eastern Sahara. The image transformation of Sentinel-2 and Landsat-8 data allowed the different varieties of rock units to be characterized. Integration of microwave remotely-sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones that play a crucial role in mapping groundwater prospective zones. Fused Landsat-8 OLI and ALOS/PALSAR data improved the detection of structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben that is cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps and offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), as well as revealing infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach. The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive groups, depending on their probability for groundwater, namely very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
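A minimal sketch of the knowledge-driven weighted overlay is given below: co-registered thematic rasters, already rescaled to 0-1 suitability, are combined with assigned weights and sliced into six potential classes. The layer names, weights and random rasters are hypothetical stand-ins, not the study's calibrated inputs.

```python
import numpy as np

# Minimal sketch of a knowledge-driven weighted overlay for groundwater potential.
# Layers are assumed co-registered and rescaled to 0-1 suitability; the weights
# below are hypothetical, not the study's values.
weights = {"geology": 0.20, "slope": 0.15, "drainage_density": 0.15,
           "lineament_density": 0.20, "rainfall": 0.20, "soil": 0.10}

rows, cols = 200, 200
layers = {name: np.random.rand(rows, cols) for name in weights}  # stand-in rasters

gpz_index = sum(w * layers[name] for name, w in weights.items())

# Slice the continuous index into six classes: very low ... excellent.
labels = ["very low", "low", "moderate", "high", "very high", "excellent"]
classes = np.digitize(gpz_index, np.quantile(gpz_index, [1/6, 2/6, 3/6, 4/6, 5/6]))
print({labels[i]: int((classes == i).sum()) for i in range(6)})
```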

Keywords: GIS, remote sensing, groundwater, Egypt

Procedia PDF Downloads 98
284 Weal: The Human Core of Well-Being as Attested by Social and Life Sciences

Authors: Gyorgy Folk

Abstract:

A finite set of cardinal needs defines the human core of living well, shaped on the evolutionary time scale, as attested by the social and life sciences of the last decades. Well-being is the purported state of living well. The living of humans, akin to that of any other living being, involves the exchange of vital substance with nature, maintaining a supportive symbiosis with an array of other living beings, living up to bonds to kin, and exerting efforts to sustain living. A supportive natural environment, access to material resources, nearness to fellow beings, and life-sustaining activity are prerequisites of well-being. Well-living is prone to misinterpretation as an individual achievement; one lives well if and only if bonded to human relationships, related to a place, and incorporated in nature. Akin to all its other forms, human life is a self-sustaining arrangement. One may say that the substance of life is life, and not materials, products, and services converted into life. The human being remains shaped on an evolutionary time scale and is enabled within the non-altering core of human being, invariant of cultural differences in earthly space and time. The present paper proposes the introduction of weal, the missing link in the causal chain between societal performance and the goodness of life. Although interpreted differently over the ages, cultures and disciplines, weal is proposed as the underlying foundation of well-being, instead of well-being, the construct in general use. Weal stands for the totality of socialised reality as framing well-being for the individual beyond the possibility of deliberate choice. The descriptive approach to weal, mapping it under the guidance of discrete scientific disciplines, reveals a limited set of cardinal aspects, labeled here the cardinal needs. Cardinal expresses the fundamental reorientation weal can bring about; needs deliver the sense of sine qua non. Weal is conceived as a oneness mapped along eight cardinal needs. The needs, approached as aspects instead of analytically isolated factors, do not require mutually exclusive definitions. To serve the purpose of reorientation, weal is operationalised as a domain in multidimensional space, each dimension encompassing an optimal level of availability of the fundamental satisfiers between the extremes of drastic insufficiency and harmful excess, ensured by actual human effort. Weal seeks balance among the material and social aspects of human being while allowing for cultural and individual uniqueness in attaining human flourishing.

Keywords: human well-being, development, economic theory, human needs

Procedia PDF Downloads 228
283 Social Media Data Analysis for Personality Modelling and Learning Styles Prediction Using Educational Data Mining

Authors: Srushti Patil, Preethi Baligar, Gopalkrishna Joshi, Gururaj N. Bhadri

Abstract:

In designing learning environments, instructional strategies can be tailored to suit the learning style of an individual to ensure effective learning. In this study, the information shared on social media such as Facebook is used to predict the learning style of a learner. Previous research studies have shown that Facebook data can be used to predict user personality, and users with a particular personality exhibit an inherent pattern in their digital footprint on Facebook. The proposed work aims to correlate the users' personality, predicted from Facebook data, to their learning styles, predicted through questionnaires. For Millennial learners, Facebook has become a primary means for information sharing and interaction with peers; thus, it can serve as a rich bed for research and direct the design of learning environments. The authors conducted this study in an undergraduate freshman engineering course. Data from 320 freshman Facebook users was collected, and the same users also participated in the learning style and personality prediction survey. The Kolb Learning Style questionnaire and the Big Five Personality Inventory were adopted for the survey. The users agreed to participate in this research and signed individual consent forms. A specific page was created on Facebook to collect user data such as personal details, status updates, comments, demographic characteristics and egocentric network parameters. This data was captured by an application created using a Python program. The data captured from Facebook was subjected to a text analysis process using the Linguistic Inquiry and Word Count dictionary. An analysis of the data collected from the questionnaires reveals individual student personality and learning style. The results obtained from the analysis of the Facebook, learning style and personality data were then fed into an automatic classifier that was trained using data mining techniques such as rule-based classifiers and decision trees. This helps to predict the user personality and learning styles by analysing the common patterns. Rule-based classifiers applied for text analysis help to categorize Facebook data into positive, negative and neutral. Two models were trained in total: one to predict the personality from Facebook data and another to predict the learning styles from the personalities. The results show that the classifier model has high accuracy, which makes the proposed method a reliable one for predicting user personality and learning styles.
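The second model (personality to learning style) can be illustrated with a minimal decision-tree sketch. The Big Five feature names, the Kolb style labels, and the synthetic data are illustrative assumptions; the study's actual training data came from its survey and Facebook text analysis.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import train_test_split

# Minimal sketch of a personality -> learning-style decision tree.
# Feature names, labels and synthetic data are illustrative only.
rng = np.random.default_rng(0)
features = ["openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism"]
styles = ["diverging", "assimilating", "converging", "accommodating"]  # Kolb styles

X = rng.random((320, len(features)))             # 320 students, trait scores in [0, 1]
y = rng.integers(0, len(styles), size=320)       # stand-in learning-style labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

print("hold-out accuracy:", round(clf.score(X_te, y_te), 2))
print(export_text(clf, feature_names=features)[:400])  # human-readable rules
```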

Keywords: educational data mining, Facebook, learning styles, personality traits

Procedia PDF Downloads 231
282 Comparative Analysis of the Impact of Urbanization on Land Surface Temperature in the United Arab Emirates

Authors: A. O. Abulibdeh

Abstract:

The aim of this study is to investigate and compare the changes in Land Surface Temperature (LST) as a function of urbanization, particularly land use/land cover changes, in three cities in the UAE, namely Abu Dhabi, Dubai, and Al Ain. The scale of this assessment will be at the macro- and micro-levels. At the macro-level, a comparative assessment will take place between the three cities. At the micro-level, the study will compare the effects of different land uses/land covers on the LST. This will provide clear and quantitative city-specific information on the relationship between urbanization and local spatial intra-urban LST variation in the three cities. The main objectives of this study are 1) to investigate the development of LST at the macro- and micro-levels between and within the three cities over a two-decade period, and 2) to examine the impact of different types of land use/land cover on the spatial distribution of LST. Because these three cities face a harsh arid climate, it is hypothesized that (1) urbanization is affecting and connected to the spatial changes in LST; (2) different land uses/land covers have different impacts on the LST; and (3) changes in the spatial configuration of land use and vegetation concentration over time would control the urban microclimate at the city scale and the macroclimate at the country scale. This study will be carried out over a 20-year period (1996-2016) and throughout the whole year. The study will compare two distinct periods with different thermal characteristics: the cool/cold period from November to March and the warm/hot period from April to October. The best-practice research method for this topic is to use remote sensing data to target different aspects of natural and anthropogenic system impacts. The project will follow classical remote sensing and mapping techniques to investigate the impact of urbanization, mainly changes in land use/land cover, on LST. The investigation will be performed in two stages. In stage one, remote sensing data will be used to investigate the impact of urbanization on LST at the macroclimate level, where the LST and Urban Heat Island (UHI) will be compared in the three cities using data from the past two decades. Stage two will investigate the impact at the microclimate scale by examining the LST and UHI for particular land use/land cover types. In both stages, LST and urban land cover maps will be generated over the study area. The outcome of this study should represent an important contribution to recent urban climate studies, particularly in the UAE. Based on the aim and objectives of this study, the expected outcomes are as follows: i) to determine the increase or decrease of LST as a result of urbanization in these three cities, and ii) to determine the effect of different land uses/land covers on increasing or decreasing the LST.
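A minimal sketch of a standard single-channel LST retrieval from Landsat 8 Band 10 follows (TOA radiance, brightness temperature, then an emissivity correction). The radiometric and thermal constants are the usual Band 10 values reported in Landsat 8 MTL metadata, and the single emissivity value and DN array are illustrative assumptions; the study's own processing chain may differ.

```python
import numpy as np

# Minimal sketch of a single-channel LST retrieval from Landsat 8 Band 10.
# Constants are the typical Band 10 MTL values; emissivity is a single assumed
# value, whereas a study would derive it per land cover class.
ML, AL = 3.342e-4, 0.1            # RADIANCE_MULT/ADD_BAND_10
K1, K2 = 774.8853, 1321.0789      # thermal conversion constants, Band 10
WAVELENGTH = 10.895e-6            # effective band wavelength (m)
RHO = 1.438e-2                    # h*c/k_B (m K)

def land_surface_temperature(dn, emissivity=0.97):
    """Return LST in kelvin from Band 10 digital numbers (numpy array)."""
    radiance = ML * dn + AL                         # TOA spectral radiance
    bt = K2 / np.log(K1 / radiance + 1.0)           # brightness temperature (K)
    return bt / (1.0 + (WAVELENGTH * bt / RHO) * np.log(emissivity))

dn = np.array([[22000, 26000], [30000, 34000]])     # hypothetical DN values
print(np.round(land_surface_temperature(dn) - 273.15, 1))  # degrees Celsius
```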

Keywords: land use/land cover, global warming, land surface temperature, remote sensing

Procedia PDF Downloads 248
281 Flash Flood in Gabes City (Tunisia): Hazard Mapping and Vulnerability Assessment

Authors: Habib Abida, Noura Dahri

Abstract:

Flash floods are among the most serious natural hazards and have disastrous environmental and human impacts. They are associated with exceptional rain events characterized by short durations, very high intensities, rapid flows and small spatial extent. Flash floods happen very suddenly and are difficult to forecast. They generally cause damage to agricultural crops, property and infrastructure, and may even result in the loss of human lives. The city of Gabes (South-eastern Tunisia) has been exposed to numerous damaging floods because of its mild topography, clay soil, high urbanization rate and erratic rainfall distribution. The risks associated with this situation are expected to increase further in the future because of climate change, deemed responsible for the increase in the frequency and severity of this natural hazard. Recently, exceptional events have hit Gabes City, causing deaths and major property losses. A major flooding event hit the region on June 2nd, 2014, causing human deaths and major material losses; it resulted in the stagnation of storm water in the numerous low zones of the study area, thereby endangering human health and causing disastrous environmental impacts. The characterization of flood risk in the Gabes watershed (South-eastern Tunisia) is therefore considered an important step for flood management. The Analytical Hierarchy Process (AHP) method, coupled with Monte Carlo simulation and a geographic information system, was applied to delineate and characterize flood areas. A spatial database was developed based on the geological map, a digital elevation model, land use, and rainfall data in order to evaluate the different factors susceptible to affect the flood analysis. The results obtained were validated with remote sensing data for the zones that showed very high flood hazard during the extreme rainfall event of June 2014 that hit the study basin. Moreover, a survey was conducted in different areas of the city in order to understand and explore the different causes of this disaster, its extent and its consequences.
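The AHP step can be sketched minimally: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix, and judgement consistency is checked with Saaty's consistency ratio. The 4x4 matrix below (e.g. rainfall, slope, land use, drainage density) is hypothetical; the random index values are the standard Saaty ones, and the Monte Carlo coupling is indicated only in a comment.

```python
import numpy as np

# Minimal AHP sketch: weights from the principal eigenvector of a pairwise
# comparison matrix, plus the consistency ratio. The matrix is hypothetical.
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}  # random indices

def ahp_weights(A):
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    return w, ci / RI[n]            # weights, consistency ratio (should be < 0.1)

A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)

weights, cr = ahp_weights(A)
print(np.round(weights, 3), round(cr, 3))
# A Monte Carlo coupling, as described in the paper, would perturb the pairwise
# judgements in A and repeat this calculation to obtain a distribution of weights.
```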

Keywords: analytical hierarchy process, flash floods, Gabes, remote sensing, Tunisia

Procedia PDF Downloads 109
280 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach

Authors: Kanika Gupta, Ashok Kumar

Abstract:

Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms have shown high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem in both healthcare and other industries related to microorganisms. The massive amount of information, both stated and hidden, in the biofilm literature is growing exponentially; therefore it is not possible for researchers and practitioners to extract and relate information from different written resources without automation. The current work therefore proposes and discusses the use of text mining techniques for the extraction of information from a biofilm literature corpus containing 34,306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e. free text. Therefore, we considered an unsupervised approach, where no annotated training data are necessary, and used it to develop a system that classifies the text on the basis of growth and development, drug effects, radiation effects, classification and physiology of biofilms. For this, a two-step structure was used: the first step is to extract keywords from the biofilm literature using a metathesaurus and standard natural language processing tools like Rapid Miner_v5.3, and the second step is to discover relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We applied the unsupervised approach, which is the machine learning task of inferring a function to describe hidden structure from 'unlabeled' data, to the above-extracted datasets to develop classifiers using the WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text into the mentioned sets. The developed classifiers were tested on a large data set of biofilm literature, which showed that the proposed unsupervised approach is promising as well as suited to a semi-automatic labeling of the extracted relations. The entire body of information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their search easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
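The study's toolchain is RapidMiner, pubmed.mineR and WinPython/RStudio, but the unsupervised grouping step can be approximated with a scikit-learn sketch: TF-IDF vectorisation of abstracts followed by clustering. The toy documents and cluster count below are illustrative stand-ins for the 34,306-document corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Illustrative substitution only: TF-IDF + k-means clustering of abstracts as a
# stand-in for the unsupervised classification step described in the paper.
docs = [
    "biofilm growth and development on medical implants",
    "antibiotic drug effects on mature biofilms",
    "radiation effects on biofilm matrix structure",
    "physiology and classification of bacterial biofilms",
]  # stand-ins for the corpus documents

X = TfidfVectorizer(stop_words="english").fit_transform(docs)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for label, doc in zip(km.labels_, docs):
    print(label, doc)
```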

Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database

Procedia PDF Downloads 172
279 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data

Authors: M. Kharrat, G. Moreau, Z. Aboura

Abstract:

The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the exerted solicitations. The AE technique is a well-established tool for discriminating between damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE features are then extracted from the recorded signals, followed by a data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation set-up was used in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for discriminating between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. The latter is useful for constructing a supervised classifier that can be used for automatic recognition of the AE signals. Several materials with different ingredients were tested under various solicitations in order to feed and enrich the learning database. The methodology presented in this work was useful for refining the damage threshold for the new generation of materials. The damage mechanisms around this threshold were highlighted, and the obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damage without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of solicitation, the identified classes are reproducible and little disturbed. The supervised classifier constructed based on the learning database was able to predict the labels of the classified signals.
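A minimal two-step sketch of the described workflow, cluster the AE features without labels and then train a supervised classifier on the labelled database, is shown below. The feature names, number of clusters, classifier choice and synthetic data are illustrative assumptions, not the study's setup.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

# Minimal sketch: unsupervised clustering of AE features, then a supervised
# classifier trained on the resulting learning database. Illustrative only.
rng = np.random.default_rng(1)
features = ["amplitude", "energy", "duration", "counts", "peak_frequency"]
X = rng.random((2000, len(features)))          # stand-in AE hits

scaler = StandardScaler().fit(X)
X_std = scaler.transform(X)
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_std)
# In the study, each cluster (including a 'noise' class) is assigned to a damage
# mechanism using the video-microscopy, DIC and micro-tomography observations.

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_std, clusters)
new_hits = rng.random((5, len(features)))
print(clf.predict(scaler.transform(new_hits)))
```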

Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition

Procedia PDF Downloads 156
278 Algorithm for Modelling Land Surface Temperature and Land Cover Classification and Their Interaction

Authors: Jigg Pelayo, Ricardo Villar, Einstine Opiso

Abstract:

The rampant and unintended spread of urban areas has increased the artificial component features in the land cover of the countryside and brought forth the urban heat island (UHI). This has paved the way for a wide range of negative influences on human health and the environment, commonly related to air pollution, drought, higher energy demand, and water shortage. Land cover type also plays a relevant role in understanding the interaction between ground surfaces and the local temperature. At the moment, the depiction of land surface temperature (LST) at the city/municipality scale, particularly in certain areas of Misamis Oriental, Philippines, is inadequate as support for efficient mitigation of and adaptation to the surface urban heat island (SUHI). Thus, this study attempts to apply Landsat 8 satellite data and low-density Light Detection and Ranging (LiDAR) products to map out a quality automated LST model and crop-level land cover classification at a local scale, through a theoretical and algorithm-based approach utilizing data analysis applied to a multi-dimensional image object model. The paper also aims to explore the relationship between the derived LST and the land cover classification. The results of the presented model showed that comprehensive data analysis and GIS functionalities, integrated with an object-based image analysis (OBIA) approach, can automate complex map production processes with considerable efficiency and high accuracy. The findings may potentially lead to expanded investigation of the temporal dynamics of the land surface UHI. It is worthwhile to note that the environmental significance of these interactions, explored through the combined application of remote sensing, geographic information tools, mathematical morphology and data analysis, can provide microclimate perception, awareness and improved decision-making for land use planning and characterization at the local and neighborhood scale. As a result, it can aid in facilitating problem identification and support mitigation and adaptation more efficiently.

Keywords: LiDAR, OBIA, remote sensing, local scale

Procedia PDF Downloads 283
277 Modeling the Acquisition of Expertise in a Sequential Decision-Making Task

Authors: Cristóbal Moënne-Loccoz, Rodrigo C. Vergara, Vladimir López, Domingo Mery, Diego Cosmelli

Abstract:

Our daily interaction with computational interfaces is plagued by situations in which we go from inexperienced users to experts through self-motivated exploration of the same task. In many of these interactions, we must learn to find our way through a sequence of decisions and actions before obtaining the desired result. For instance, when drawing cash from an ATM, choices are presented in a step-by-step fashion so that a specific sequence of actions must be performed in order to produce the expected outcome. But, as they become experts in the use of such interfaces, do users adopt specific search and learning strategies? Moreover, if so, can we use this information to follow the process of expertise development and, eventually, predict future actions? This would be a critical step towards building truly adaptive interfaces that can facilitate interaction at different moments of the learning curve. Furthermore, it could provide a window into potential mechanisms underlying decision-making behavior in real-world scenarios. Here we tackle this question using a simple game interface that instantiates a 4-level binary decision tree (BDT) sequential decision-making task. Participants have to explore the interface and discover an underlying concept-icon mapping in order to complete the game. We develop a Hidden Markov Model (HMM)-based approach whereby a set of stereotyped, hierarchically related search behaviors act as hidden states. Using this model, we are able to track the decision-making process as participants explore, learn and develop expertise in the use of the interface. Our results show that partitioning the problem space into such stereotyped strategies is sufficient to capture a host of exploratory and learning behaviors. Moreover, using the modular architecture of stereotyped strategies as a Mixture of Experts, we are able to simultaneously ask the experts about the user's most probable future actions. We show that for those participants who learn the task, it becomes possible to predict their next decision, above chance, approximately halfway through the game. Our long-term goal is, on the basis of a better understanding of real-world decision-making processes, to inform the construction of interfaces that can establish dynamic conversations with their users in order to facilitate the development of expertise.
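The core HMM machinery can be sketched with the forward recursion: update the belief over hidden strategies from the observed choices, then marginalise to predict the next observation. The transition and emission matrices below are illustrative, not fitted to the study's data.

```python
import numpy as np

# Minimal HMM sketch: hidden states = stereotyped search strategies,
# observations = discrete choices in the binary decision tree.
# The matrices are illustrative, not fitted parameters.
A = np.array([[0.80, 0.15, 0.05],   # strategy transition probabilities
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
B = np.array([[0.7, 0.3],           # P(observed choice | strategy)
              [0.5, 0.5],
              [0.2, 0.8]])
pi = np.array([1/3, 1/3, 1/3])

def filter_and_predict(obs):
    """Forward recursion: belief over strategies, then next-observation distribution."""
    belief = pi * B[:, obs[0]]
    belief /= belief.sum()
    for o in obs[1:]:
        belief = (belief @ A) * B[:, o]
        belief /= belief.sum()
    next_obs = (belief @ A) @ B     # P(next choice) marginalised over strategies
    return belief, next_obs

belief, next_obs = filter_and_predict([0, 0, 1, 0, 1, 1])
print(np.round(belief, 3), np.round(next_obs, 3))
```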

Keywords: behavioral modeling, expertise acquisition, hidden Markov models, sequential decision-making

Procedia PDF Downloads 252
276 MB-SLAM: A SLAM Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To effectively use SLAM for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM and BIM can provide essential insights for construction managers to identify construction deficiencies in real time and ultimately reduce rework. Also, registering SLAM to BIM in real time can boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with the BIM in real time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real time. This framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. The framework extracts depth features from a monocular camera's image and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection that estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views from the keyframes' views. The calculated poses are later improved by a real-time gradient descent-based iteration method. Two case studies were presented to validate MB-SLAM. The validation process demonstrated promising results: it accurately registered SLAM to BIM and significantly improved the SLAM's localization accuracy. In addition, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework for both research and commercial usage, which aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of the SLAM by providing a rough 3D model of the environment. MB-SLAM further boosts the application of SLAM to practical usage.
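The vanishing-point estimation underlying the perspective alignment can be sketched minimally: each straight edge segment defines a homogeneous line l = p1 x p2, and the vanishing point minimising the sum of squared line-point incidences is the right singular vector with the smallest singular value. The synthetic segments below are illustrative; MB-SLAM's actual edge detection and BIM matching are not reproduced here.

```python
import numpy as np

# Minimal sketch of vanishing point estimation from straight edge segments.
# Each 2-D segment (p1, p2) gives a homogeneous line l = p1 x p2; the vanishing
# point v minimises sum((l_i . v)^2) with |v| = 1, i.e. the smallest right
# singular vector of the stacked line matrix. Segments below are synthetic.
def vanishing_point(segments):
    lines = []
    for (x1, y1), (x2, y2) in segments:
        l = np.cross([x1, y1, 1.0], [x2, y2, 1.0])
        lines.append(l / np.linalg.norm(l))
    _, _, vt = np.linalg.svd(np.array(lines))
    v = vt[-1]
    return v[:2] / v[2] if abs(v[2]) > 1e-9 else v[:2]  # point (or direction at infinity)

# Synthetic segments that all point towards (400, 300), with slight noise.
target = np.array([400.0, 300.0])
rng = np.random.default_rng(0)
segments = []
for start in rng.uniform(0, 200, size=(6, 2)):
    direction = target - start
    segments.append((start, start + 0.5 * direction + rng.normal(0, 0.5, 2)))
print(np.round(vanishing_point(segments), 1))   # should land close to (400, 300)
```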

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 226
275 Affects Associations Analysis in Emergency Situations

Authors: Joanna Grzybowska, Magdalena Igras, Mariusz Ziółko

Abstract:

Association rule learning is an approach for discovering interesting relationships in large databases. The analysis of relations, invisible at first glance, is a source of new knowledge which can subsequently be used for prediction. We used this data mining technique (which is an automatic and objective method) to learn about interesting affect associations in a corpus of emergency phone calls. We also made an attempt to match the revealed rules with their possible situational context. The corpus was collected and subjectively annotated by two researchers. Each of the 3306 recordings contains information on emotion: (1) type (sadness, weariness, anxiety, surprise, stress, anger, frustration, calm, relief, compassion, contentment, amusement, joy), (2) valence (negative, neutral, or positive), and (3) intensity (low, typical, alternating, high). In addition, information that provides clues to the speaker's emotional state was annotated: speech rate (slow, normal, fast), characteristic vocabulary (filled pauses, repeated words) and conversation style (normal, chaotic). Exponentially many rules can be extracted from a set of items (an item is a single piece of previously annotated information). To generate the rules in the form of an implication X → Y (where X and Y are frequent k-itemsets), the Apriori algorithm was used, as it avoids performing needless computations. Then, two basic measures (support and confidence) and several additional symmetric and asymmetric objective measures (e.g. Laplace, conviction, interest factor, cosine, correlation coefficient) were calculated for each rule. Each applied interestingness measure revealed different rules, and we selected some top rules for each measure. Owing to the specificity of the corpus (emergency situations), most of the strong rules contain only negative emotions. There are, though, strong rules including neutral or even positive emotions. Three examples of the strongest rules are: {sadness} → {anxiety}; {sadness, weariness, stress, frustration} → {anger}; {compassion} → {sadness}. Association rule learning revealed the strongest configurations of affects (as well as configurations of affects with affect-related information) in our emergency phone call corpus. The acquired knowledge can be used for prediction, to fill in the emotional profile of a new caller. Furthermore, analysis of the possible context related to a rule may be a clue to the situation a caller is in.
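A minimal sketch of the rule-quality measures mentioned (support, confidence, lift/interest factor, conviction) is given below, computed for one rule X → Y over a toy set of annotated recordings: supp(X→Y) = P(X ∪ Y), conf = supp(X∪Y)/supp(X), lift = conf/supp(Y), conviction = (1 - supp(Y))/(1 - conf). The transactions are illustrative, not taken from the corpus.

```python
# Minimal sketch of support, confidence, lift and conviction for one rule X -> Y.
# The toy transactions below are illustrative stand-ins for annotated recordings.
transactions = [
    {"sadness", "anxiety", "fast speech"},
    {"sadness", "weariness", "stress", "frustration", "anger"},
    {"anger", "chaotic style"},
    {"compassion", "sadness"},
    {"calm"},
    {"sadness", "anxiety"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def rule_measures(X, Y):
    supp_xy = support(X | Y)
    conf = supp_xy / support(X)
    lift = conf / support(Y)
    conviction = float("inf") if conf == 1 else (1 - support(Y)) / (1 - conf)
    return supp_xy, conf, lift, conviction

print(rule_measures({"sadness"}, {"anxiety"}))
```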

Keywords: data mining, emergency phone calls, emotional profiles, rules

Procedia PDF Downloads 408
274 Spatial Analysis of the Socio-Environmental Vulnerability in Medium-Sized Cities: Case Study of Municipality of Caraguatatuba SP-Brazil

Authors: Katia C. Bortoletto, Maria Isabel C. de Freitas, Rodrigo B. N. de Oliveira

Abstract:

Environmental vulnerability studies are essential for prioritizing actions for disaster risk reduction. The aim of this study is to analyze the socio-environmental vulnerability obtained through a census survey, followed by both a statistical analysis (PCA in SPSS/IBM) and a spatial analysis in GIS (ArcGIS/ESRI), taking as a case study the Municipality of Caraguatatuba-SP, Brazil. In the analysis of the municipal development plan, emphasis was given to the Special Zone of Social Interest (ZEIS), the Urban Expansion Zone (ZEU) and the Environmental Protection Zone (ZPA). For the mapping of the social and environmental vulnerabilities of the study area, the exposure of people (criticality) and of the place (support capacity) facing disaster risk was obtained from the 2010 Census of the Brazilian Institute of Geography and Statistics (IBGE). For criticality, the variables of greatest influence were related to literate persons responsible for the household, literate persons 5 or more years of age, persons 60 or more years of age, and the income of the person responsible for the household. In the support capacity analysis, the predominant influences were good household infrastructure in districts with low population density and the presence of neighborhoods with little urban infrastructure and inadequate housing. The results of the comparative analysis show that the areas in the high and very high vulnerability classes cover the ZEIS and ZPA zones, whose zoning includes areas occupied by a low-income population, the presence of children and young people, irregular occupations, and land suitable for urbanization but underutilized. The presence of urban expansion zones (ZEU) in areas of high to very high socio-environmental vulnerability reflects the inadequate use of urban land in relation to the spatial distribution of the population and the territorial infrastructure, which favors an increase in disaster risk. It can be concluded that the study allowed the convergence between the vulnerability analysis and the areas classified in the urban zoning to be observed. The occupation of areas unsuitable for housing due to their risk characteristics was confirmed, leading to the conclusion that the applied methodologies are agile instruments to support actions for disaster risk reduction.
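The PCA step, performed in SPSS in the study, can be sketched in Python: standardise census indicators per tract and extract the leading components whose scores are then joined to the census-tract layer in the GIS. The indicator names and synthetic data are hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Minimal sketch of the PCA step on census indicators per tract.
# Indicator names and the synthetic data are hypothetical stand-ins.
indicators = ["literacy_head", "literacy_5plus", "age_60plus",
              "income_head", "water_supply", "sewage", "pop_density"]
rng = np.random.default_rng(0)
X = rng.random((500, len(indicators)))             # 500 census tracts

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)
scores = pca.transform(Z)                          # component scores per tract

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
# The scores can be exported and joined to the census-tract layer in ArcGIS
# for the spatial classification into vulnerability classes.
```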

Keywords: socio-environmental vulnerability, urban zoning, disaster risk reduction, methodologies

Procedia PDF Downloads 298
273 A Critical Discourse Analysis of Corporate Annual Reports in a Cross-Cultural Perspective: Views from Grammatical Metaphor and Systemic Functional Linguistics

Authors: Antonio Piga

Abstract:

The study of language strategies in financial and corporate discourse has always been vital for understanding how companies manage to communicate effectively with a wider customer base and offers new perspectives on how companies interact with key stakeholders, not only to convey transparency and an image of trustworthiness, but also to create affiliation and attract investment. In the light of Systemic Functional Linguistics, the purpose of this study is to examine and analyse the annual reports of Asian and Western joint-stock companies involved in oil refining and power generation from the point of view of the functions and frequency of grammatical metaphors. More specifically, grammatical metaphor - through the lens of Critical Discourse Analysis (CDA) - is used as a theoretical tool for analysing a synchronic cross-cultural study of the communicative strategies adopted by Asian and Western companies to communicate social and environmental sustainability and showcase their ethical values, performance and competitiveness to local and global communities and key stakeholders. According to Systemic Functional Linguistics, grammatical metaphor can be divided into two broad areas: ideational and interpersonal. This study focuses on the first type, ideational grammatical metaphor (IGM), which includes de-adjectival and de-verbal nominalisation. The dominant and more effective grammatical tropes used by Asian and Western corporations in their annual reports were examined from both a qualitative and quantitative perspective. The aim was to categorise and explain how ideational grammatical metaphor is constructed cross-culturally and presented through structural language patterns involving re-mapping between semantics and lexico-grammatical features. The results show that although there seem to be more differences than similarities in terms of the categorisation of the ideational grammatical metaphors conceptualised in the two case studies analysed, there are more similarities than differences in terms of the occurrence, the congruence of process types and the role and function of IGM. Through the immediacy and essentialism of compacting and condensing information, IGM seems to be an important linguistic strategy adopted in the rhetoric of corporate annual reports, contributing to the ideologies and actions of companies to report and promote efficiency, profit and social and environmental sustainability, thus advocating the engagement and investment of key stakeholders.

Keywords: corporate annual reports, cross-cultural perspective, ideational grammatical metaphor, rhetoric, systemic functional linguistics

Procedia PDF Downloads 50
272 [Keynote Talk]: Monitoring of Ultrafine Particle Number and Size Distribution at One Urban Background Site in Leicester

Authors: Sarkawt M. Hama, Paul S. Monks, Rebecca L. Cordell

Abstract:

Within the Joaquin project, ultrafine particles (UFP) are continuously measured at one urban background site in Leicester. The main aims are to examine the temporal and seasonal variations in UFP number concentration and size distribution in an urban environment, and to try to assess the added value of continuous UFP measurements. In addition, the relations of UFP with more commonly monitored pollutants such as black carbon (BC), nitrogen oxides (NOX), particulate matter (PM2.5), and the lung deposited surface area (LDSA) were evaluated. The effects of meteorological conditions, particularly wind speed and direction, and also temperature, on the observed distribution of ultrafine particles are detailed. The study presents the results from an experimental investigation into the particle number concentration and size distribution of UFP, BC, and NOX, with measurements taken at the Automatic Urban and Rural Network (AURN) monitoring site in Leicester. The monitoring was performed as part of the EU project JOAQUIN (Joint Air Quality Initiative) supported by the INTERREG IVB NWE program. Between November 2013 and November 2015, the total number concentrations (TNC) were measured by a water-based condensation particle counter (W-CPC, TSI model 3783), the particle number concentrations (PNC) and size distributions by an ultrafine particle monitor (UFP, TSI model 3031), the BC by a MAAP (Thermo 5012), and the NOX by a NO-NO2-NOX monitor (Thermo Scientific 42i), while a Nanoparticle Surface Area Monitor (NSAM, TSI 3550) was used to measure the LDSA (reported as μm² cm⁻³) corresponding to the alveolar region of the lung. Lower absolute values of PNC were observed in summer than in winter, which might be related mainly to particles directly emitted by traffic and to the more favorable conditions of atmospheric dispersion. Results showed a traffic-related diurnal variation of UFP, BC, NOX and LDSA, with clear morning and evening rush hour peaks on weekdays and only an evening peak at weekends. Correlation coefficients were calculated between UFP and the other pollutants (BC and NOX); the highest correlations between them were found in the winter months. Overall, the results support the notion that local traffic emissions were a major contributor to atmospheric particle pollution, and a clear seasonal pattern was found, with higher values during the cold season.

Keywords: size distribution, traffic emissions, UFP, urban area

Procedia PDF Downloads 330
271 Industry Symbiosis and Waste Glass Upgrading: A Feasibility Study in Liverpool Towards Circular Economy

Authors: Han-Mei Chen, Rongxin Zhou, Taige Wang

Abstract:

Glass is widely used in everyday life, from glass bottles for beverages to architectural glass for various forms of glazing. Although the mainstream of used glass is recycled in the UK, the single-use-then-recycle procedure results in a lot of waste, as it subjects intact glass to smashing, re-melting, and remanufacturing. These processes bring massive energy consumption and a huge loss of embodied energy and economic value compared to re-use, which moves towards a ‘zero carbon’ target. As a tourism city, Liverpool has higher glass bottle consumption than most less leisure-focused cities. It is therefore vital for Liverpool to find an upgrading approach for single-use glass bottles with low carbon output. This project aims to assess the feasibility of industrial symbiosis and an upgrading framework for glass and to investigate ways of achieving them. It is significant to Liverpool’s future industrial strategy, since it provides an opportunity to target post-COVID economic recovery through industrial symbiosis and upgraded waste management in Liverpool in response to the climate emergency. In addition, it will influence local government policy on glass bottle reuse and recycling in North West England and serve as good practice to be recommended to other areas of the UK. First, a critical literature review of glass waste strategies in the UK and of worldwide industrial symbiosis practices has been conducted. Second, mapping, data collection, and analysis have shown the current life cycle chain and the strong links of glass reuse and upgrading potentials via site visits to 16 local waste recycling centres. The results of this research have demonstrated an understanding of the influence of key factors on the development of a circular industrial symbiosis business model for beverage glass bottles. The current waste management procedures of the glass bottle industry, its business model, supply chain, and material flow have been reviewed. The various potential opportunities for glass bottle up-valuing have been investigated towards an industrial symbiosis in Liverpool. Finally, an up-valuing business model has been developed for an industrial symbiosis framework of glass in Liverpool. For glass bottles, there are two possibilities: 1) focus on upgrading processes towards re-use rather than single-use and recycling, and 2) focus on ‘smart’ re-use and recycling, leading to optimised values in other sectors to create a wider industrial symbiosis for a multi-level and circular economy.

Keywords: glass bottles, industry symbiosis, smart re-use, waste upgrading

Procedia PDF Downloads 107
270 Genome Sequencing, Assembly and Annotation of Gelidium Pristoides from Kenton-on-Sea, South Africa

Authors: Sandisiwe Mangali, Graeme Bradley

Abstract:

A genome is the complete set of an organism's hereditary information, encoded as deoxyribonucleic acid or, in many viruses, ribonucleic acid. The three different types of genomes are the nuclear, mitochondrial, and plastid genomes. Their sequences, which are uncovered by genome sequencing, serve as an archive of all genetic information and enable researchers to understand the composition of a genome and the regulation of gene expression, and also provide information on how the whole genome works. These sequences enable researchers to explore the population structure, genetic variations, and recent demographic events in threatened species. In particular, genome sequencing refers to the process of determining the exact arrangement of the nucleotide bases of a genome, and the process through which all the afore-mentioned genomes are sequenced is referred to as whole or complete genome sequencing. Gelidium pristoides is a South African endemic Rhodophyta species that has been harvested in the Eastern Cape since the 1950s for its high economic value, which is one motivation for its sequencing. Its endemism further motivates its sequencing for conservation biology, as endemic species are more vulnerable to the anthropogenic activities that endanger a species. Sequencing, mapping, and annotating the Gelidium pristoides genome is therefore the aim of this study. To accomplish this aim, the genomic DNA was extracted and quantified using the NucleoSpin Plant Kit, Qubit 2.0, and Nanodrop. Thereafter, the Ion Plus Fragment Library Kit was used to prepare a 600 bp library, which was then sequenced on the Ion S5 sequencing platform for two runs. The produced reads were quality-controlled and assembled with the SPAdes assembler using default parameters, and the genome assembly was quality-assessed with the QUAST software. From this assembly, the plastid and mitochondrial genomes were extracted using Gelidiales organellar genomes as search queries and ordered according to them using the Geneious software. The Qubit and Nanodrop instruments revealed A260/A280 and A230/A260 values of 1.81 and 1.52, respectively. A total of 30792074 reads were obtained and produced 94140 contigs, resulting in a total sequence length of 217.06 Mbp with an N50 value of 3072 bp and a GC content of 41.72%. Total lengths of 179281 bp and 25734 bp were obtained for the plastid and mitochondrial genomes, respectively. Genomic data allow a clear understanding of the genomic constituents of an organism and are valuable as foundation information for studies of individual genes and for resolving the evolutionary relationships between organisms, including Rhodophytes and other seaweeds.
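The assembly statistics reported above (number of contigs, total length, N50, GC content) can be reproduced from a contig FASTA file with a short script such as the sketch below; the file name is hypothetical, and QUAST remains the authoritative tool used in the study.

```python
def read_fasta(path):
    """Yield sequences from a FASTA file (headers discarded)."""
    seq = []
    with open(path) as handle:
        for line in handle:
            if line.startswith(">"):
                if seq:
                    yield "".join(seq)
                    seq = []
            else:
                seq.append(line.strip().upper())
        if seq:
            yield "".join(seq)

def n50(lengths):
    """Length of the contig at which half of the total assembly length is reached."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

contigs = list(read_fasta("spades_contigs.fasta"))  # hypothetical file name
lengths = [len(c) for c in contigs]
gc = sum(c.count("G") + c.count("C") for c in contigs) / max(sum(lengths), 1)
print(len(contigs), sum(lengths), n50(lengths), round(100 * gc, 2))
```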

Keywords: Gelidium pristoides, genome, genome sequencing and assembly, Ion S5 sequencing platform

Procedia PDF Downloads 151
269 Mapping Actors in Sao Paulo's Urban Development Policies: Interests at Stake in the Challenge to Sustainability

Authors: A. G. Back

Abstract:

In the context of global climate change, extreme weather events are increasingly intense and frequent, challenging the adaptability of urban space. In this sense, urban planning is a relevant instrument for addressing, in a systemic manner, the various sectoral policies capable of linking the urban agenda to the reduction of social and environmental risks. The 2014 Master Plan of the Municipality of Sao Paulo presents innovations capable of promoting the transition to sustainability in urban space. Among such innovations, the following stand out: i) promotion of density along the axes of mass transport, involving a mixture of commercial, residential, service, and leisure uses (principles related to the compact city); ii) reduction of vulnerabilities through housing policies, including regular sources of funds for social housing and land reservation in urbanized areas; iii) reservation of green areas in the city to create parks, and environmental regulations for new buildings focused on reducing heat island effects and improving urban drainage. However, long-term implementation involves distributive conflicts and may change in different political, economic, and social contexts over time. Thus, the central objective of this paper is to identify which factors limit or support the implementation of these policies; that is, to map the challenges and interests of converging and/or diverging urban actors in the sustainable urban development agenda and the resources they mobilize to support or limit these actions in the city of Sao Paulo. Recent proposals to amend the urban zoning law undermine the implementation of the Master Plan guidelines. In this context, three interest groups with different views of the city come into dispute: the real estate market, upper-middle-class neighborhood associations (‘not in my backyard’ movements), and social housing rights movements. This paper surveys the different interests and visions of these groups, taking into account their convergences, or not, with the principles of sustainable urban development. This approach seeks to fill a gap in the international literature on the causes that underpin or hinder the continued implementation of policies aimed at the transition to urban sustainability in the medium and long term.

Keywords: adaptation, ecosystem-based adaptation, interest groups, urban planning, urban transition to sustainability

Procedia PDF Downloads 122
268 Online Course of Study and Job Crafting for University Students: Development Work and Feedback

Authors: Hannele Kuusisto, Paivi Makila, Ursula Hyrkkanen

Abstract:

Introduction: There have been arguments about the skills university students should have when they graduate. Current trends argue that, as well as specific job-related skills, graduates need problem-solving, interaction, and networking skills as well as self-management skills. Skills required in working life are also considered in the Finnish national project called VALTE (short for 'prepared for working life'). The project involves 11 Finnish school organizations. As one result of this project, a five-credit independent online course in study and job engagement as well as in study and job crafting was developed at Turku University of Applied Sciences. The aim of the oral or e-poster presentation is to present the online course developed in the project. The purpose of this abstract is to present the development work of the online course and the feedback received from the pilots. Method: As the University of Turku is the leading partner of the VALTE project, the collaborative education platform ViLLE (https://ville.utu.fi, developed by the University of Turku) was chosen as the online platform for the course. Various exercise types with automatic assessment were used, for example, quizzes, multiple-choice questions, classification exercises, gap-filling exercises, model answer questions, self-assessment tasks, case tasks, and collaboration in Padlet. In addition, free material and free platforms on the Internet were used (Youtube, Padlet, Todaysmeet, and Prezi), as well as net-based questionnaires about study engagement and study crafting (made with Webropol). Three teachers with long teaching experience (including in job crafting and online pedagogy) and three students working as trainees in the project developed the content of the course. The online course was piloted twice in 2017 as an elective course for students at Turku University of Applied Sciences, a higher education institution of about 10 000 students. After both pilots, feedback from the students was gathered and the online course was further developed. Results: As a result, a functional five-credit independent online course suitable for students of different educational institutions was developed. The student feedback shows that the students themselves think that the online course really enhanced their job and study crafting skills. After the course, 91% of the students considered their knowledge of job and study engagement as well as of job and study crafting to be at a good or excellent level. About two-thirds of the students were going to exploit their knowledge significantly in the future. Students appreciated the variability and the game-like feel of the exercises as well as the opportunity to study online at the time and place they chose themselves. On a five-point scale (1 being poor and 5 being excellent), the students graded the clarity of the ViLLE platform as 4.2, the functionality of the platform as 4.0, and the ease of operating it as 3.9.

Keywords: job crafting, job engagement, online course, study crafting, study engagement

Procedia PDF Downloads 153
267 Development and Application of an Intelligent Masonry Modulation in BIM Tools: Literature Review

Authors: Sara A. Ben Lashihar

Abstract:

Heritage building information modelling (HBIM) of historical masonry buildings has expanded lately to meet urgent needs for conservation and structural analysis. Masonry structures are unique features of ancient building architecture worldwide and have special cultural, spiritual, and historical significance. However, there is a research gap regarding the reliability of the HBIM modeling process for these structures. The HBIM modeling process for masonry structures faces significant challenges due to the inherent complexity and uniqueness of their structural systems. Most of these processes are based on tracing point clouds and rarely draw on documents, archival records, or direct observation. The results of these techniques are highly abstracted models whose accuracy does not exceed LOD 200. Masonry assemblages, especially curved elements such as arches, vaults, and domes, are generally modeled with standard BIM components or in-place models, and the brick textures are input graphically. Hence, future investigation is necessary to establish a methodology for automatically generating parametric masonry components. These components should be developed algorithmically according to mathematical and geometric accuracy and the validity of the survey data. The main aim of this paper is to provide a comprehensive review of the state of the art of existing research on HBIM modeling of masonry structural elements and of the latest approaches to achieving parametric models that have both visual fidelity and high geometric accuracy. The paper reviewed more than 800 articles, proceedings papers, and book chapters focused on the keywords "HBIM and Masonry" from 2017 to 2021. The studies were downloaded from well-known, trusted bibliographic databases such as Web of Science, Scopus, Dimensions, and Lens. As a starting point, a scientometric analysis was carried out using the VOSviewer software. This software extracts the main keywords in these studies to retrieve the relevant works; it also calculates the strength of the relationships between these keywords. Subsequently, an in-depth qualitative review followed of the studies with the highest frequency of occurrence and the strongest links with the topic, according to the VOSviewer results. The qualitative review focused on the latest approaches and the future suggestions proposed in these studies. The findings of this paper can serve as a valuable reference for researchers and BIM specialists seeking to make more accurate and reliable HBIM models for historic masonry buildings.
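By way of illustration only, the sketch below shows the kind of algorithmic, parameter-driven geometry generation that the reviewed approaches aim at, here for the voussoirs of a semicircular arch; it is not taken from any of the reviewed papers, and a production HBIM workflow would build such components inside the BIM authoring environment against validated survey data.

```python
import math

def semicircular_arch_voussoirs(span, depth, n_voussoirs):
    """Return corner coordinates (x, y) of each voussoir of a semicircular arch.

    span        : clear span of the arch (intrados diameter)
    depth       : radial thickness of the arch ring
    n_voussoirs : number of stone/brick units along the ring
    """
    r_in = span / 2.0
    r_out = r_in + depth
    step = math.pi / n_voussoirs
    voussoirs = []
    for i in range(n_voussoirs):
        a0, a1 = i * step, (i + 1) * step
        # Four corners: intrados/extrados at the start angle, then at the end angle.
        corners = [(r * math.cos(a), r * math.sin(a))
                   for r, a in ((r_in, a0), (r_out, a0), (r_out, a1), (r_in, a1))]
        voussoirs.append(corners)
    return voussoirs

# Example: a 3.0 m span arch, 0.4 m thick, built from 15 voussoirs.
for corners in semicircular_arch_voussoirs(3.0, 0.4, 15):
    print([(round(x, 3), round(y, 3)) for x, y in corners])
```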

Keywords: HBIM, masonry, structure, modeling, automatic, approach, parametric

Procedia PDF Downloads 168
266 An Informative Marketing Platform: Methodology and Architecture

Authors: Martina Marinelli, Samanta Vellante, Francesco Pilotti, Daniele Di Valerio, Gaetanino Paolone

Abstract:

Any development in web marketing technology requires changes in information engineering to identify instruments and techniques suitable for the production of software applications for informative marketing. Moreover, for large web solutions, designing an interface that enables human interaction is a complex process that must bridge informative marketing requirements and the developed solution. A user-friendly interface in web marketing applications is crucial for a successful business. The paper introduces mkInfo - a software platform that implements informative marketing. Informative marketing is a new interpretation of marketing which places information at the center of every marketing action. The creative team includes software engineering researchers who have recently authored an article on automatic code generation. The authors have created the mkInfo software platform to generate informative marketing web applications. For each web application, it is possible to automatically implement an opt-in page, a landing page, a sales page, and a thank-you page: one only needs to insert the content. mkInfo implements an autoresponder to send mail according to a predetermined schedule. The mkInfo platform also includes e-commerce for a product or service. A stakeholder can access any opt-in page and get basic information about a product or service. If he wants to know more, he will need to provide an e-mail address to access a landing page, which will trigger an e-mail sequence providing him with complete information about the product or the service. From this point on, the stakeholder becomes a user and is able to purchase the product or related services through the mkInfo platform. This paper suggests a possible definition of informative marketing, illustrates its basic principles, and finally details the mkInfo platform that implements it. This paper also offers some informative marketing models, which are implemented in the mkInfo platform. Informative marketing can be applied to products or services; it is necessary to realize a web application for each product or service. The mkInfo platform enables the product or service producer to send information concerning a specific product or service to all stakeholders. In conclusion, the technical contributions of this paper are: a different interpretation of marketing based on information; a modular architecture for web applications, particularly for those with standard features such as information storage, exchange, and delivery; multiple models to implement informative marketing; and a software platform enabling the implementation of such models in a web application. Future research aims to enable stakeholders to provide information about a product or a service, so that the information gathered about a product or a service includes both the producer’s and the stakeholders’ points of view. The purpose is to create an all-inclusive management system of the knowledge regarding a specific product or service: a system that includes everything about the product or service and is able to address even unexpected questions.
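As an illustration of how an autoresponder schedule of the kind described above might be represented, the following sketch maps days after opt-in to e-mail templates; the schedule format, template names, and function are assumptions for demonstration, since the paper does not disclose mkInfo's internal design.

```python
from datetime import date, timedelta

# Hypothetical autoresponder schedule: days after opt-in -> e-mail template name.
# This is a sketch, not mkInfo's actual schedule format.
SCHEDULE = {0: "welcome", 2: "product_details", 5: "case_study", 9: "sales_offer"}

def emails_due(opt_in_date: date, today: date, already_sent: set) -> list:
    """Return the templates that should be sent today, skipping those already sent."""
    due = []
    for delay, template in sorted(SCHEDULE.items()):
        if opt_in_date + timedelta(days=delay) <= today and template not in already_sent:
            due.append(template)
    return due

print(emails_due(date(2024, 3, 1), date(2024, 3, 6), {"welcome"}))
# -> ['product_details', 'case_study']
```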

Keywords: informative marketing, opt in page, software platform, web application

Procedia PDF Downloads 127
265 Digital Image Correlation: Metrological Characterization in Mechanical Analysis

Authors: D. Signore, M. Ferraiuolo, P. Caramuta, O. Petrella, C. Toscano

Abstract:

Digital Image Correlation (DIC) is a newly developed optical technique that is spreading across all engineering sectors because it allows non-destructive estimation of the entire surface deformation without any contact with the component under analysis. These characteristics make DIC very appealing in all cases where the global deformation state must be known without using strain gauges, which are the most commonly used measuring devices. DIC is applicable to any material subjected to distortion caused by either thermal or mechanical load, allowing high-definition mapping of displacements and deformations. That is why, in the civil and transportation industries, DIC is very useful for studying the behavior of metallic as well as composite materials. DIC is also used in the medical field for the characterization of the local strain field of vascular tissue surfaces subjected to uniaxial tensile loading. DIC can be carried out in two-dimensional mode (2D DIC) if a single camera is used or in three-dimensional mode (3D DIC) if two cameras are involved. Each point of the test surface framed by the cameras can be associated with a specific pixel of the image, and the coordinates of each point are calculated knowing the relative distance between the two cameras together with their orientation. In both arrangements, when a component is subjected to a load, several images related to different deformation states are acquired through the cameras. Dedicated software analyzes the images via the mutual correlation between the reference image (obtained without any applied load) and those acquired during deformation, giving the relative displacements. In this paper, a metrological characterization of digital image correlation is performed on aluminum and composite targets, in both static and dynamic loading conditions, by comparing DIC and strain gauge measurements. In the static test, interesting results have been obtained thanks to excellent agreement between the two measuring techniques. In addition, the deformation detected by DIC is consistent with the result of an FEM simulation. In the dynamic test, DIC was able to follow the periodic deformation of the specimen with good accuracy, giving results coherent with those given by the FEM simulation. In both situations, it was seen that DIC measurement accuracy depends on several parameters, such as the optical focusing, the parameters chosen to perform the mutual correlation between the images and, finally, the reference points on the image to be analyzed. In the future, the influence of these parameters will be studied, and a method to increase the accuracy of the measurements will be developed in accordance with the requirements of industry, especially the aerospace industry.
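The mutual correlation step described above can be illustrated with a minimal integer-pixel subset-matching sketch based on zero-normalized cross-correlation (ZNCC); commercial DIC software additionally performs sub-pixel optimization and shape-function fitting, which are omitted here.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_subset(reference, deformed, top_left, size, search=10):
    """Find the integer-pixel displacement of a square subset by maximizing ZNCC."""
    r, c = top_left
    subset = reference[r:r + size, c:c + size]
    best, best_score = (0, 0), -1.0
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if rr < 0 or cc < 0 or rr + size > deformed.shape[0] or cc + size > deformed.shape[1]:
                continue
            score = zncc(subset, deformed[rr:rr + size, cc:cc + size])
            if score > best_score:
                best_score, best = score, (dr, dc)
    return best, best_score

# Synthetic check: shift a random speckle image by (3, -2) pixels and recover it.
rng = np.random.default_rng(0)
ref = rng.random((120, 120))
defo = np.roll(ref, shift=(3, -2), axis=(0, 1))
print(match_subset(ref, defo, top_left=(50, 50), size=21))
```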

Keywords: accuracy, deformation, image correlation, mechanical analysis

Procedia PDF Downloads 311
264 Spatial Mapping of Variations in Groundwater of Taluka Islamkot Thar Using GIS and Field Data

Authors: Imran Aziz Tunio

Abstract:

Islamkot is an underdeveloped sub-district (Taluka) in the Tharparkar district of Sindh province, Pakistan, located between latitude 24°25'19.79"N to 24°47'59.92"N and longitude 70° 1'13.95"E to 70°32'15.11"E. Islamkot has an arid desert climate, and the region is generally devoid of perennial rivers, canals, and streams. It is highly dependent on rainfall, which is not considered a reliable surface water source, and groundwater has been the only key source of water for many centuries. To assess groundwater potential, an electrical resistivity survey (ERS) was conducted in Islamkot Taluka. Groundwater investigations comprising 128 Vertical Electrical Soundings (VES) were carried out to determine the groundwater potential and obtain qualitative and quantitative layered resistivity parameters. The PASI Model 16 GL-N Resistivity Meter was used, employing a Schlumberger electrode configuration with half current electrode spacing (AB/2) ranging from 1.5 to 100 m and potential electrode spacing (MN/2) from 0.5 to 10 m. The data were acquired with a maximum current electrode spacing of 200 m. Data processing for the delineation of dune sand aquifers involved data inversion, and the interpretation of the inversion results was aided by forward modeling. The measured geo-electrical parameters were examined with the Interpex IX1D software, and apparent resistivity curves and synthetic model layer parameters were mapped in the ArcGIS environment using the Inverse Distance Weighting (IDW) interpolation technique. Qualitative interpretation of the vertical electrical sounding (VES) data shows that the number of geo-electrical layers in the area varies from three to four, with different resistivity values detected. Out of 128 VES model curves, 42 are three-layered and 86 are four-layered. The resistivity of the first subsurface layer (loose surface sand) varied from 16.13 Ωm to 3353.3 Ωm and its thickness from 0.046 m to 17.52 m. The resistivity of the second subsurface layer (semi-consolidated sand) varied from 1.10 Ωm to 7442.8 Ωm and its thickness from 0.30 m to 56.27 m. The resistivity of the third subsurface layer (consolidated sand) varied from 0.00001 Ωm to 3190.8 Ωm and its thickness from 3.26 m to 86.66 m. The resistivity of the fourth subsurface layer (silt and clay) varied from 0.0013 Ωm to 16264 Ωm and its thickness from 13.50 m to 87.68 m. The Dar Zarrouk parameters range as follows: longitudinal unit conductance S from 0.00024 to 19.91 mho; transverse unit resistance T from 7.34 to 40080.63 Ωm²; longitudinal resistivity RS from 1.22 to 3137.10 Ωm; and transverse resistivity RT from 5.84 to 3138.54 Ωm. The ERS data and Dar Zarrouk parameters were mapped, revealing that the study area has groundwater potential in the subsurface.
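The Dar Zarrouk parameters quoted above follow the standard definitions for a layered earth (longitudinal unit conductance S as the sum of thickness/resistivity over the layers, transverse unit resistance T as the sum of thickness times resistivity, and the average longitudinal and transverse resistivities H/S and T/H, where H is the total thickness); the sketch below computes them for an illustrative three-layer model, not for any actual Islamkot sounding.

```python
def dar_zarrouk(thicknesses, resistivities):
    """Dar Zarrouk parameters for a layered earth model above the basement.

    thicknesses  : layer thicknesses h_i in metres
    resistivities: layer resistivities rho_i in ohm-metres
    Returns (S, T, RS, RT): longitudinal unit conductance (mho),
    transverse unit resistance (ohm.m^2), and the average longitudinal
    and transverse resistivities (ohm.m).
    """
    H = sum(thicknesses)
    S = sum(h / rho for h, rho in zip(thicknesses, resistivities))
    T = sum(h * rho for h, rho in zip(thicknesses, resistivities))
    return S, T, H / S, T / H

# Illustrative three-layer model (not a measured Islamkot sounding):
# loose surface sand, semi-consolidated sand, consolidated sand.
print(dar_zarrouk([5.0, 20.0, 40.0], [300.0, 60.0, 15.0]))
```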

Keywords: electrical resistivity survey, GIS & RS, groundwater potential, environmental assessment, VES

Procedia PDF Downloads 110
263 Dose Saving and Image Quality Evaluation for Computed Tomography Head Scanning with Eye Protection

Authors: Yuan-Hao Lee, Chia-Wei Lee, Ming-Fang Lin, Tzu-Huei Wu, Chih-Hsiang Ko, Wing P. Chan

Abstract:

A computed tomography (CT) scan of the head is a good method for investigating cranial lesions. However, radiation-induced oxidative stress can accumulate in the eyes and promote carcinogenesis and cataract formation. In this regard, we aimed to protect the eyes with barium sulfate shield(s) during CT scans and to investigate the resultant image quality and radiation dose to the eye. Patients who underwent health examinations were selectively enrolled in this study in compliance with the protocol approved by the Ethics Committee of the Joint Institutional Review Board at Taipei Medical University. Participants’ brains were scanned, together with a water-based marker, by a multislice CT scanner (SOMATOM Definition Flash) under either a fixed tube current-time setting or automatic tube current modulation (TCM). The lens dose was measured with Gafchromic films, whose dose-response curve was previously fitted using thermoluminescent dosimeters, with or without a barium sulfate or bismuth-antimony shield laid above. For the assessment of image quality, CT images at slice planes covering the regions of interest on the zygomatic, orbital, and nasal bones of the head phantom, as well as the water-based marker, were used for calculating the signal-to-noise and contrast-to-noise ratios. The application of the barium sulfate and bismuth-antimony shields decreased the lens dose by 24% and 47% on average, respectively. Under topogram-based TCM, the dose-saving power of the bismuth-antimony shield was mitigated, whereas that of the barium sulfate shield was enhanced. On the other hand, the signal-to-noise and contrast-to-noise ratios of the DSCT images were decreased separately by the barium sulfate and bismuth-antimony shields, resulting in an overall reduction of the CNR. In contrast, the integration of topogram-based TCM elevated the signal difference between the ROIs on the zygomatic bones and the eyeballs while preferentially decreasing the signal-to-noise ratios upon the use of the barium sulfate shield. The results of this study indicate that the balance between eye exposure and image quality can be optimized by combining eye shields with topogram-based TCM on the multislice scanner. Eye shielding can change the photon attenuation characteristics of tissues close to the shield; the application of both shields for eye protection is hence not recommended when seeking intraorbital lesions.
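The signal-to-noise and contrast-to-noise ratios mentioned above can be computed from ROI pixel statistics as in the sketch below; the exact ROI definitions and the CNR convention used in the study are not stated, so the formulas here (mean over standard deviation for SNR, mean difference over pooled noise for CNR) are common conventions rather than the authors' own, and the HU values are synthetic.

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a region of interest: mean / standard deviation."""
    roi = np.asarray(roi, dtype=float)
    return roi.mean() / roi.std()

def cnr(roi_signal, roi_background):
    """Contrast-to-noise ratio between a target ROI and a background ROI."""
    s = np.asarray(roi_signal, dtype=float)
    b = np.asarray(roi_background, dtype=float)
    noise = np.sqrt((s.std() ** 2 + b.std() ** 2) / 2.0)
    return abs(s.mean() - b.mean()) / noise

# Synthetic HU values standing in for a zygomatic-bone ROI and an adjacent soft-tissue ROI.
bone = np.random.default_rng(1).normal(900, 40, size=500)
soft = np.random.default_rng(2).normal(60, 25, size=500)
print(round(snr(bone), 1), round(cnr(bone, soft), 1))
```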

Keywords: computed tomography, barium sulfate shield, dose saving, image quality

Procedia PDF Downloads 269