Search results for: high resolution synthetic imagery
20373 Oil Recovery Study by Low Temperature Carbon Dioxide Injection in High-Pressure High-Temperature Micromodels
Authors: Zakaria Hamdi, Mariyamni Awang
Abstract:
For the past decades, CO₂ flooding has been used as a successful method for enhanced oil recovery (EOR). However, the high mobility ratio and the fingering effect are considered important drawbacks of this process. Low-temperature injection of CO₂ into high-temperature reservoirs may improve oil recovery, but simulating multiphase flow in a non-isothermal medium is difficult, and commercial simulators are very unstable under these conditions. Furthermore, to the best of the authors' knowledge, no experimental work has been done to verify the results of the simulations and to understand the pore-scale process. In this paper, we present results of investigations on the injection of low-temperature CO₂ into a high-pressure high-temperature micromodel, with injection temperatures ranging from 34 to 75 °F. The effects of temperature and the saturation changes of the different fluids were measured in each case. The results support the proposed method: the injection of CO₂ at low temperatures significantly increased oil recovery in high-temperature reservoirs. Also, CO₂-rich phases present in the high-temperature system can improve oil recovery through better sweep of the oil, which is initially caused by penetration of liquid CO₂ (LCO₂) into the system. Furthermore, no unfavorable effect was detected using this method. Low-temperature CO₂ injection is proposed for use as early as the secondary recovery stage.
Keywords: enhanced oil recovery, CO₂ flooding, micromodel studies, miscible flooding
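The fingering drawback mentioned in the abstract is governed by the mobility ratio between the injected gas and the oil. As a minimal sketch, with illustrative viscosities and equal end-point relative permeabilities assumed by us (not the study's measurements), cooling CO₂ raises its viscosity and therefore lowers the mobility ratio:

```python
# Mobility ratio M = (kr_CO2 / mu_CO2) / (kr_oil / mu_oil).
# M >> 1 favors viscous fingering and poor sweep; cooling the CO2
# raises mu_CO2 and therefore lowers M. All numbers are assumptions.
def mobility_ratio(kr_gas, mu_gas, kr_oil, mu_oil):
    """Viscosities in Pa*s; relative permeabilities are dimensionless."""
    return (kr_gas / mu_gas) / (kr_oil / mu_oil)

MU_OIL = 1.0e-3   # assumed oil viscosity, Pa*s
KR = 0.3          # assumed equal end-point relative permeabilities

for label, mu_co2 in [("warm supercritical CO2", 3.0e-5),
                      ("cold liquid-like CO2", 8.0e-5)]:
    m = mobility_ratio(KR, mu_co2, KR, MU_OIL)
    print(f"{label}: M = {m:.1f}")
```

With equal relative permeabilities the ratio reduces to mu_oil/mu_CO2, which makes the temperature effect easy to see.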
Procedia PDF Downloads 352
20372 Multimodal Optimization of Density-Based Clustering Using Collective Animal Behavior Algorithm
Authors: Kristian Bautista, Ruben A. Idoy
Abstract:
A bio-inspired metaheuristic algorithm based on the theory of collective animal behavior (CAB) was integrated into density-based clustering modeled as a multimodal optimization problem. The algorithm was tested on synthetic, Iris, Glass, Pima, and Thyroid data sets in order to measure its effectiveness relative to the CDE-based clustering algorithm. Upon preliminary testing, it was found that one of the parameter settings used was ineffective in performing clustering when applied to the algorithm, prompting further investigation. It was revealed that fine-tuning the distance δ3, which determines the extent to which a given data point will be clustered, helped improve the quality of the cluster output. Even though the modification of distance δ3 significantly improved the solution quality and cluster output of the algorithm, results suggest that there is no difference between the population means of the solutions obtained using the original and modified parameter settings for any data set. This implies that using either the original or the modified parameter setting will not have any effect on obtaining the best global and local animal positions. Results also suggest that the CDE-based clustering algorithm is better than the CAB-density clustering algorithm for all data sets. Nevertheless, the CAB-density clustering algorithm is still a good clustering algorithm because it correctly identified the number of classes of some data sets more frequently over thirty trial runs, with a much smaller standard deviation, indicating potential for clustering high-dimensional data sets. Thus, the researcher recommends further investigation into the post-processing stage of the algorithm.
Keywords: clustering, metaheuristics, collective animal behavior algorithm, density-based clustering, multimodal optimization
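The CAB algorithm itself is not given in the abstract, but the role of a distance threshold like δ3 in density-based clustering can be illustrated with a standard stand-in: DBSCAN's `eps` parameter similarly controls how close points must be to join a cluster. A rough sketch on the Iris data (an analogy we introduce, not the authors' method):

```python
# Illustrative only: DBSCAN's eps plays a role loosely analogous to the
# distance delta3 in the abstract -- it sets how close points must be
# to be grouped into the same density-based cluster.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import load_iris

X = load_iris().data

for eps in (0.3, 0.6, 0.9):
    labels = DBSCAN(eps=eps, min_samples=5).fit_predict(X)
    n_clusters = len(set(labels) - {-1})          # -1 marks noise points
    n_noise = int(np.sum(labels == -1))
    print(f"eps={eps}: {n_clusters} clusters, {n_noise} noise points")
```

Sweeping the threshold shows how cluster count and noise assignment change, which is the kind of sensitivity the abstract reports for δ3.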
Procedia PDF Downloads 230
20371 Political Antinomy and Its Resolution in Islam
Authors: Abdul Nasir Zamir
Abstract:
After the downfall of the Ottoman Caliphate, the Muslim world scattered into many small Muslim states. Muslim leaders, intellectuals, revivalists, and modernists began trying to strengthen their nations, while some Muslims are also trying to re-establish the caliphate. Every Muslim country has its own political system, i.e., kingship, dictatorship, or democracy, etc., but these are not in the original forms that historians or political scientists describe in their studies. The laws and their practice are mixed, i.e., non-Islamic laws alongside Islamic laws, as in, e.g., Saudi Arabia (K.S.A.) and the Islamic Republic of Pakistan. There is great conflict among revivalist Muslim parties (groups) and governments about political systems. The question is whether the subject matter is Sharia itself or the political system. Leaders of modern Muslim states are accused of disbelief for neglecting revelation in their laws and decisions. There are two types of laws: Islamic laws and management laws. The conflict is that non-Islamic laws are in practice in Muslim states. Non-Islamic laws can be gradually replaced with Islamic laws through a legal and peaceful process, according to the practice of former Muslim leaders and scholars. The bloodshed of Muslims is not allowed in any case. A weak Muslim state is a blessing compared to none. The political system after Muhammad and the guided caliphs is considered kingship, but during this period Muslims not only developed science and technology but also conquered many territories. If the original aim is kept in practice, then modern Muslim states can be stabilized under different political systems. Modern Muslim states are the hope of survival, stability, and development of the Muslim Ummah. Islam does not allow armed clashes with a Muslim army or Muslim civilians. The caliphate is based on belief in one Allah Almighty and good deeds according to the Quran and Sunnah. As faith weakened and good deeds fell below the required standard, the caliphate automatically weakened and eventually ended.
The last weak caliphate was the Ottoman Caliphate, which was a hope of all the Muslims of the world. There is no caliphate or caliph present in the world today, but every Muslim country or state is like an Amarat (a part of the caliphate, or a small, alternate form of it). It is the duty of all Muslims to stabilize these modern Muslim states with tolerance.
Keywords: caliphate, conflict resolution, modern Muslim state, political conflicts, political systems, tolerance
Procedia PDF Downloads 155
20370 Brain-Computer Interfaces That Use Electroencephalography
Authors: Arda Ozkurt, Ozlem Bozkurt
Abstract:
Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measuring methods. Although it has a low spatial resolution, meaning it can only detect when a group of neurons fires at the same time rather than the activity of individual cells, it is non-invasive and easy to use, posing virtually no risk. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. The recordings of EEGs include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, because it is a non-invasive method, there are sources of noise that may affect the reliability of the EEG signals. For instance, noise from the EEG equipment and the leads, as well as signals coming from the subject, such as heart activity or muscle movements, affect the signals detected by the electrodes. New techniques have, however, been developed to differentiate between those signals and the intended ones. Furthermore, an EEG device alone is not enough to analyze the data from the brain for BCI applications. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices.
Even though invasive BCIs are needed for neurological diseases that require highly precise data, non-invasive BCIs, such as EEG-based ones, are used in many cases to help disabled people or simply to ease everyday life by assisting with basic tasks. For example, EEG is used to detect an oncoming seizure in epilepsy patients, which a BCI device can then help prevent. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed in the future.
Keywords: BCI, EEG, non-invasive, spatial resolution
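The noise-separation step described above is commonly done with band-pass filtering. A minimal, hedged sketch (synthetic signal and an assumed 250 Hz sampling rate, not the article's data) of isolating the alpha rhythm (8-12 Hz) from 50 Hz mains interference:

```python
# A common EEG preprocessing step: band-pass filter a noisy signal to
# isolate the alpha band (8-12 Hz). The signal here is synthetic.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                                  # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
# synthetic "EEG": a 10 Hz alpha rhythm buried in 50 Hz mains noise
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)

# 4th-order Butterworth band-pass, 8-12 Hz (edges normalized to Nyquist)
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
alpha = filtfilt(b, a, x)                   # zero-phase filtering

spectrum = np.abs(np.fft.rfft(alpha))
freqs = np.fft.rfftfreq(len(alpha), 1.0 / fs)
p10 = spectrum[np.argmin(np.abs(freqs - 10))]
p50 = spectrum[np.argmin(np.abs(freqs - 50))]
print(f"50 Hz / 10 Hz amplitude after filtering: {p50 / p10:.4f}")
```

`filtfilt` is used rather than a one-pass filter so the filtered rhythm is not phase-shifted, which matters when timing events such as seizure onset.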
Procedia PDF Downloads 71
20369 Recycling Biomass of Constructed Wetlands as Precursors of Electrodes for Removing Heavy Metals and Persistent Pollutants
Authors: Álvaro Ramírez Vidal, Martín Muñoz Morales, Francisco Jesús Fernández Morales, Luis Rodríguez Romero, José Villaseñor Camacho, Javier Llanos López
Abstract:
In recent times, environmental problems have led to the extensive use of biological treatment systems. Among the different types of biological systems, the use of plants, such as aquatic macrophytes in constructed wetlands (CW) and terrestrial plant species for treating polluted soils and sludge, has gained importance. Plants used in CW participate in different mechanisms for the capture and degradation of pollutants and can also retain some pharmaceutical and personal care products (PPCPs) that are very persistent in the environment. These systems thus offer advantages in line with published guidelines for the transition towards friendly and ecological procedures: they are environmentally friendly, consume little energy, and capture atmospheric CO₂. However, although the use of constructed wetlands for wastewater treatment is a well-researched domain, it presents some drawbacks, such as the slowness of pollutant degradation and the production of large amounts of plant biomass, which must be harvested and managed periodically. With this opportunity in mind, it is important to highlight that this residual biomass (of lignocellulosic nature) could be used as feedstock for the generation of carbonaceous materials, using thermochemical transformations such as slow pyrolysis or hydrothermal carbonization, to produce high-value biomass-derived carbons through sustainable processes (as adsorbents, catalysts, etc.), thereby improving the circular carbon economy. This work therefore analyzed some PPCPs commonly found in urban wastewater, such as salicylic acid and ibuprofen, to evaluate the remediation carried out by Phragmites australis.
Then, after harvesting, this biomass can be used to synthesize electrodes through hydrothermal carbonization (HTC), producing high-value biomass-derived carbons with electrocatalytic activity to remove heavy metals and persistent pollutants, promoting circular economy concepts. To do this, biomass was chosen from a natural environment at high environmental risk, the Daimiel Wetlands National Park in the center of Spain, along with biomass grown in a CW specifically designed to remove pollutants. The research emphasizes the impact of the composition of the biomass waste, and of the synthetic parameters applied during HTC, on the electrocatalytic activity. This activity can in turn be related to physicochemical properties, such as porosity, surface functionalization, conductivity, and mass transfer, of the electrocatalytic inks. Data revealed that the carbon materials synthesized have good surface properties (good conductivity and high specific surface area) that enhance the electro-oxidants generated and promote the removal of PPCPs and of the chemical oxygen demand of polluted waters.
Keywords: constructed wetlands, carbon materials, heavy metals, pharmaceutical and personal care products, hydrothermal carbonization
Procedia PDF Downloads 94
20368 Biological Monitoring: Vegetation Cover, Bird Assemblages, Rodents, Terrestrial and Aquatic Invertebrates from a Closed Landfill
Authors: A. Cittadino, P. Gantes, C. Coviella, M. Casset, A. Sanchez Caro
Abstract:
Three currently active landfills receive the waste from Buenos Aires city and the Greater Buenos Aires suburbs. One of the first landfills to receive solid waste from this area was located in Villa Dominico, some 7 km south of Buenos Aires City. With an area of some 750 ha, including riparian habitats, divided into 14 cells, it received solid waste from June 1979 through February 2004. In December 2010, a biological monitoring program was set up by CEAMSE and Universidad Nacional de Lujan, still operational to date. The aim of the monitoring program is to assess the state of several biological groups within the landfill and to follow their dynamics over time in order to identify early signs, if any, of damage the landfill activities might cause to the biota present. Bird and rodent populations, aquatic and terrestrial invertebrate populations, cell vegetation coverage, and the vegetation coverage and main composition of surrounding areas are followed by quarterly samplings. Bird species richness and abundance were estimated by observation along walked transects in each environment. A total of 74 different bird species were identified. Species richness and diversity were high both for the riparian surrounding areas and within the landfill. Several grassland bird species typical of the Pampa were found within the landfill, as well as some migratory and endangered species. Sherman and Tomahawk traps are set overnight for small-mammal sampling. Rodent populations are just above detection limits, and the few specimens captured belong mainly to species common to rural areas rather than city-dwelling species. The two marsupial species present in the region were captured on occasion. Aquatic macroinvertebrates were sampled in a watercourse upstream and downstream of the outlet of the landfill's wastewater treatment plant and are used to follow water quality using biological indices.
Water quality ranged between weak and severe pollution; benthic invertebrates sampled upstream and downstream of the landfill show no significant differences in water quality using the IBMWP index. Insect biota from yellow sticky cards and pitfall traps comprised over 90 different morphospecies, with Shannon diversity indices running from 1.9 to 3.9, strongly affected by season. An easy-to-perform method requiring no expert knowledge was used to assess vegetation coverage, at two scales of determination: field observation (1 m resolution) and Google Earth images (which allow better than 5 m resolution). Over the eight-year period of the study, vegetation coverage of the landfill cells ran from a low of 83% to 100% on different cells, with an average between 95 and 99% for the entire landfill depending on season. Surrounding-area vegetation showed almost 100% coverage during the entire period, with an average density of 2 to 6 species per square meter and no signs of leachate-damaged vegetation.
Keywords: biological indicators, biota monitoring, landfill species diversity, waste management
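The Shannon diversity index reported for the insect traps is computed as H' = -Σ pᵢ ln pᵢ over the proportional abundances pᵢ. A minimal sketch with hypothetical trap counts (the study's raw counts are not given in the abstract):

```python
# Shannon diversity index H' = -sum(p_i * ln(p_i)), with p_i the
# proportion of individuals belonging to morphospecies i.
import math

def shannon(counts):
    """H' from a list of per-species individual counts (zeros ignored)."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# hypothetical trap counts for five morphospecies
counts = [40, 25, 15, 12, 8]
print(f"H' = {shannon(counts):.2f}")
```

With a perfectly even community of S species, H' equals ln(S), which gives a useful upper bound for interpreting values like the 1.9-3.9 range above.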
Procedia PDF Downloads 139
20367 Enhancement of Fracture Toughness for Low-Temperature Applications in Mild Steel Weldments
Authors: Manjinder Singh, Jasvinder Singh
Abstract:
Existing analyses of the Titanic/Liberty ship and Sydney bridge accidents, together with practical experience, generated interest in developing weldments that have high toughness under sub-zero temperature conditions. The purpose was to protect the joint from undergoing the ductile-to-brittle transition (DBT) when ambient temperatures reach sub-zero levels. Metallurgical improvements, such as lowering carbon content or adding deoxidizing elements like Mn and Si, were effective in preventing fracture (cracking) in weldments at low temperature. In the present research, an attempt has been made to investigate the reason behind the ductile-to-brittle transition of mild steel weldments subjected to sub-zero temperatures and a method for its mitigation. Nickel is added to the weldments using manual metal arc welding (MMAW) to prevent the DBT, though Charpy impact values still decrease progressively as the temperature is lowered. The variation in toughness with respect to the nickel content added to the weld pool is analyzed quantitatively to evaluate the rise in toughness with increasing nickel. The impact performance of the welded specimens was evaluated by Charpy V-notch impact tests at various temperatures (20 °C, 0 °C, -20 °C, -40 °C, -60 °C). A notch is made in the weldments because notch-sensitive failure is particularly likely to occur at zones of high stress concentration caused by a notch. The effect of nickel on the weldments at various temperatures was then studied by mechanical and metallurgical tests. It was noted that a large gain in impact toughness could be achieved by adding nickel. The highest absorbed impact energy (462 J), in combination with good impact toughness (over 220 J at -60 °C), was achieved with an alloying content of 16 wt.% nickel. Based on metallurgical behavior, it was concluded that the weld metals solidify as austenite with increasing nickel. The microstructure was characterized using optical microscopy and high-resolution scanning electron microscopy (SEM).
At inter-dendritic regions, mainly martensite was found. In dendrite core regions of the low-carbon weld metals, a mixture of upper bainite, lower bainite, and a novel constituent, coalesced bainite, formed. Coalesced bainite is characterized by large bainitic ferrite grains with cementite precipitates and is believed to form when the bainite and martensite start temperatures are close to each other. Mechanical properties could be rationalized in terms of microstructural constituents as a function of nickel content.
Keywords: MMAW, toughness, DBT, notch, SEM, coalesced bainite
Procedia PDF Downloads 526
20366 The Cut-Off Value of TG/HDL Ratio of High Pericardial Adipose Tissue
Authors: Nam-Seok Joo, Da-Eun Jung, Beom-Hee Choi
Abstract:
Background and Objectives: Recently, the triglyceride/high-density lipoprotein cholesterol (TG/HDL) ratio and pericardial adipose tissue (PAT) have gained attention as indicators related to metabolic syndrome (MS). To date, there has been no research on the relationship between TG/HDL and PAT, so we aimed to investigate the association between them. Methods: In this cross-sectional study, we investigated 627 patients who underwent coronary multidetector computed tomography and assessed their metabolic parameters. We divided subjects into two groups according to the cut-off PAT volume associated with MS, 142.2 cm³, and compared metabolic parameters between the groups. We divided the TG/HDL ratio into tertiles according to Log(TG/HDL) and compared PAT-related parameters by analysis of variance. Finally, we applied logistic regression analysis to obtain the odds ratio of high PAT (PAT volume ≥ 142.2 cm³) in each tertile, and we performed receiver operating characteristic (ROC) analysis to derive the cut-off of the TG/HDL ratio for high PAT. Results: The mean TG/HDL ratio of the high-PAT-volume group was 3.6, and the TG/HDL ratio had a strong positive correlation with various metabolic parameters. In addition, among the Log(TG/HDL) tertile groups, the higher tertiles had more metabolic derangements, including PAT, and showed higher odds ratios of having high PAT (OR = 4.10 in the second tertile group and OR = 5.06 in the third tertile group, respectively) after adjustment for age, sex, and smoking. The ROC-derived cut-off of the TG/HDL ratio for increased PAT was 1.918 (p < 0.001). Conclusion: The TG/HDL ratio and high PAT volume have a significant positive correlation, and a higher TG/HDL ratio was associated with high PAT. The cut-off value of the TG/HDL ratio for high PAT was 1.918.
Keywords: triglyceride, high-density lipoprotein, pericardial adipose tissue, cut-off value
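A cut-off like the reported 1.918 is typically the ROC threshold maximizing Youden's J (sensitivity + specificity - 1). A hedged sketch on synthetic data (the distributions below are our assumptions, not the study's cohort):

```python
# Derive an optimal classification cut-off from a ROC curve via Youden's J.
# The two groups are synthetic stand-ins for "normal" and "high PAT".
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# assumed TG/HDL ratios: lower in the normal group, higher with high PAT
ratio = np.concatenate([rng.normal(1.5, 0.5, 300),   # normal PAT
                        rng.normal(3.0, 1.0, 150)])  # high PAT
high_pat = np.concatenate([np.zeros(300), np.ones(150)])

fpr, tpr, thresholds = roc_curve(high_pat, ratio)
cutoff = float(thresholds[np.argmax(tpr - fpr)])     # maximize Youden's J
print(f"optimal TG/HDL cut-off on this synthetic data: {cutoff:.2f}")
```

The resulting threshold lands between the two group means; on real data the same procedure would yield the published cut-off.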
Procedia PDF Downloads 15
20365 Detection of High Fructose Corn Syrup in Honey by Near Infrared Spectroscopy and Chemometrics
Authors: Mercedes Bertotto, Marcelo Bello, Hector Goicoechea, Veronica Fusca
Abstract:
The National Service of Agri-Food Health and Quality (SENASA) controls honey to detect contamination by synthetic or natural chemical substances and establishes and controls the traceability of the product. The utility of near-infrared spectroscopy for the detection of adulteration of honey with high-fructose corn syrup (HFCS) was investigated. First, a mixture of different authentic artisanal Argentinian honeys was prepared to cover as much heterogeneity as possible. Then, mixtures were prepared by adding different concentrations of HFCS to samples of the honey pool. A total of 237 samples were used: 108 were authentic honey, and 129 corresponded to honey adulterated with 1-10% HFCS. They were stored unrefrigerated from the time of production until scanning and were not filtered after receipt in the laboratory. Immediately prior to spectral collection, the honey was incubated at 40 °C overnight to dissolve any crystalline material, manually stirred to achieve homogeneity, and adjusted to a standard solids content (70 °Brix) with distilled water. Adulterant solutions were also adjusted to 70 °Brix. Samples were measured by NIR spectroscopy in the range of 650 to 7000 cm⁻¹. The technique of specular reflectance was used, with a lens aperture range of 150 mm. Pretreatment of the spectra was performed by Standard Normal Variate (SNV). The ant colony optimization genetic algorithm sample selection (ACOGASS) graphical interface was used, under MATLAB version 5.3, to select the variables with the greatest discriminating power. The data set was divided into a validation set and a calibration set using the Kennard-Stone (KS) algorithm. A combined method of Potential Functions (PF) and Partial Least Squares Linear Discriminant Analysis (PLS-DA) was chosen.
Different estimators of the predictive capacity of the model, obtained using a decreasing number of groups (which implies more demanding validation conditions), were compared. The optimal number of latent variables was selected as the number associated with the minimum error and the smallest number of unassigned samples. Once the optimal number of latent variables was defined, the model was applied to the training samples. With the model calibrated on the training samples, the validation samples were studied. The calibrated model combining Potential Functions and PLS-DA can be considered reliable and stable, since its performance on future samples is expected to be comparable to that achieved for the training samples. By use of PF and PLS-DA classification, authentic honey and honey adulterated with HFCS could be identified with a correct classification rate of 97.9%. The results showed that NIR in combination with the PF and PLS-DA methods can be a simple, fast, and low-cost technique for the detection of HFCS in honey, with high sensitivity and power of discrimination.
Keywords: adulteration, multivariate analysis, potential functions, regression
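The SNV pretreatment named above is a simple row-wise operation: each spectrum is centered and scaled by its own mean and standard deviation, removing multiplicative scatter effects. A minimal sketch with synthetic spectra (the study's NIR data are not available):

```python
# Standard Normal Variate (SNV): normalize each spectrum by its own
# mean and standard deviation, so spectra differing only by a scale
# factor become identical.
import numpy as np

def snv(spectra):
    """Row-wise SNV: (x - mean(x)) / std(x) for each spectrum."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# two toy "spectra"; the second is a scaled copy of the first
X = np.array([[1.0, 2.0, 3.0, 4.0],
              [10.0, 20.0, 30.0, 40.0]])
Xs = snv(X)
print(Xs)
```

After SNV, both rows are identical, which is exactly the scatter-correction behavior that makes SNV a standard first step before PLS-DA modeling.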
Procedia PDF Downloads 125
20364 Changes in When and Where People Are Spending Time in Response to COVID-19
Authors: Nicholas Reinicke, Brennan Borlaug, Matthew Moniot
Abstract:
The COVID-19 pandemic has resulted in a significant change in driving behavior as people respond to the new environment. However, existing methods for analyzing driver behavior, such as travel surveys and travel demand models, are not suited for incorporating abrupt environmental disruptions. To address this, we analyze a set of high-resolution trip data and introduce two new metrics for quantifying driving behavioral shifts as a function of time, allowing us to compare the time periods before and after the pandemic began. We apply these metrics to the Denver, Colorado metropolitan statistical area (MSA) to demonstrate the utility of the metrics. Then, we present a case study for comparing two distinct MSAs, Louisville, Kentucky, and Des Moines, Iowa, which exhibit significant differences in the makeup of their labor markets. The results indicate that although the regions of study exhibit certain unique driving behavioral shifts, emerging trends can be seen when comparing between seemingly distinct regions. For instance, drivers in all three MSAs are generally shown to have spent more time at residential locations and less time in workplaces in the time period after the pandemic started. In addition, workplaces that may be incompatible with remote working, such as hospitals and certain retail locations, generally retained much of their pre-pandemic travel activity.
Keywords: COVID-19, driver behavior, GPS data, signal analysis, telework
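The authors' two metrics are not specified in the abstract, but a computation in their spirit, the share of dwell time spent at each location type before versus after a cut-over date, can be sketched as follows (toy data and an invented schema, not the paper's dataset or definitions):

```python
# Hedged sketch: per-period share of dwell hours by place type.
# The DataFrame schema and the cut date are our assumptions.
import numpy as np
import pandas as pd

stays = pd.DataFrame({
    "date": pd.to_datetime(["2020-02-01", "2020-02-01",
                            "2020-04-01", "2020-04-01"]),
    "place": ["home", "work", "home", "work"],
    "hours": [14.0, 9.0, 20.0, 3.0],
})
cut = pd.Timestamp("2020-03-15")            # assumed pandemic onset

stays["period"] = np.where(stays["date"] >= cut, "post", "pre")
totals = stays.groupby(["period", "place"])["hours"].sum()
share = totals / totals.groupby(level="period").transform("sum")
print(share.round(3))
```

On the toy data, the "home" share rises after the cut date, mirroring the residential shift the abstract reports.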
Procedia PDF Downloads 111
20363 Investigating the Role of Dystrophin in Neuronal Homeostasis
Authors: Samantha Shallop, Hakinya Karra, Tytus Bernas, Gladys Shaw, Gretchen Neigh, Jeffrey Dupree, Mathula Thangarajh
Abstract:
Abnormal neuronal homeostasis is considered a structural correlate of the cognitive deficits in Duchenne muscular dystrophy (DMD). Neurons are highly polarized cells with multiple dendrites but a single axon. Trafficking of cellular organelles is highly regulated, with cargo in the somatodendritic region of the neuron not permitted to enter the axonal compartment. We investigated the molecular mechanisms that regulate organelle trafficking in neurons using a multimodal approach, including high-resolution structured illumination, proteomics, immunohistochemistry, and computational modeling. We examined the expression of ankyrin-G, the master regulator controlling neuronal polarity. The expression of ankyrin-G and the morphology of the axon initial segment were profoundly abnormal in CA1 hippocampal neurons in the mdx52 animal model of DMD. Ankyrin-G colocalized with the kinesin KIF5a, the anterograde protein transporter, at higher levels in older mdx52 mice than in younger mdx52 mice. These results suggest that functional trafficking from the somatodendritic compartment is abnormal. Our data suggest that dystrophin deficiency compromises neuronal homeostasis via ankyrin-G-based mechanisms.
Keywords: neurons, axonal transport, duchenne muscular dystrophy, organelle transport
Procedia PDF Downloads 95
20362 High Techno-Parks in the Economy of Azerbaijan and Their Management Problems
Authors: Rasim M. Alguliyev, Alovsat G. Aliyev, Roza O. Shahverdiyeva
Abstract:
The paper investigates the role and position of high techno-parks, which are among the priorities of Azerbaijan. The main objectives, functions, and features of the establishment of high techno-parks, as well as the organization of the activity of the structural elements of the park complex and their interactions, were analyzed. The development, organization, and management of high techno-parks were studied. The key features and functions of the management of innovative structures were explained. The need for a comprehensive management system for the development of high techno-parks was emphasized, and the major problems were analyzed. In addition, methods were proposed for the development of information systems supporting decision-making in the systematic and sustainable management of the parks.
Keywords: innovative development, innovation processes, innovation economy, innovation infrastructure, high technology park, efficient management, management decisions, information insurance
Procedia PDF Downloads 472
20361 Embedded Semantic Segmentation Network Optimized for Matrix Multiplication Accelerator
Authors: Jaeyoung Lee
Abstract:
Autonomous driving systems require high reliability to provide people with a safe and comfortable driving experience. However, despite the development of a number of vehicle sensors, it is difficult to always provide high perception performance in driving environments that vary by time of day and season. Image segmentation using deep learning, which has recently evolved rapidly, stably provides high recognition performance in various road environments. However, since the system controls a vehicle in real time, a highly complex deep learning network cannot be used due to time and memory constraints. Moreover, efficient networks are optimized for GPU environments, which degrades their performance on embedded processors equipped with simple hardware accelerators. In this paper, a semantic segmentation network, the matrix multiplication accelerator network (MMANet), optimized for the matrix multiplication accelerator (MMA) on Texas Instruments digital signal processors (TI DSPs), is proposed to improve the recognition performance of autonomous driving systems. The proposed method is designed to maximize the number of layers that can be executed in a limited time, to provide reliable driving-environment information in real time. First, the number of channels in the activation map is fixed to fit the structure of the MMA. By increasing the number of parallel branches, the lack of information caused by fixing the number of channels is resolved. Second, an efficient convolution is selected depending on the size of the activation. Since the MMA size is fixed, normal convolution may be more efficient than depthwise separable convolution, depending on memory access overhead. Thus, the convolution type is decided according to the output stride to increase network depth. In addition, memory access time is minimized by processing operations only in the L3 cache. Lastly, reliable contexts are extracted using extended atrous spatial pyramid pooling (ASPP).
The suggested method gets stable features from an extended path by increasing the kernel size and accessing consecutive data. In addition, it consists of two ASPPs to obtain high-quality contexts using the restored shape, without global average pooling paths, since the layer uses the MMA as a simple adder. To verify the proposed method, an experiment was conducted using perfsim, a timing simulator, and the Cityscapes validation set. The proposed network can process an image with 640 x 480 resolution in 6.67 ms, so six cameras can be used to identify the surroundings of the vehicle at 20 frames per second (FPS). In addition, it achieves 73.1% mean intersection over union (mIoU), the highest recognition rate among embedded networks, on the Cityscapes validation set.
Keywords: edge network, embedded network, MMA, matrix multiplication accelerator, semantic segmentation network
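The mIoU figure quoted above is the standard segmentation metric: per-class intersection over union, averaged over the classes present. A minimal sketch on toy labels (not the authors' evaluation code):

```python
# Mean intersection-over-union (mIoU) for semantic segmentation:
# for each class, |pred AND target| / |pred OR target|, then average.
import numpy as np

def mean_iou(pred, target, num_classes):
    """mIoU over classes that appear in either prediction or target."""
    ious = []
    for c in range(num_classes):
        inter = np.sum((pred == c) & (target == c))
        union = np.sum((pred == c) | (target == c))
        if union > 0:                     # skip classes absent from both
            ious.append(inter / union)
    return float(np.mean(ious))

# toy flattened label maps with 3 classes
pred = np.array([0, 0, 1, 1, 2, 2])
target = np.array([0, 1, 1, 1, 2, 0])
print(f"mIoU = {mean_iou(pred, target, 3):.3f}")
```

Averaging per-class IoU rather than raw pixel accuracy prevents large classes (road, sky) from masking failures on small but safety-critical ones (pedestrians, signs).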
Procedia PDF Downloads 129
20360 Tunable Crystallinity of Zinc Gallogermanate Nanoparticles via Organic Ligand-Assisted Biphasic Hydrothermal Synthesis
Authors: Sarai Guerrero, Lijia Liu
Abstract:
Zinc gallogermanate (ZGGO) is a persistent phosphor that can emit in the near infrared (NIR) range once dopped with Cr³⁺ enabling its use for in-vivo deep-tissue bio-imaging. Such a property also allows for its application in cancer diagnosis and therapy. Given this, work into developing a synthetic procedure that can be done using common laboratory instruments and equipment as well as understanding ZGGO overall, is in demand. However, the ZGGO nanoparticles must have a size compatible for cell intake to occur while still maintaining sufficient photoluminescence. The nanoparticle must also be made biocompatible by functionalizing the surface for hydrophilic solubility and for high particle uniformity in the final product. Additionally, most research is completed on doped ZGGO, leaving a gap in understanding the base form of ZGGO. It also leaves a gap in understanding how doping affects the synthesis of ZGGO. In this work, the first step of optimizing the particle size via the crystalline size of ZGGO was done with undoped ZGGO using the organic acid, oleic acid (OA) for organic ligand-assisted biphasic hydrothermal synthesis. The effects of this synthesis procedure on ZGGO’s crystallinity were evaluated using Powder X-Ray Diffraction (PXRD). OA was selected as the capping ligand as experiments have shown it beneficial in synthesizing sub-10 nm zinc gallate (ZGO) nanoparticles as well as palladium nanocrystals and magnetite (Fe₃O₄) nanoparticles. Later it is possible to substitute OA with a different ligand allowing for hydrophilic solubility. Attenuated Total Reflection Fourier-Transform Infrared (ATR-FTIR) was used to investigate the surface of the nanoparticle to investigate and verify that OA had capped the nanoparticle. PXRD results showed that using this procedure led to improved crystallinity, comparable to the high-purity reagents used on the ZGGO nanoparticles. There was also a change in the crystalline size of the ZGGO nanoparticles. 
ATR-FTIR showed that once capped, ZGGO cannot be annealed, as doing so degrades the OA. These results indicate that the new procedure positively affects the crystallinity of ZGGO nanoparticles. The results are also repeatable, implying the procedure is a reliable route to highly crystalline ZGGO nanoparticles. With this completed, the next step will be substituting the OA with a hydrophilic ligand. As these ligands affect the solubility of the nanoparticle as well as the pH at which the nanoparticles can dissolve, further research is needed to determine which ligand is best suited for preparing ZGGO for bio-imaging.
Keywords: biphasic hydrothermal synthesis, crystallinity, oleic acid, zinc gallogermanate
Procedia PDF Downloads 133
20359 First and Second Order Gm-C Filters
Authors: Rana Mahmoud
Abstract:
This paper presents a systematic study of operational transconductance amplifier-capacitor (OTA-C) filters, often called Gm-C filters. OTA-C filters have received great attention over the last decades. Because Gm-C filters operate in an open-loop topology, they are flexible enough to perform at both low and high frequencies, and can therefore be used in various wireless communication applications. Another property of Gm-C filters is their electronic tunability: different filter frequency characteristics can be obtained without changing inductance and resistance values. This is achieved with an OTA (operational transconductance amplifier) and a capacitor; by tuning the OTA transconductance, the cut-off frequency is tuned and different frequency responses are achieved. Different high-order analog filters can be designed using Gm-C stages, including low-pass, high-pass, and band-pass filters. First- and second-order low-pass, high-pass, and band-pass filters are presented in this paper.
Keywords: Gm-C, filters, low-pass, high-pass, band-pass
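The electronic tunability described in this abstract follows from the first-order Gm-C low-pass relation f_c = gm/(2πC). A minimal numerical sketch (illustrative component values, not the paper's circuit):

```python
import math

def gmc_lowpass_cutoff(gm, c):
    """First-order Gm-C low-pass: H(s) = (gm/C) / (s + gm/C).
    The -3 dB cutoff is f_c = gm / (2*pi*C)."""
    return gm / (2 * math.pi * c)

def gmc_lowpass_mag(f, gm, c):
    """Magnitude of the first-order low-pass response at frequency f (Hz)."""
    w0 = gm / c
    w = 2 * math.pi * f
    return w0 / math.sqrt(w ** 2 + w0 ** 2)

# Example: gm = 100 uS, C = 10 pF
fc = gmc_lowpass_cutoff(100e-6, 10e-12)   # about 1.59 MHz
# Doubling gm doubles the cutoff: the electronic tunability noted above.
```

Because the cutoff scales linearly with gm, retuning the OTA bias retunes the whole response without touching any passive component.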
Procedia PDF Downloads 130
20358 Hydrochemical Contamination Profiling and Spatial-Temporal Mapping with the Support of Multivariate and Cluster Statistical Analysis
Authors: Sofia Barbosa, Mariana Pinto, José António Almeida, Edgar Carvalho, Catarina Diamantino
Abstract:
The aim of this work was to test a methodology able to generate spatial-temporal maps that synthesize, simultaneously, the trends of distinct hydrochemical indicators in an old radium-uranium tailings dam deposit. Dimensionality reduction by principal component analysis, followed by data aggregation through clustering analysis, allows distinct hydrochemical behavioural profiles to be identified and synthetic evolutionary hydrochemical maps to be generated.
Keywords: contamination plume migration, K-means of PCA scores, groundwater and mine water monitoring, spatial-temporal hydrochemical trends
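The PCA-then-clustering pipeline (the "K-means of PCA scores" named in the keywords) can be sketched with synthetic data; the indicator layout and values below are assumptions for illustration, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monitoring data: rows = water samples, columns = hydrochemical
# indicators (e.g. pH, sulphate, uranium, radium); values are synthetic.
X = np.vstack([rng.normal(0, 1, (30, 4)),    # background-like profile
               rng.normal(4, 1, (30, 4))])   # contamination-like profile

# 1) Standardize, then PCA via SVD: the scores are U * S
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = (U * S)[:, :2]                      # keep the first two components

# 2) k-means (plain Lloyd iterations) on the PCA scores
k = 2
centers = scores[[0, len(scores) - 1]]       # seed one center at each extreme
for _ in range(50):
    labels = np.argmin(((scores[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == j].mean(axis=0) for j in range(k)])

# 'labels' partitions the samples into distinct hydrochemical behavioural
# profiles, which can then be mapped in space and time.
```

In practice each sample carries coordinates and a date, so the cluster label per sample is what gets rendered as the synthetic evolutionary map.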
Procedia PDF Downloads 235
20357 Tall Building Transit-Oriented Development (TB-TOD) and Energy Efficiency in Suburbia: Case Studies, Sydney, Toronto, and Washington D.C.
Authors: Narjes Abbasabadi
Abstract:
As the world continues to urbanize and suburbanize, with suburbanization associated with mass sprawl being the dominant form of this expansion, sustainable development challenges become a greater concern. Sprawl, characterized by low density and automobile dependency, raises significant environmental issues regarding energy consumption and CO2 emissions. This paper examines the vertical expansion of suburbs integrated into mass transit nodes as a planning strategy for boosting density, intensifying land use, converting single-family homes to multifamily dwellings or mixed-use buildings, and developing viable alternative transportation choices. It analyzes the spatial patterns of tall building transit-oriented development (TB-TOD) in the suburban regions of Sydney (Australia), Toronto (Canada), and Washington D.C. (United States). The main objective of this research is to understand how the new morphology of suburban tall buildings, the physical dimensions of individual buildings, and their arrangement at a larger scale relate to energy efficiency. This study aims to answer three questions: 1) Why and how can vertical or high-rise development be integrated into suburban settings? 2) How can this phenomenon contribute to an overall denser development of suburbs? 3) Which spatial patterns or typologies/sub-typologies of the TB-TOD model have the greatest energy efficiency? It addresses these questions by focusing on 1) energy, specifically heat energy demand (excluding cooling and lighting) related to design issues at two levels, macro (urban scale) and micro (individual buildings): physical dimension, height, morphology, the spatial pattern of tall buildings, and their relationship with each other and with transport infrastructure; and 2) examining TB-TOD to provide more evidence of how the model works regarding ridership.
The findings show that the TB-TOD model can be identified as the most appropriate spatial pattern for tall buildings in suburban settings, and among the TB-TOD typologies/sub-typologies, compact tall building blocks are the most energy efficient. This model is associated with much lower energy demand in buildings at the neighborhood level as well as lower transport needs at the urban scale, while detached suburban high-rise or low-rise suburban housing has the lowest energy efficiency. The research methodology is a quantitative study drawing on the available literature and statistical data as well as mapping and visual documentation of urban regions from sources such as Google Earth, Microsoft Bing bird's-eye view, and Street View imagery. Each suburb within each city is examined through satellite imagery to identify the typologies/sub-typologies that are morphologically distinct. The study quantifies the heat energy efficiency of different spatial patterns through simulation via GIS software.
Keywords: energy efficiency, spatial pattern, suburb, tall building transit-oriented development (TB-TOD)
Procedia PDF Downloads 260
20356 A Study on the Construction Process and Sustainable Renewal Development of High-Rise Residential Areas in Chongqing (1978-2023)
Authors: Xiaoting Jing, Ling Huang
Abstract:
Since the reform and opening up, Chongqing has, over more than 40 years of urban construction, developed far more high-rise residential areas than other cities. High-rise residential areas have become one of the main modern living models in Chongqing and an important carrier of the city's high quality of life. Reviewing the construction process and renewal work helps in understanding the characteristics of high-rise residential areas in Chongqing at different stages, clarifying current development demands, and looking ahead to the focus of future renewal work. Against the socio-economic development and policy background, the article sorts the construction of high-rise residential areas in Chongqing into four stages: the early experimental construction period (1978-1996), the rapid start-up period of high-rise commodity housing construction (1997-2006), the large-scale construction period of high-rise commodity housing and public rental housing (2007-2014), and the period of renewal and renovation of high-rise residential areas and step-by-step construction of quality commodity housing (2015-present). Based on the construction demands and main construction types of each stage, the article concludes that the construction of high-rise residential areas in Chongqing is characterized by large scale, high speed, and high density. It points out that the many high-rise residential areas built after 2000 will become important objects of renewal and renovation in the future. Building on existing renewal experience, it is urgent to explore a path of sustainable renewal and development in terms of policy mechanisms, digital supervision, and renewal and renovation models, leading high-rise living in Chongqing toward high-quality development.
Keywords: high-rise residential areas, construction process, renewal and renovation, Chongqing
Procedia PDF Downloads 67
20355 The Mechanical Strength and Durability of High Performance Concrete Using Local Materials
Authors: I. Guemidi, Y. Abdelaziz, T. Rikioui
Abstract:
In this work, an experimental investigation was carried out to evaluate the mechanical and durability properties of high performance concretes (HPC) containing local southwest Algerian materials. The mechanical properties were assessed from the compressive strength and the flexural strength, whilst the durability characteristics were investigated in terms of sulphate attack. The results obtained allow us to conclude that it is possible to make a high performance concrete (HPC) based on materials available in the local market, if these are carefully selected and properly mixed in such a way as to optimize the grain size distribution.
Keywords: durability, high performance concrete, high strength, local materials, southwest Algeria, sulphate attack
Procedia PDF Downloads 390
20354 Modeling of the Attitude Control Reaction Wheels of a Spacecraft in Software in the Loop Test Bed
Authors: Amr AbdelAzim Ali, G. A. Elsheikh, Moutaz M. Hegazy
Abstract:
Reaction wheels (RWs) are generally used as the main actuators in the attitude control system (ACS) of a spacecraft (SC) for fast orientation and high pointing accuracy. In order to achieve the required accuracy of the RW model, the main characteristics of the RWs that necessitate analysis during the ACS design phase (technical features, operating sequence, and RW control logic) are included in a function (behavior) model. A mathematical model is developed that includes the various error sources. The errors in control torque include relative error, absolute error, and error due to time delay, while the errors in angular velocity are due to differences between average and real speed, resolution error, looseness in the installation of the angular sensor, and synchronization errors. The friction torque model captures the different features of the friction phenomenon: steady-velocity friction, static friction and break-away torque, and frictional lag. The model response is compared with the experimental torque and frequency-response characteristics of tested RWs. Based on the created RW model, some criteria for the optimization-based control torque allocation problem can be recommended, such as avoiding zero-speed crossing, biasing the angular velocity, or preventing a wheel from running at the same angular velocity as another.
Keywords: friction torque, reaction wheels modeling, software in the loop, spacecraft attitude control
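The friction features listed in the abstract (steady-velocity friction, static friction and break-away torque) are commonly captured with a Stribeck-type curve; the following is a generic sketch with assumed parameter values, not the paper's fitted model:

```python
import math

def friction_torque(omega, t_coulomb=0.002, t_breakaway=0.004,
                    omega_s=1.0, b=1e-5):
    """Stribeck-type friction sketch (assumed form and constants):
    Coulomb + break-away (Stribeck) + viscous terms, in N*m.
    omega: wheel speed in rad/s."""
    if omega == 0.0:
        return 0.0  # below break-away, friction just balances applied torque
    stribeck = (t_breakaway - t_coulomb) * math.exp(-(abs(omega) / omega_s) ** 2)
    return math.copysign(t_coulomb + stribeck, omega) + b * omega

# Near zero speed the magnitude approaches the break-away torque, and the
# sign flips with the direction of rotation; this discontinuity at the
# zero crossing is why torque allocation schemes avoid zero-speed crossing.
```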
Procedia PDF Downloads 266
20353 Pozzolanic Properties of Synthetic Zeolites as Materials Used for the Production of Building Materials
Authors: Joanna Styczen, Wojciech Franus
Abstract:
Currently, cement production reaches 3-6 Gt per year. The production of one ton of cement is associated with the emission of 0.5 to 1 ton of carbon dioxide into the atmosphere, which means that this process is responsible for about 5% of global CO2 emissions. Simply improving the cement manufacturing process is not enough. An effective solution is the use of pozzolanic materials, which can partly replace clinker and thus reduce energy consumption and emission of pollutants, while giving mortars the desired characteristics by shaping their microstructure. Pozzolanic additives modify the phase composition of cement, reducing the amount of portlandite and changing the CaO/SiO2 ratio in the C-S-H phase. Zeolites are a pozzolanic additive that is not commonly used. Three types of zeolites were synthesized in this work: Na-A, sodalite, and ZSM-5 (these zeolites come from three different structural groups). The zeolites were obtained by hydrothermal synthesis from fly ash in an aqueous NaOH solution. Then, the pozzolanicity of the obtained materials was assessed. The pozzolanic activity of the synthesized zeolites was tested by chemical methods in accordance with the ASTM C 379-65 standard. The method consists of determining the percentage of active ingredients (soluble silicon oxide and aluminum oxide) in alkaline solutions, i.e., those that are potentially reactive towards calcium hydroxide. The highest amount of active silica was found in zeolite ZSM-5 (88.15%), with a small amount of active Al2O3 (1%). The lowest pozzolanic activity was found in the Na-A zeolite (active SiO2: 4.4%, active Al2O3: 2.52%). Tests carried out using XRD, SEM, XRF, and textural analysis showed that the obtained zeolites are characterized by high porosity, which makes them a valuable addition to mortars.
Keywords: pozzolanic properties, hydration, zeolite, alite
Procedia PDF Downloads 78
20352 Revolutionizing Oil Palm Replanting: Geospatial Terrace Design for High-precision Ground Implementation Compared to Conventional Methods
Authors: Nursuhaili Najwa Masrol, Nur Hafizah Mohammed, Nur Nadhirah Rusyda Rosnan, Vijaya Subramaniam, Sim Choon Cheak
Abstract:
Replanting in oil palm cultivation is vital, as it enables the introduction of new planting materials and provides an opportunity to improve the roads, drainage, terrace design, and planting density. Oil palm replanting is fundamentally necessary every 25 years. The adoption of a digital replanting blueprint is imperative, as it can assist the Malaysian oil palm industry in addressing challenges such as labour shortages and limited replanting expertise. Effective replanting planning should commence at least 6 months prior to the actual replanting process; this study therefore helps to plan and design the replanting blueprint with a high-precision translation onto the ground. With the advancement of geospatial technology, it is now feasible to engage in thoroughly researched planning, which can help maximize the potential yield. A blueprint designed before replanting enhances management's ability to optimize the planting program, address manpower issues, and even increase productivity. In the terrace planting blueprints, geographic tools were utilized to design the roads, drainage, terraces, and planting points based on the ARM standards. These designs are mapped with location information and undergo statistical analysis. The geospatial approach is essential to precision agriculture and to ensuring an accurate translation of the design onto the ground by implementing high-accuracy technologies. In this study, geospatial and remote sensing technologies played a vital role: LiDAR data were employed to derive the digital elevation model (DEM), enabling the precise selection of terraces, while ortho imagery was used for validation. Throughout the design process, Geographic Information System (GIS) tools were extensively utilized.
To assess the design's reliability on the ground compared with the current conventional method, high-precision GPS instruments such as the EOS Arrow Gold and HIPER VR GNSS were used, both offering accuracy levels between 0.3 cm and 0.5 cm. A nearest distance analysis was generated to compare the design with the actual planting on the ground. The analysis revealed that it could not be applied to the roads, because discrepancies between the actual roads and the blueprint design resulted in minimal variance. In contrast, the terraces closely adhered to the GPS markings, with the largest variance distance being less than 0.5 m compared to the terraces actually constructed. Considering the required slope for terrace planting, which must be greater than 6 degrees, the study found that approximately 65% of the terracing was constructed on a 12-degree slope, while over 50% of the terracing was constructed on slopes exceeding the minimum. Blueprint-based replanting is a promising strategy for optimizing land utilization in agriculture: it harnesses technology and meticulous planning to yield advantages including increased efficiency, enhanced sustainability, and cost reduction. Practical implementation of this technique can lead to tangible and significant improvements in the agricultural sector. To boost efficiency further, future initiatives will require more sophisticated techniques and the incorporation of precision GPS devices in upcoming blueprint replanting projects; this strategic progression aims to guarantee the precision of both the blueprint design stage and its subsequent implementation in the field. Looking ahead, automating digital blueprints is necessary to reduce time, workforce, and costs in commercial production.
Keywords: replanting, geospatial, precision agriculture, blueprint
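The nearest distance analysis described here can be sketched as a point-to-nearest-point computation in a projected (metric) coordinate system; the coordinates below are hypothetical, not survey data from the study:

```python
import numpy as np

def nearest_distances(actual_pts, design_pts):
    """For each as-built point (e.g. a GPS-surveyed terrace point), return the
    distance to the nearest blueprint design point. Coordinates are assumed
    to be in a projected CRS, in metres; the data here are illustrative."""
    actual = np.asarray(actual_pts, float)
    design = np.asarray(design_pts, float)
    d = np.linalg.norm(actual[:, None, :] - design[None, :, :], axis=-1)
    return d.min(axis=1)

# Hypothetical example: design terrace points on a 9 m spacing vs surveyed points
design = [(0.0, 0.0), (9.0, 0.0), (18.0, 0.0)]
actual = [(0.3, 0.2), (9.1, -0.4), (17.8, 0.1)]
dev = nearest_distances(actual, design)
# Each deviation here stays under the 0.5 m variance reported in the abstract.
```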
Procedia PDF Downloads 82
20351 In-silico Antimicrobial Activity of Bioactive Compounds of Ricinus communis against DNA Gyrase of Staphylococcus aureus as Molecular Target
Authors: S. Rajeswari
Abstract:
Medicinal plant extracts and their bioactive compounds have been used for antimicrobial activities and have significant remedial properties. In recent years, a wide range of investigations have been carried out throughout the world to confirm the antimicrobial properties of different medicinally important plants. A number of plants showed efficient antimicrobial activities comparable to those of synthetic standard drugs or antimicrobial agents. The large family Euphorbiaceae contains nearly 300 genera and 7,500 species, and one of them is Ricinus communis, the castor plant, which has high traditional and medicinal value for a disease-free, healthy life. Traditionally the plant is used as a laxative, purgative, fertilizer, and fungicide, and it also possesses beneficial effects such as antioxidant, antihistamine, antinociceptive, antiasthmatic, antiulcer, immunomodulatory, antidiabetic, hepatoprotective, anti-inflammatory, antimicrobial, and many other medicinal properties. These activities are due to important phytochemical constituents such as flavonoids, saponins, glycosides, alkaloids, and steroids. The present study covers the phytochemical properties of Ricinus communis and predicts its antimicrobial activity using DNA gyrase of Staphylococcus aureus as the molecular target. Docking results for various chemical compounds of Ricinus communis against DNA gyrase of Staphylococcus aureus, obtained with Maestro 9.8 (Schrödinger), show that the phytochemicals are effective against the target protein. Our studies suggest that phytochemicals from Ricinus communis such as indican (G-score 4.98) and suplopin-2 (G-score 5.74) can be used as lead molecules against Staphylococcus infections.
Keywords: euphorbiaceae, antimicrobial activity, Ricinus communis, Staphylococcus aureus
Procedia PDF Downloads 479
20350 Topographic Characteristics Derived from UAV Images to Detect Ephemeral Gully Channels
Authors: Recep Gundogan, Turgay Dindaroglu, Hikmet Gunal, Mustafa Ulukavak, Ron Bingner
Abstract:
A majority of total soil losses in agricultural areas can be attributed to ephemeral gullies caused by heavy rains in conventionally tilled fields; however, ephemeral gully erosion is often ignored in conventional soil erosion assessments. Ephemeral gullies are often easily filled by normal soil tillage operations, which makes capturing existing ephemeral gullies in croplands difficult. This study was carried out to determine topographic features, including slope, aspect, compound topographic index (CTI), and the initiation points of gully channels, using images obtained from an unmanned aerial vehicle (UAV). The study area was located in the Topçu stream watershed in the eastern Mediterranean region, where intense rainfall events occur over very short time periods. The slope varied between 0.7 and 99.5%, and the average slope was 24.7%. A multi-propeller hexacopter UAV was used as the carrier platform, and images were obtained with an RGB camera mounted on the UAV. The digital terrain models (DTMs) of the Topçu stream micro-catchment produced using UAV images and manual field Global Positioning System (GPS) measurements were compared to assess the accuracy of the UAV-based measurements. Eighty-one gully channels were detected in the study area. The mean slope and CTI values in the micro-catchment obtained from DTMs generated using UAV images were 19.2% and 3.64, respectively; both slope and CTI values were lower than those obtained using GPS measurements. The total length and volume of the gully channels were 868.2 m and 5.52 m³, respectively. Topographic characteristics and information on ephemeral gully channels (locations of initiation points, volume, and length) were estimated with high accuracy from the UAV images.
The results reveal that UAV-based measuring techniques can be used in lieu of existing GPS and total station techniques by using images obtained with high-resolution UAVs.
Keywords: aspect, compound topographic index, digital terrain model, initial gully point, slope, unmanned aerial vehicle
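The compound topographic index used in this abstract is conventionally CTI = ln(A_s / tan β); a small sketch, assuming slope in degrees and specific catchment area in m² per unit contour width:

```python
import math

def cti(specific_catchment_area_m2, slope_deg):
    """Compound topographic index (topographic wetness index):
    CTI = ln(A_s / tan(beta)), where A_s is the specific catchment area
    and beta is the local slope angle."""
    beta = math.radians(slope_deg)
    return math.log(specific_catchment_area_m2 / math.tan(beta))

# A cell with a large upslope area on a gentle slope has a high CTI, which is
# exactly where concentrated flow and ephemeral gully initiation are expected.
cti(100.0, 2.0)    # gentle slope: high wetness index
cti(100.0, 24.7)   # the catchment's mean slope: lower index
```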
Procedia PDF Downloads 114
20349 Method for Improving ICESAT-2 ATL13 Altimetry Data Utility on Rivers
Authors: Yun Chen, Qihang Liu, Catherine Ticehurst, Chandrama Sarker, Fazlul Karim, Dave Penton, Ashmita Sengupta
Abstract:
The application of ICESAT-2 altimetry data in river hydrology critically depends on the accuracy of the mean water surface elevation (WSE) at a virtual station (VS), where satellite observations intersect the water. The ICESAT-2 track generates multiple VSs as it crosses different water bodies. The difficulties are particularly pronounced in large river basins, where many tributaries and meanders are often adjacent to each other. One challenge is to split the photon segments along a beam so as to accurately partition them and extract only the true representative water height for each element. As far as we can establish, there is no automated procedure for making this distinction; earlier studies have relied on human intervention or river masks, and both approaches are unsatisfactory where the number of intersections is large and river width/extent changes over time. We describe here an automated approach called "auto-segmentation". The accuracy of our method was assessed by comparison with river water level observations at 10 different stations on 37 different dates along the Lower Murray River, Australia. The congruence is very high and without detectable bias. In addition, we compared different outlier removal methods for the mean WSE calculation at VSs after the auto-segmentation process. All four outlier removal methods perform almost equally well, with the same R² value (0.998) and only subtle variations in RMSE (0.181-0.189 m) and MAE (0.130-0.142 m). Overall, the auto-segmentation method developed here is an effective and efficient approach to deriving accurate mean WSE at river VSs, and it facilitates the application of ICESAT-2 ATL13 altimetry to rivers much better than previously reported approaches.
Therefore, the findings of our study will make a significant contribution towards the retrieval of hydraulic parameters, such as water surface slope along the river, water depth at cross sections, and river channel bathymetry, for calculating flow velocity and discharge from remotely sensed imagery at large spatial scales.
Keywords: lidar sensor, virtual station, cross section, mean water surface elevation, beam/track segmentation
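The abstract does not name its four outlier removal methods; the sketch below applies three generic robust filters (z-score, IQR, MAD) to simulated photon heights and shows how a mean WSE at a VS would then be computed:

```python
import numpy as np

def mean_wse(heights, method="mad", k=3.0):
    """Mean water surface elevation after outlier removal. The filters here
    are common robust choices, not necessarily the paper's four methods."""
    h = np.asarray(heights, float)
    if method == "zscore":
        keep = np.abs(h - h.mean()) <= k * h.std()
    elif method == "iqr":
        q1, q3 = np.percentile(h, [25, 75])
        iqr = q3 - q1
        keep = (h >= q1 - 1.5 * iqr) & (h <= q3 + 1.5 * iqr)
    else:  # median absolute deviation
        med = np.median(h)
        mad = np.median(np.abs(h - med))
        keep = np.abs(h - med) <= k * 1.4826 * mad
    return h[keep].mean()

# Simulated segment: water returns near 3.20 m plus a few land/noise photons
heights = np.r_[np.random.default_rng(1).normal(3.20, 0.05, 200),
                [6.5, 7.1, -2.0]]
wse = {m: mean_wse(heights, m) for m in ("zscore", "iqr", "mad")}
# The three filters agree to within a few centimetres, echoing the finding
# that the choice of outlier removal method barely changes the mean WSE.
```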
Procedia PDF Downloads 62
20348 Leveraging Remote Sensing Information for Drought Disaster Risk Management
Authors: Israel Ropo Orimoloye, Johanes A. Belle, Olusola Adeyemi, Olusola O. Ololade
Abstract:
With more than 100,000 orbits during the past 20 years, Terra has significantly improved our knowledge of the Earth's climate and of the implications of human activity and natural disasters, including drought events, for societies and ecosystems. Given the performance of Terra's instruments and the free distribution of its products, this study utilised Terra MOD13Q1 satellite data to assess drought events and their spatiotemporal patterns over the Free State Province of South Africa between 2001 and 2019 for the summer, autumn, winter, and spring seasons. The study also used high-resolution downscaled climate change projections under three representative concentration pathways (RCPs). Three future periods comprising the short term (2030s), medium term (2040s), and long term (2050s) are analysed against the current period to understand the potential magnitude of projected climate change-related drought. The study revealed that 2001 and 2016 witnessed extreme drought conditions, with the drought index between 0 and 20% across the entire province during summer, while 2003, 2004, 2007, and 2015 saw severe drought conditions across the region, with variation from one part to another. The results show that between latitudes -24.5° and -25.5° the area witnessed a decrease in precipitation (80 to 120 mm) across the time slices, and an increase between latitudes -26° and -28° S for the summer seasons, most prominent in 2041 to 2050. This study emphasizes the strong spatio-environmental impacts within the province and highlights the factors associated with high drought stress risk, especially for the environment and ecosystems.
This study contributes to a disaster risk framework by identifying areas for specific research and adaptation activities on drought disaster risk, and for environmental planning in the study area, which is characterised by both rural and urban contexts, to address climate change-related drought impacts.
Keywords: remote sensing, drought disaster, climate scenario, assessment
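A drought index on a 0-100% scale derived from MODIS NDVI, such as the Vegetation Condition Index (VCI), matches the 0-20% extreme-drought band quoted above; whether the study used VCI specifically, and these exact class thresholds, are assumptions for illustration:

```python
def vci(ndvi_t, ndvi_min, ndvi_max):
    """Vegetation Condition Index for one pixel and epoch:
    VCI = 100 * (NDVI - NDVI_min) / (NDVI_max - NDVI_min),
    where min/max are the historical extremes for that pixel and season."""
    return 100.0 * (ndvi_t - ndvi_min) / (ndvi_max - ndvi_min)

def drought_class(v):
    """One common VCI classification scheme (assumed, not the paper's)."""
    if v < 10:
        return "extreme drought"
    if v < 20:
        return "severe drought"
    if v < 30:
        return "moderate drought"
    if v < 40:
        return "mild drought"
    return "no drought"
```

Applied per pixel to the MOD13Q1 NDVI time series, this yields seasonal drought maps of the kind summarised in the abstract.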
Procedia PDF Downloads 187
20347 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard-coded rules have become ineffective against the evolving patterns of fraud seen in recent times. Against this background, the present study probes distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, such as gradient boosting, random forests, and neural networks, and recommend integrating these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome this challenge, we designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify suspicious activities. Using actual financial statistics, we compare the performance of our model in accuracy, speed, and adaptability against conventional models. The results illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain, and they alert banks and related financial institutions to the urgency of rapidly implementing these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
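The synthetic-data-plus-supervised-learning step can be sketched with a random forest on generated transactions; the feature set and the planted fraud pattern below are illustrative assumptions, not the study's schema:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000

# Synthetic transactions (illustrative features)
amount = rng.lognormal(3.0, 1.0, n)   # transaction amount
hour = rng.integers(0, 24, n)         # hour of day
burst = rng.poisson(2, n)             # transactions in the prior 24 h

# Planted fraud pattern: large night-time amounts during activity bursts,
# plus ~1% random label noise so the problem is not perfectly separable
fraud = (amount > 60) & ((hour < 5) | (hour > 22)) & (burst > 3)
y = (fraud | (rng.random(n) < 0.01)).astype(int)

X = np.column_stack([amount, hour, burst])
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
accuracy = clf.score(Xte, yte)   # the planted pattern is easily recovered
```

For real transaction streams, accuracy alone is misleading at ~1% fraud prevalence; precision/recall on the fraud class would be the metrics to report.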
Procedia PDF Downloads 42
20346 Electrical Properties of Cement-Based Piezoelectric Nanoparticles
Authors: Moustafa Shawkey, Ahmed G. El-Deen, H. M. Mahmoud, M. M. Rashad
Abstract:
Piezoelectric-based cement nanocomposites are a promising technology for generating an electric charge upon mechanical stress of a concrete structure. Moreover, piezoelectric nanomaterials play a vital role in providing accurate structural health monitoring (SHM) of concrete structures. In light of increasing awareness of environmental protection and the energy crisis, generating renewable and green energy from cement based on piezoelectric nanomaterials has attracted the attention of researchers. Herein, we introduce a facile synthesis of bismuth ferrite nanoparticles (BiFeO3 NPs) as a piezoelectric nanomaterial via a sol-gel strategy. The fabricated piezoelectric nanoparticles are uniformly distributed in the cement-based nanocomposites at different ratios. The morphology was characterized by field emission scanning electron microscopy (FESEM) and high-resolution transmission electron microscopy (HR-TEM), and the crystal structure was confirmed using X-ray diffraction (XRD). The ferroelectric and magnetic behaviours of the BiFeO3 NPs were investigated, and the dielectric constant (εr) of the prepared cement nanocomposite samples was calculated. Intercalating BiFeO3 NPs into cement materials achieved remarkable results as a piezoelectric cement material, with distinct enhancement of the ferroelectric and magnetic properties. Overall, the present study introduces an effective approach to improving the electrical properties of cement-based applications.
Keywords: piezoelectric nanomaterials, cement technology, bismuth ferrite nanoparticles, dielectric
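The abstract does not state how εr was extracted; the standard parallel-plate reduction from measured capacitance is sketched below, with hypothetical sample dimensions:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def relative_permittivity(capacitance_f, thickness_m, area_m2):
    """Parallel-plate estimate of the relative dielectric constant:
    eps_r = C * d / (eps0 * A). Sample geometry below is illustrative,
    not the paper's specimens."""
    return capacitance_f * thickness_m / (EPS0 * area_m2)

# Hypothetical disc sample: 50 pF across a 2 mm thick, 4 cm^2 electrode area
relative_permittivity(50e-12, 2e-3, 4e-4)   # roughly 28
```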
Procedia PDF Downloads 248
20345 Determination of Slope of Hilly Terrain by Using Proposed Method of Resolution of Forces
Authors: Reshma Raskar-Phule, Makarand Landge, Saurabh Singh, Vijay Singh, Jash Saparia, Shivam Tripathi
Abstract:
For any construction project, slope calculations are necessary in order to evaluate constructability on site: the slope of parking lots, sidewalks, and ramps, the slope of sanitary sewer lines, and the slope of roads and highways. When slopes and grades are to be determined, designers are concerned with establishing proper slopes and grades for their projects, to assess cut-and-fill volumes and determine pipe inverts. Several established instruments are commonly used to determine slopes, such as the dumpy level, Abney level or hand level, inclinometer, and tacheometer, as well as the Henry method, and surveyors are very familiar with using these instruments to calculate slopes. However, they have drawbacks that cannot be neglected in major surveying works. Firstly, they require expert surveyors and skilled staff. Accessibility, visibility, and accommodation in remote hilly terrain are difficult for these instruments and surveying teams. Determining gentle slopes for road and sewer drainage construction in congested urban places with these instruments is also not easy. This paper aims to develop a method that requires minimum field work, minimum instrumentation, no high-end technology or software, and low cost. Using only basic and handy surveying accessories (a plane table with a fixed weighing machine, standard weights, an alidade, a tripod, and ranging rods), the method should be able to determine the terrain slope in congested areas as well as in remote hilly terrain. Being simple and easy to understand and perform, the method can also be readily taught to local people in rural areas. The idea behind the proposed method is based on the principle of resolution of weight components: when an object of standard weight W is placed on an inclined surface with a weighing machine below it, the weighing machine measures only the cosine component of its weight.
The slope can then be determined from the relation between the true (actual) weight and the apparent weight. A proper procedure is followed, which includes locating the site, centering and sighting, fixing the whole set at the identified station, and finally taking the readings. A set of slope determination experiments, for mild and moderate slopes, was carried out by the proposed method and by a theodolite, both in a controlled environment (on the college campus) and in an uncontrolled environment (an actual site). The slopes determined by the proposed method were compared with those determined by the established instruments. For example, it was observed that, for the same distances on a mild slope, the difference between the slope obtained by the proposed method and by the established method ranged from 4' at a distance of 8 m to 2°15'20" at a distance of 16 m in the uncontrolled environment. Thus, for mild slopes, the proposed method is suitable for distances of 8 m to 10 m. The correlation between the proposed method and the established method is good, ranging from 0.91 to 0.99 for the various combinations of mild and moderate slope with the controlled and uncontrolled environments.
Keywords: surveying, plane table, weight component, slope determination, hilly terrain, construction
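The weight-resolution principle stated above gives θ = arccos(W_apparent / W_true); a direct sketch:

```python
import math

def slope_from_weights(true_weight, apparent_weight):
    """Slope angle from the resolution-of-forces principle: a weighing
    machine on an incline reads the normal (cosine) component,
    W_apparent = W_true * cos(theta), so theta = arccos(W_apparent / W_true).
    Returns the slope in degrees."""
    ratio = apparent_weight / true_weight
    if not 0.0 < ratio <= 1.0:
        raise ValueError("apparent weight must be positive and <= true weight")
    return math.degrees(math.acos(ratio))

# A 10 kg standard weight reading 9.96 kg on the incline:
slope_from_weights(10.0, 9.96)   # roughly a 5-degree slope
```

Note that cos θ is nearly flat near 0°, so resolving gentle slopes demands a very precise weighing machine, which is consistent with the method being most reliable over short distances on mild slopes.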
Procedia PDF Downloads 96
20344 Application of Machine Learning on Google Earth Engine for Forest Fire Severity, Burned Area Mapping and Land Surface Temperature Analysis: Rajasthan, India
Authors: Alisha Sinha, Laxmi Kant Sharma
Abstract:
Forest fires are a recurring issue in many parts of the world, including India. These fires can have various causes, including human activities (such as agricultural burning, campfires, or discarded cigarettes) and natural factors (such as lightning). This study presents a comprehensive methodology for assessing wildfire susceptibility across Rajasthan, India, by integrating diverse environmental variables and leveraging cutting-edge machine learning techniques. The primary goal of the study is to utilize Google Earth Engine to compare locations in Sariska National Park, Rajasthan (India), before and after forest fires. High-resolution satellite data were used to assess the extent and types of changes caused by forest fires. The study analyzes various environmental variables, i.e., slope orientation, elevation, normalized difference vegetation index (NDVI), drainage density, precipitation, and temperature, to understand landscape characteristics and assess wildfire susceptibility. In addition, a random forest regression model is used to predict land surface temperature based on a set of environmental parameters.
Keywords: wildfire susceptibility mapping, LST, random forest, GEE, MODIS, climatic parameters
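The random forest regression step can be sketched with scikit-learn on synthetic stand-ins for the listed predictors; the functional form of LST below is an assumption for illustration only, not the study's fitted relationship:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# Synthetic stand-ins for the predictors named in the abstract
elevation = rng.uniform(100, 900, n)   # m
ndvi = rng.uniform(0.05, 0.80, n)      # vegetation index
precip = rng.uniform(200, 700, n)      # mm/yr
slope = rng.uniform(0, 30, n)          # degrees

# Assumed relationship for the sketch only: LST falls with elevation,
# greenness, and precipitation, plus observation noise (degrees C)
lst = (45.0 - 0.01 * elevation - 12.0 * ndvi - 0.005 * precip
       + rng.normal(0.0, 0.5, n))

X = np.column_stack([elevation, ndvi, precip, slope])
Xtr, Xte, ytr, yte = train_test_split(X, lst, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr, ytr)
r2 = model.score(Xte, yte)   # coefficient of determination on held-out cells
```

In the actual workflow the same fit would run on per-pixel samples exported from Google Earth Engine, with MODIS LST as the target.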
Procedia PDF Downloads 22