Search results for: tabular data
25199 Generative AI: A Comparison of Conditional Tabular Generative Adversarial Networks and Conditional Tabular Generative Adversarial Networks with Gaussian Copula in Generating Synthetic Data with Synthetic Data Vault
Authors: Lakshmi Prayaga, Chandra Prayaga, Aaron Wade, Gopi Shankar Mallu, Harsha Satya Pola
Abstract:
Synthetic data generated by Generative Adversarial Networks (GANs) and autoencoders is becoming more common as a way to combat the problem of insufficient data for research purposes. However, generating synthetic data is a tedious task requiring an extensive mathematical and programming background. Open-source platforms such as the Synthetic Data Vault (SDV) and Mostly AI offer user-friendly tools that make it accessible for non-technical professionals to generate synthetic data and augment existing data for further analysis. The SDV also provides additions to the generic GAN, such as the Gaussian copula. We present the results from two synthetic data sets (CTGAN data and CTGAN with Gaussian copula) generated by the SDV and report the findings. The results indicate that the ROC and AUC values for the data generated by adding the layer of Gaussian copula are considerably higher than for the data generated by the CTGAN alone.
Keywords: synthetic data generation, generative adversarial networks, conditional tabular GAN, Gaussian copula
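As an editorial illustration, the Gaussian copula idea the abstract builds on can be sketched without the SDV library itself: map each marginal to normal scores via ranks, capture the dependence as a correlation in Gaussian space, then sample correlated normals and map back through the empirical quantiles. This is a minimal sketch of the technique, not the SDV implementation; all function names here are invented for this example.

```python
import random
from statistics import NormalDist

def pearson(u, v):
    """Plain Pearson correlation (kept dependency-free on purpose)."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den

def gaussian_copula_sample(x, y, n, seed=0):
    """Draw n synthetic (x, y) pairs that keep each marginal's empirical
    distribution and the rank dependence, via a Gaussian copula."""
    rng, nd, m = random.Random(seed), NormalDist(), len(x)

    def scores(v):
        # Rank-based normal scores: the empirical CDF pushed through the
        # inverse standard-normal CDF.
        order = sorted(range(m), key=lambda i: v[i])
        s = [0.0] * m
        for rank, i in enumerate(order):
            s[i] = nd.inv_cdf((rank + 0.5) / m)
        return s

    rho = max(-1.0, min(1.0, pearson(scores(x), scores(y))))
    sx, sy = sorted(x), sorted(y)
    out = []
    for _ in range(n):
        a, b = rng.gauss(0, 1), rng.gauss(0, 1)
        u, v = a, rho * a + (1.0 - rho * rho) ** 0.5 * b  # correlated normals
        # Map back through each variable's empirical quantile function.
        i = min(m - 1, int(nd.cdf(u) * m))
        j = min(m - 1, int(nd.cdf(v) * m))
        out.append((sx[i], sy[j]))
    return out
```

Because the synthetic values are drawn from the observed quantiles, each marginal is preserved exactly, while the copula carries only the dependence structure.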
Procedia PDF Downloads 82
25198 DeepNic, A Method to Transform Each Variable into Image for Deep Learning
Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.
Abstract:
Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image where each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods which use all the variables to construct a single image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expressions on an Affymetrix chip.
Keywords: tabular data, deep learning, perfect trees, NICs
Procedia PDF Downloads 90
25197 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data are available regarding a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. By applying a single neural network to multimodal data, e.g., both structured and unstructured information, significant advantages in terms of time complexity and energy efficiency can be achieved. Converting structured data into images and merging them with existing visual material offers a promising approach for applying CNNs to multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data are transformed into image representations in which the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The resulting composite image is then analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
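For illustration, the preprocessing the abstract describes can be sketched in a few lines: min-max normalize each feature of a record, paint the values as grayscale pixels, and stack the resulting tile with the existing image so one CNN input carries both modalities. This is a toy sketch under those assumptions; the function names and the pixel layout are invented for this example.

```python
import math

def record_to_image(record, mins, maxs, side=None):
    """Map one tabular record to a square grayscale image: each feature
    becomes one pixel whose intensity is its min-max normalized value."""
    n = len(record)
    side = side or math.ceil(math.sqrt(n))
    pixels = [0] * (side * side)          # unused cells stay black
    for i, v in enumerate(record):
        span = maxs[i] - mins[i] or 1.0   # avoid division by zero
        pixels[i] = round(255 * (v - mins[i]) / span)
    return [pixels[r * side:(r + 1) * side] for r in range(side)]

def fuse(tabular_img, photo_img):
    """Stack the tabular image below the existing visual material so one CNN
    input carries both modalities (rows zero-padded to a common width)."""
    width = max(len(photo_img[0]), len(tabular_img[0]))
    pad = lambda row: row + [0] * (width - len(row))
    return [pad(r) for r in photo_img] + [pad(r) for r in tabular_img]
```

A real pipeline would express features as richer shapes and colors, as the abstract notes, but the normalize-then-tile step is the common core.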
Procedia PDF Downloads 123
25196 DeepNIC, a Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs
Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.
Abstract:
Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple convolutional neural networks (CNNs)? Will DL become the definitive tool for data classification? All current solutions consist of repositioning the variables in a 2D matrix using their correlation proximity; doing so yields an image whose pixels are the variables. We implement a technology, DeepNIC, that instead produces an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision tree, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable along three dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on two hyperparameters used in the Neurops. By varying these two hyperparameters, we obtain a 2D matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR; the total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels.
The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the GSE22513 public data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison across several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format, which opens up great perspectives in the analysis of metadata.
Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification
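The internals of the NICs are not given in the abstract, but the combination step it describes (logical AND/OR/XOR over probability maps, rendered as pixel intensities) can be sketched with the standard probabilistic analogues of those operators. Everything below is an illustrative assumption, not the authors' algorithm; the map contents, the operator definitions, and the tiling scheme are invented for this example.

```python
from itertools import combinations

def combine(p, q, op):
    """Probabilistic analogues of the logical operators named in the abstract."""
    if op == "AND":
        return p * q
    if op == "OR":
        return p + q - p * q
    if op == "XOR":
        return p + q - 2 * p * q
    raise ValueError(op)

def nic_image(nic_maps, ops=("AND", "OR", "XOR")):
    """Tile every pairwise combination of NIC probability maps into row blocks
    of one grayscale image; pixel intensity is proportional to probability."""
    image = []
    for op in ops:
        for a, b in combinations(nic_maps, 2):
            image.extend(
                [round(255 * combine(p, q, op)) for p, q in zip(ra, rb)]
                for ra, rb in zip(a, b)
            )
    return image
```

With 10 maps and three operators the number of stacked blocks grows combinatorially, which is consistent with the very large per-variable images the abstract reports.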
Procedia PDF Downloads 125
25195 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography
Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya
Abstract:
In today’s era, data security is an important concern and among the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it, whereas steganography hides the data in some cover file so that the presence of communication itself is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated up to a certain extent by using these techniques. These techniques could be used in banks, intelligence agencies such as RAW, etc., where highly confidential data are transferred. Finally, comparisons of the two techniques are also given in tabular form.
Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography
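To make the steganography half concrete, the common least-significant-bit (LSB) scheme can be sketched in a few lines: the ciphertext bytes (which DES or RSA would supply) are written into the lowest bit of successive pixels, so each pixel value changes by at most one. This is a generic textbook sketch, not the paper's MATLAB implementation; the 16-bit length header is this sketch's own convention.

```python
def embed_lsb(pixels, payload):
    """Hide payload bytes in the least-significant bits of grayscale pixels.
    A 16-bit big-endian length header precedes the payload bits."""
    bits = [(len(payload) >> i) & 1 for i in range(15, -1, -1)]
    for byte in payload:
        bits += [(byte >> i) & 1 for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = list(pixels)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b   # each pixel changes by at most 1
    return stego

def extract_lsb(pixels):
    """Recover the hidden bytes from the pixel LSBs."""
    n = 0
    for i in range(16):
        n = (n << 1) | (pixels[i] & 1)
    data = bytearray()
    for k in range(n):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[16 + 8 * k + i] & 1)
        data.append(byte)
    return bytes(data)
```

Because only the lowest bit of each pixel is touched, the cover image is visually unchanged, which is what conceals the presence of the communication.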
Procedia PDF Downloads 290
25194 Geographic Information System Cloud for Sustainable Digital Water Management: A Case Study
Authors: Mohamed H. Khalil
Abstract:
Water is one of the most crucial elements influencing human lives and development. Over the last few years, GIS has played a significant role in optimizing water management systems, especially after exponential development in this sector. In this context, the Egyptian government initiated an advanced GIS web-based system. This system is efficiently designed to tangibly assist and optimize the complementarity and integration of data between the Call Center, Operation and Maintenance, and Laboratory departments. The core of this system is a unified data model for all the spatial and tabular data of the corresponding departments. The system is professionally built to provide advanced functionalities such as interactive data collection, dynamic monitoring, multi-user editing capabilities, enhanced data retrieval, integrated workflow, different access levels, and correlative information recording and tracking. Notably, this cost-effective system contributes significantly not only to the completeness of the base map (93%) and of the water network (87%) in highly detailed GIS format, and to enhancing the performance of customer service, but also to reducing the operating costs of day-to-day operations (approximately 5-10%). In addition, the proposed system facilitates data exchange between the different departments (Call Center, Operation and Maintenance, and Laboratory), which allows a better understanding and analysis of complex situations. Furthermore, this system has had a tangible impact on: (i) dynamic environmental monitoring of water quality indicators (ammonia, turbidity, TDS, sulfate, iron, pH, etc.); (ii) improved effectiveness of the different water departments; (iii) efficient, deep, advanced analysis; (iv) advanced web-reporting tools (daily, weekly, monthly, quarterly, and annual); (v) tangible planning synthesizing spatial and tabular data; and finally, (vi) a scalable decision support system.
It is worth highlighting that the proposed future plan (second phase) of this system encompasses scalability, extending to include integration with the Billing and SCADA departments. This scalability will add advanced functionalities to the existing ones to allow further sustainable contributions.
Keywords: GIS web-based, base map, water network, decision support system
Procedia PDF Downloads 96
25193 Comparative Study on Performance of Air-Cooled Condenser (ACC) Steel Platform Structures Using SCBF Frames, Spatial Structures and CFST Frames
Authors: Hassan Gomar, Shahin Bagheri, Nader Keyvan, Mozhdeh Shirinzadeh
Abstract:
Air-Cooled Condenser (ACC) platform structures are the most complicated and principal structures in power plants and other industrial facilities that need to condense the low-pressure steam in the cycle. Providing large spans for this structure has great merit, as there would be more space for other subordinate buildings and pertinent equipment. Moreover, applying methods to reduce the overall cost of construction while maintaining its strength against severe seismic loading is of high significance. Tubular spatial structures and composite frames have been widely used in recent years to satisfy the need for higher strength at a reasonable price. In this research program, three different structural systems have been considered for the ACC steel platform: Special Concentrically Braced Frames (SCBF), which is the most common system (first scheme); modular spatial frames (second scheme); and finally, a modified method applying Concrete-Filled Steel Tubular (CFST) columns (third scheme). Finite element analyses using the SAP2000 and ETABS software were conducted to investigate the behavior of the structures and make a precise comparison between the models. According to the results, the total weight of the steel structure in the second scheme decreases by 13% compared to the first scheme, and applying CFST columns in the third scheme causes a further 3% reduction in the total weight of the structure in comparison with the second scheme, while all the lateral displacements and P-M interaction ratios remain within admissible limits.
Keywords: ACC, SCBF frames, spatial structures, CFST frames
Procedia PDF Downloads 197
25192 BlueVision: A Visual Tool for Exploring a Blockchain Network
Authors: Jett Black, Jordyn Godsey, Gaby G. Dagher, Steve Cutchin
Abstract:
Despite the growing interest in distributed ledger technology, many data visualizations of blockchain are limited to monotonous tabular displays or overly abstract graphical representations that fail to adequately educate individuals on blockchain components and their functionalities. To address these limitations, it is imperative to develop data visualizations that offer not only comprehensive insights into these domains but also education. This research focuses on providing a conceptual understanding of the consensus process that underlies blockchain technology. This is accomplished through the implementation of a dynamic network visualization and an interactive educational tool called BlueVision. Further, a controlled user study is conducted to measure the effectiveness and usability of BlueVision. The findings demonstrate that the tool represents a significant advancement in the field of blockchain visualization, effectively catering to the educational needs of both novice and proficient users.
Keywords: blockchain, visualization, consensus, distributed network
Procedia PDF Downloads 62
25191 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information
Authors: I. Nyoman Mahayasa Adiputra
Abstract:
Data in the field of health can be useful for the purposes of data analysis; one example of health data is disease data. Disease data are usually plotted geographically according to the area where the data were collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form, and disease information has not yet been mapped in GIS form. In this research, disease information in Denpasar city is digitized in the form of a geographic information system with the district as the smallest administrative area. Denpasar City consists of four districts: North Denpasar, East Denpasar, West Denpasar, and South Denpasar. We use Google Fusion Table technology for the map digitization process, as this technology offers conveniences both for the administrator and for the recipient of information. On the administrator's side, disease data input can be done easily and quickly. On the receiving end, the resulting GIS application can be published as a website-based application so that it can be accessed anywhere and at any time. The results obtained in this study are divided into two parts. (1) Geolocation of Denpasar and all of its districts: the process of digitizing the map of Denpasar city produces a polygon geolocation for each district of Denpasar city. These results can be utilized in subsequent GIS studies that use the same administrative areas. (2) Dengue fever mapping for 2014 and 2015: the disease data used in this study are dengue fever case data from 2014 and 2015, taken from the 2015 and 2016 profile reports of the Denpasar Health Department. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.
Keywords: geographic information system, Google fusion table technology, delivery of disease data information, Denpasar city
Procedia PDF Downloads 129
25190 TAXAPRO, A Streamlined Pipeline to Analyze Shotgun Metagenomes
Authors: Sofia Sehli, Zainab El Ouafi, Casey Eddington, Soumaya Jbara, Kasambula Arthur Shem, Islam El Jaddaoui, Ayorinde Afolayan, Olaitan I. Awe, Allissa Dillman, Hassan Ghazal
Abstract:
The ability to promptly sequence whole genomes at a relatively low cost has revolutionized the way we study the microbiome. Microbiologists are no longer limited to studying what can be grown in a laboratory and instead are given the opportunity to rapidly identify the makeup of microbial communities in a wide variety of environments. Analyzing whole-genome sequencing (WGS) data is a complex process that involves multiple moving parts and might be rather unintuitive for scientists who do not typically work with this type of data. Thus, to help lower the barrier for less computationally inclined individuals, TAXAPRO was developed at the first Omics Codeathon held virtually by the African Society for Bioinformatics and Computational Biology (ASBCB) in June 2021. TAXAPRO is an advanced metagenomics pipeline that accurately assembles organelle genomes from whole-genome sequencing data. TAXAPRO seamlessly combines WGS analysis tools to create a pipeline that automatically processes raw WGS data and presents organism abundance information in both tabular and graphical formats. TAXAPRO was evaluated using COVID-19 patient gut microbiome data. Analysis performed by TAXAPRO demonstrated a high abundance of Clostridia and Bacteroidia and a low abundance of Proteobacteria relative to others in the gut microbiome of patients hospitalized with COVID-19, consistent with the original findings derived using a different analysis methodology. This provides crucial evidence that the TAXAPRO workflow delivers reliable organism abundance information overnight without the hassle of performing the analysis manually.
Keywords: metagenomics, shotgun metagenomic sequence analysis, COVID-19, pipeline, bioinformatics
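The final reporting step such a pipeline performs (collapsing per-read taxonomic assignments into a relative-abundance table) is simple enough to sketch directly. This is an illustrative stand-in, not TAXAPRO's code; the function name and report layout are assumptions of this example.

```python
from collections import Counter

def abundance_table(assignments):
    """Collapse per-read taxon assignments into (taxon, count, percent) rows,
    sorted by abundance: the kind of tabular report a pipeline would emit."""
    counts = Counter(assignments)
    total = sum(counts.values())
    return [(taxon, n, round(100.0 * n / total, 2))
            for taxon, n in counts.most_common()]
```

The graphical output the abstract mentions would simply plot the percent column of this table.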
Procedia PDF Downloads 221
25189 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA
Authors: Marek Dosbaba
Abstract:
Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy-dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process, and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models.
The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
Keywords: TESCAN, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data
Procedia PDF Downloads 110
25188 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from Image Processing View Point
Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee
Abstract:
Human skin, having a temperature above absolute zero, emits infrared radiation related to the body temperature. Differences in the infrared radiation from the skin surface reflect abnormalities present in the human body. Accordingly, detecting and forecasting the temperature variation of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it through a thermogram. The intensity of the thermogram measures the inflammation at the skin surface related to pain in the human body. Analysis of thermograms provides automated anomaly detection associated with suspicious pain regions by following several image processing steps. The paper presents a rigorous survey of the processing and analysis of thermograms, based on previous work published in the area of infrared thermal imaging for detecting inflammatory pain diseases like arthritis, spondylosis, shoulder impingement, etc. The study also explores the performance analysis of thermogram processing, accompanied by thermogram acquisition protocols, thermography camera specifications, and the types of pain detected by thermography, in a summarized tabular format. The tabular format provides a clear structural view of the past works. The major contribution of the paper is the introduction of a new thermogram acquisition standard associated with inflammatory pain detection in the human body to enhance the performance rate. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis.
The survey of previous research work highlights that intensity-distribution-based comparison of comparable and symmetric regions of interest, together with their statistical analysis, yields adequate results in identifying and detecting physiological disorders related to inflammatory diseases.
Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis
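The symmetric-ROI comparison the survey highlights reduces to a small statistical check: flatten the two regions, compare their mean intensities, and flag a difference above some threshold. The sketch below illustrates that idea only; the threshold value and the report fields are assumptions of this example, not a clinical criterion from the surveyed papers.

```python
from statistics import mean

def roi_asymmetry(left_roi, right_roi, threshold=0.5):
    """Compare mean intensities of two symmetric regions of interest; a
    difference above `threshold` (same units as the thermogram, an assumed
    cut-off) flags a possible inflammatory region."""
    flat = lambda roi: [v for row in roi for v in row]
    l, r = flat(left_roi), flat(right_roi)
    delta = abs(mean(l) - mean(r))
    return {"mean_left": mean(l), "mean_right": mean(r),
            "delta": delta, "suspicious": delta > threshold}
```

Real studies also compare higher-order statistics (variance, histograms) of the two regions, but the mean-difference test is the usual first screen.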
Procedia PDF Downloads 343
25187 Evaluation of Bucket Utility Truck In-Use Driving Performance and Electrified Power Take-Off Operation
Authors: Robert Prohaska, Arnaud Konan, Kenneth Kelly, Adam Ragatz, Adam Duran
Abstract:
In an effort to evaluate the in-use performance of electrified power take-off (PTO) usage on bucket utility trucks operating under real-world conditions, data from 20 medium- and heavy-duty vehicles operating in California, USA were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team. In this paper, duty-cycle statistical analyses of class 5 medium-duty quick-response trucks and class 8 heavy-duty material handler trucks are performed to examine and characterize vehicle dynamics trends and relationships based on collected in-use field data. With more than 100,000 kilometers of driving data collected over 880+ operating days, researchers have developed a robust methodology for identifying PTO operation from in-field vehicle data. Researchers apply this unique methodology to evaluate the performance and utilization of the conventional and electric PTO systems. Researchers also created custom representative drive cycles for each vehicle configuration and performed modeling and simulation activities to evaluate the potential fuel and emissions savings for hybridization of the tractive driveline on these vehicles. The results of these analyses statistically and objectively define the vehicle dynamic and kinematic requirements for each vehicle configuration and show the potential for further system optimization through driveline hybridization. Results are presented in both graphical and tabular formats, illustrating a number of key relationships between parameters observed within the data set that relate specifically to medium- and heavy-duty utility vehicles operating under real-world conditions.
Keywords: drive cycle, heavy-duty (HD), hybrid, medium-duty (MD), PTO, utility
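The core of identifying PTO operation from in-field data is segmenting the time series into stretches where the engine runs but the vehicle is stationary. The sketch below shows that segmentation in its simplest form; the speed threshold, minimum duration, and function name are this example's assumptions, not NREL's published criteria.

```python
def find_pto_segments(speed_kph, engine_on, min_len=3):
    """Flag candidate PTO (power take-off) events: contiguous samples where
    the engine runs but the vehicle is stationary. Returns half-open index
    ranges [start, stop)."""
    segments, start = [], None
    for i, (v, on) in enumerate(zip(speed_kph, engine_on)):
        idle = on and v < 1.0                 # stationary but powered
        if idle and start is None:
            start = i
        elif not idle and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(speed_kph) - start >= min_len:
        segments.append((start, len(speed_kph)))
    return segments
```

On real telematics data, additional channels (hydraulic pressure, PTO engagement flags) would refine these candidates, but the stationary-while-powered heuristic is the starting point.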
Procedia PDF Downloads 397
25186 Static Simulation of Pressure and Velocity Behaviour for NACA 0006 Blade Profile of Wells Turbine
Authors: Chetan Apurav
Abstract:
In this paper, the behavioural analysis of pressure and velocity has been carried out over the blade profile of a Wells turbine. The blade profile taken into consideration is NACA 0006. The analysis has been done in ANSYS Workbench using the CFX module. The CAD model of the blade profile, with specified dimensions, was made in CREO and then imported into ANSYS for further analysis. The turbine model was enclosed in a cylindrical body and analysed under a constant air velocity of 5 m/s and zero relative pressure in the static condition of the turbine. The results are represented in tabular as well as graphical form. It has been observed that the relative pressure over the blade profile is stable throughout the radial length, and hence the profile will be suitable for practical usage.
Keywords: Wells turbine, oscillating water column, ocean engineering, wave energy, NACA 0006
Procedia PDF Downloads 202
25185 Investigating Translations of Websites of Pakistani Public Offices
Authors: Sufia Maroof
Abstract:
This empirical study investigated the web translations of five Pakistani public offices (FPSC, FIA, HEC, USB, and the Ministry of Finance) offering an Urdu tab as an option to access information on their official websites. Triangulation of quantitative and qualitative research designs informed the researcher of the semantic, lexical, and syntactic caveats in these translations. The study hypothesized that the majority of the Pakistani population is oblivious of the Supreme Court's amendments in language policy concerning the national and official language; hence, the Urdu web translations of the public departments have not been accessed effectively. Firstly, the researcher conducted an online survey comprising two sections of close-ended and short-answer questions. Secondly, the researcher compiled a corpus of the five selected websites in tabular form to compare the data. Thirdly, the administrators of the departments were contacted regarding the methods of translation and the expertise of the personnel involved. The corpus was assessed for translation quality assessment (TQA) by examining lexical, semantic, syntactic, and technical alignment inaccuracies and imperfections. The study suggests that the public offices invest in their Urdu websites by either hiring expert translators or engaging the expertise of a translation agency in order to offer quality translations to the public.
Keywords: machine translations, public offices, Urdu translations, websites
Procedia PDF Downloads 127
25184 The Unique Journeys from Different Pasts to Multiple Presents in the Work of the Pritzker Prize Laureates of 2010-2020
Authors: Christakis Chatzichristou Kyriakos Miltiadou, Konstantinos Gounaridis
Abstract:
The paper discusses how the Pritzker Prize Laureates of the last decade themselves identify the various ways in which different aspects or interpretations of the past have influenced their design methodologies. As recipients of what is considered the most prestigious award in architecture, these architects are worth examining not only because of their exemplary work but also because of the strong influence they have on architectural culture in general. Rather than attempting to interpret their projects, the methodology chosen focuses on what the architects themselves have to say on the subject. The research aims at revealing, and, as the tabular form of the findings shows, succeeds in revealing, the numerous and diverse ways in which different aspects of what is termed the past can potentially enrich contemporary design practices.
Keywords: design methodology, Pritzker Prize Laureates, past, culture, tradition
Procedia PDF Downloads 45
25183 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics
Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee
Abstract:
Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably the so-called "hallucinations": the generation of outputs that are not grounded in the input data, which hinders their adoption in production. A common practice to mitigate the hallucination problem is to use a Retrieval Augmented Generation (RAG) system to ground LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response that is based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, the RAG system is not suitable for tabular data and subsequent data analysis tasks, for multiple reasons such as information loss, data format, and the retrieval mechanism. In this study, we have explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When a beta version was deployed on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results, as it was able to serve market insight and data visualization needs with high accuracy and extensive coverage by abstracting away the complexities for real estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding and enhancement without the need for programming skills.
The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru
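The plan-then-generate-then-execute loop the abstract describes can be sketched in a few lines. This is a minimal structural sketch, not the production DataSense system: `llm` and `run_code` are stand-in callables, and the prompt strings are invented for this example.

```python
def plan_and_execute(question, llm, run_code):
    """Minimal planning-and-execution loop: the LLM splits the analytical
    task into sub-tasks, a code-generation step turns each into executable
    code, and the executed outputs ground the final answer."""
    plan = llm(f"Plan steps for: {question}")           # planner agent
    results = {}
    for step in plan:
        code = llm(f"Write code for: {step}")           # code-generation agent
        results[step] = run_code(code)                  # sandboxed execution
    return llm(f"Answer {question!r} using {results}")  # grounded synthesis
```

Because the final response is composed from the outputs of executed code rather than free generation, the numbers it quotes are grounded by construction, which is the hallucination mitigation the abstract argues RAG cannot provide for tabular analysis.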
Procedia PDF Downloads 88
25182 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data
Authors: S. Jurado, E. Pazmino
Abstract:
Determination of the medial axis of a porous media sample is a non-trivial problem of interest to several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, and oil extraction. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized into solid and void regions to determine porosity. The algorithm then identifies the layer of void voxels adjacent to the solid boundaries. An iterative process removes, or 'burns', void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize execution time and memory use, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn-layer data during the iterative process. The medial axis was determined by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and used to determine the pore-throat size distribution. A graphical user interface was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats.
Preliminary tests of the software achieved medial axis, pore-throat size distribution, and porosity determination of 100³, 320³, and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data in the academic community.
Keywords: medial axis, pore-throat distribution, porosity, porous media
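The layer-by-layer burn described above can be sketched compactly: starting from a binarized void mask, each iteration peels off the void voxels that touch solid (or already-burnt) material and records the iteration number as the voxel's burn layer. This is a minimal NumPy sketch of the idea under the assumption of 6-connected neighbors and solid material outside the domain, not the paper's optimized implementation.

```python
import numpy as np

def burn(void):
    """Assign each void voxel its burn-layer number (1 = touches solid).

    `void` is a 3D boolean array, True for void voxels. Everything outside
    the array is treated as solid. Porosity is simply void.mean().
    """
    burn_num = np.zeros(void.shape, dtype=int)
    remaining = void.astype(bool).copy()
    layer = 0
    while remaining.any():
        layer += 1
        # pad with False (= solid) so domain edges count as boundaries
        p = np.pad(remaining, 1, constant_values=False)
        # a voxel is 'interior' if all six face neighbors are still void
        interior = (
            p[:-2, 1:-1, 1:-1] & p[2:, 1:-1, 1:-1] &
            p[1:-1, :-2, 1:-1] & p[1:-1, 2:, 1:-1] &
            p[1:-1, 1:-1, :-2] & p[1:-1, 1:-1, 2:]
        )
        boundary = remaining & ~interior     # current layer to burn
        burn_num[boundary] = layer
        remaining &= ~boundary
    return burn_num
```

Medial-axis extraction then amounts to locating voxels where burn fronts from opposite walls collide, i.e., local maxima of `burn_num`, which also gives a pore-radius estimate in voxel units.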
Procedia PDF Downloads 116
25181 Health Hazards in SME Garment Industries in India
Authors: Pranab Kumar Goswami
Abstract:
According to the WHO, over 1000 million people worldwide are employed in small-scale industries. The garment industry is one such industry in developing countries. These garment SMEs are mostly run by private establishments in the unorganized sector to avoid the legal obligations of OSH provisions. OSH standards are very poor, and even basic health and safety provisions are not made in such units. The study was conducted in India among workers employed in the garment industry, with the objectives of analyzing the types and extent of occupational health hazards of garment workers and assessing the relationship of sociodemographic and occupational factors with various health hazards. The survey method and the tabular method, followed by simple statistical techniques, were used to analyze the data collected from three SME garment industries in Delhi, India. The study was conducted in Delhi from August 2019 to October 2020. A random sample of 70 workers from the three factories was chosen for this study. The study shows that most of the workers were male (82%) and in the 18-50 age group (78%), with none below 18 years of age. It was found that 26% of the workers were illiterate, and most of them belonged to poor socioeconomic status. The study revealed that the hazards in garment industries in India are mostly physical and mechanical in nature. We found that musculoskeletal problems (54%) were the commonest health problem. The body areas most commonly affected were the neck, lower back, hand, wrist, fingers, and shoulder. If garment workers' health is affected by occupational hazards, it will impact the national health and economic growth of developing countries. Health is a joint responsibility of both the government and the employing authority.
Keywords: garment, MSD, health hazard, social factor
Procedia PDF Downloads 199
25180 Numerical Analysis of Catalytic Combustion in a Tabular Reactor with Methane and Air Mixtures over Platinum Catalyst
Authors: Kumaresh Selvakumar, Man Young Kim
Abstract:
The presence of a catalyst inside an engine enables complete combustion at lower temperatures, which promotes the desired chemical reactions. The objective of this work is to design and simulate a catalytic combustor using CHEMKIN with detailed gas and surface chemistries. The simplified approach of a single catalyst channel modeled as a plug flow reactor (PFR) predicts reasonably well the effects of various operating parameters such as inlet temperature, velocity, and fuel/air ratio. The numerical results are validated by comparing the surface chemistries in a single-channel catalytic combustor. The catalytic combustor operates at a much lower temperature than a conventional combustor, since a lean fuel mixture is used and complete methane conversion is achieved. The coupling between gas and surface reactions in the catalyst bed is studied by investigating the onset of flame ignition with respect to the surface site species.
Keywords: catalytic combustion, honeycomb monolith, plug flow reactor, surface reactions
Procedia PDF Downloads 227
25179 Estimation of Natural Pozzolan Reserves in the Volcanic Province of the Moroccan Middle Atlas Using a Geographic Information System in Order to Valorize Them
Authors: Brahim Balizi, Ayoub Aziz, Abdelilah Bellil, Abdellali El Khadiri, Jamal Mabrouki
Abstract:
Mio-Plio-Quaternary volcanism of the Tabular Middle Atlas, which corresponds to prospective levels of exploitable raw minerals, is a feature of Morocco's Middle Atlas, especially the Azrou-Timahdite region. Given its importance to national policy in terms of human development, supporting the sociological and economic components, this area has been the focus of various research and prospecting efforts aimed at developing these reserves. The outcome of this work is a massive amount of data that needs to be managed appropriately, because it comes from multiple sources and formats: spot heights, contour lines, geology, hydrogeology, hydrology, geological and topographical maps, satellite photos, and more. In this regard, putting in place a Geographic Information System (GIS) is essential to provide a site plan that makes it possible to view the most recent topography of the area being exploited, to compute the volume extracted each day, and to make decisions with the fewest possible restrictions, so that the reserves can be used for producing ecological lightweight mortars. Mining of the three sites will follow the contour lines in five benches, six meters high, in descending order. Each quarry is anticipated to produce about 90,000 m³/year; for a single quarry, this translates to a daily production of about 450 m³ (200 working days/year). The potential net exploitable volume in place is about 3,540,240 m³ for a single quarry and 10,620,720 m³ for the three exploitable zones.
Keywords: GIS, topography, exploitation, quarrying, lightweight mortar
Procedia PDF Downloads 27
25178 Data Transformations in Data Envelopment Analysis
Authors: Mansour Mohammadpour
Abstract:
Data transformation refers to the modification of any point in a data set by a mathematical function. When transformations are applied, the measurement scale of the data is modified. Data transformations are commonly employed to turn data into an appropriate form, which can serve various functions in the quantitative analysis of the data. This study investigates the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they fundamentally alter the nature of the variable, making interpretation of the results somewhat more complex.
Keywords: data transformation, data envelopment analysis, undesirable data, negative data
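One common transformation in this setting, a minimal sketch only: when a DEA variable contains negative or zero values (the "negative data" case in the keywords), it is often translated by a constant so that all values become strictly positive before the model is run. The function name and epsilon are illustrative assumptions, not from the paper.

```python
import numpy as np

def shift_positive(x, eps=1.0):
    """Translate x so its minimum equals eps when it has non-positive values."""
    x = np.asarray(x, dtype=float)
    if x.min() <= 0:
        x = x + (eps - x.min())   # constant shift: preserves order, not scale
    return x

shifted = shift_positive([-3.0, 0.0, 2.0])   # becomes [1.0, 4.0, 6.0]
```

Such a shift changes the measurement scale, which is precisely the interpretability cost the abstract warns about: only certain DEA models (the variable-returns-to-scale BCC model is the usual example) are translation invariant in some variables, so efficiency scores can depend on the chosen constant.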
Procedia PDF Downloads 20
25177 Study of Causes and Effects of Road Projects Abandonment in Nigeria
Authors: Monsuru Oyenola Popoola, Oladapo Samson Abiola, Wusamotu Alao Adeniji
Abstract:
The prevalent and incessant abandonment of road construction projects is alarming, creating several negative effects on the social, economic, and environmental values of the projects. The purpose of this paper is to investigate and determine the various causes and effects of abandoning road construction projects in Nigeria. A Likert-scale questionnaire was designed to collect the data, which were then analyzed for the study. 135 questionnaires were completed and retrieved from the respondents, out of 200 sent out, representing a response rate of 67.5%. The analysis utilized the Relative Importance Index (RII) method, and the results are presented in tabular form. The findings confirm that at least 20 factors cause road project abandonment in Nigeria, the foremost including leadership instability, improper project planning, inconsistency in government policies and design, contractor incompetence, economic instability and inflation, delay in remittance of money, improper financial analysis, poor risk management, climatic conditions, and improper project estimates. The findings also show that at least eight effects were identified, including waste of financial resources, loss of economic value, environmental degradation, reduction in standard of living, and litigation and arbitration. The implication is that allocating reasonable finance, developing appropriate and effective implementation plans, and monitoring, evaluation, and reporting on development project activities by key actors should help resolve the problem of road project abandonment.
Keywords: road construction, abandonment of road projects, climatic condition, project planning, contractor
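The Relative Importance Index used in the analysis is conventionally computed as RII = ΣW / (A × N), where W are the Likert weights assigned by respondents, A is the highest weight on the scale, and N is the number of respondents. A minimal sketch (the response values below are invented for illustration):

```python
def relative_importance_index(responses, max_scale=5):
    """RII = sum(W) / (A * N) for one factor's Likert responses."""
    return sum(responses) / (max_scale * len(responses))

# e.g. five respondents rating one abandonment factor on a 1-5 scale
rii = relative_importance_index([5, 4, 3, 4, 4])   # 20 / 25 = 0.8
```

Factors are then ranked by their RII values, which is how a study like this orders the 20 causes from most to least important.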
Procedia PDF Downloads 299
25176 Flow and Heat Transfer Analysis of Copper-Water Nanofluid with Temperature Dependent Viscosity past a Riga Plate
Authors: Fahad Abbasi
Abstract:
Flow of electrically conducting nanofluids is of pivotal importance in countless industrial and medical appliances. Fluctuations in the thermophysical properties of such fluids due to variations in temperature have not received due attention in the available literature. The present investigation aims to fill this void by analyzing the flow of a copper-water nanofluid with temperature-dependent viscosity past a Riga plate. Strong wall suction and viscous dissipation have also been taken into account. Numerical solutions of the resulting nonlinear system have been obtained. Results are presented in graphical and tabular formats to facilitate the physical analysis. Estimated expressions for the skin friction coefficient and Nusselt number are obtained by performing linear regression on the numerical data for the embedded parameters. Results indicate that the temperature-dependent viscosity alters the velocity as well as the temperature of the nanofluid and is of considerable importance in processes where high accuracy is desired. The addition of copper nanoparticles makes the momentum boundary layer thinner, whereas the viscosity parameter does not affect the boundary layer thickness. Moreover, the regression expressions indicate that the magnitude of the rate of change in effective skin friction coefficient and Nusselt number with respect to nanoparticle volume fraction is prominent compared with the rate of change with the variable viscosity parameter and modified Hartmann number.
Keywords: heat transfer, peristaltic flows, radially varying magnetic field, curved channel
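The regression step mentioned above, fitting a linear surrogate to numerically computed skin-friction values over the embedded parameters, can be sketched as follows. The data points here are invented purely for illustration, not results from the paper; `phi` stands in for the nanoparticle volume fraction and `theta` for the viscosity parameter.

```python
import numpy as np

# hypothetical computed skin-friction values at parameter combinations
phi   = np.array([0.00, 0.05, 0.10, 0.00, 0.05, 0.10])   # volume fraction
theta = np.array([0.10, 0.10, 0.10, 0.30, 0.30, 0.30])   # viscosity parameter
cf    = np.array([1.00, 1.22, 1.45, 0.97, 1.19, 1.41])   # skin friction

# least-squares fit of the surrogate Cf ≈ a0 + a1*phi + a2*theta
A = np.column_stack([np.ones_like(phi), phi, theta])
(a0, a1, a2), *_ = np.linalg.lstsq(A, cf, rcond=None)
```

In this illustrative fit |a1| is far larger than |a2|, mirroring the abstract's conclusion that the rate of change with volume fraction dominates that with the viscosity parameter.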
Procedia PDF Downloads 166
25175 A Teaching Method for Improving Sentence Fluency in Writing
Authors: Manssour Habbash, Srinivasa Rao Idapalapati
Abstract:
Although writing is a multifaceted task, teaching writing is demanding for basically two reasons: grammar and syntax. This article provides a method of teaching writing that was found to be effective in improving students' academic writing composition skills. The article explains the concepts of 'guided discovery' and 'guided construction', upon which the method of teaching writing is grounded and developed. Providing a brief commentary on what the core of a text could mean, the article presents an exposition of understanding and identifying the core, and building upon it, that demonstrates how a teacher can use these concepts to improve students' writing skills. The method is an adaptation of the grammar-translation method, improvised to suit a student-centered classroom environment. An intervention of teaching writing through this method was tried out with positive outcomes in a formal classroom research setup. In view of the content's relevance to classroom practice and its usefulness to practicing teachers, the process and the findings are presented in narrative form, along with the results in tabular form.
Keywords: core of a text, guided construction, guided discovery, theme of a text
Procedia PDF Downloads 381
25174 Causes of Jaundice and Skin Rashes Amongst Children in Selected Rural Communities in the Gambia
Authors: Alhage Drammeh
Abstract:
The research concerns the occurrence of certain diseases among children in rural and far-flung parts of The Gambia and the extent to which they are caused by lack of access to clean water. A baseline survey was used to discover, describe, and explain the actual processes. The paper explains the purpose of the research, which is mainly to improve the health condition of children, especially those living in rural communities. The paper also gives a brief overview of the socio-economic situation of The Gambia, emphasizing its status as a Least Developed Country (LDC), with the majority of its population living below the poverty line and women and children hardest hit. The research used two rural communities in The Gambia, Basse Dampha Kunda Village and Foni Besse, as case studies. Data were collected through oral interviews and medical tests conducted among people in both villages, with an emphasis on children. The demographic details of those tested are tabulated for clearer understanding. The results were compared, revealing that skin rashes, hepatitis, and certain other diseases are more prevalent in communities lacking access to safe drinking water; these results are also presented in tabular form. The study established how policy failures and neglect on the part of the Government of The Gambia are imperiling the health of many rural dwellers in the country, the most glaring being that the research team was unable to test water samples collected from the two communities, as there are no laboratory reagents for testing water anywhere in The Gambia. Many rural communities lack basic amenities, especially clean and potable water, as well as health facilities. The findings also highlight the need for healthcare providers and medical NGOs to voice the plight of rural dwellers and collaborate with the government to set up health facilities in rural areas of The Gambia.
Keywords: jaundice, skin rashes, children, rural communities, the Gambia, causes
Procedia PDF Downloads 66
25173 Effect of Internal Heat Generation on Free Convective Power Law Variable Temperature Past Vertical Plate Considering Exponential Variable Viscosity and Thermal Diffusivity
Authors: Tania Sharmin Khaleque, Mohammad Ferdows
Abstract:
The flow and heat transfer characteristics of free convection with temperature-dependent viscosity and thermal diffusivity along a vertical plate, with an internal heat generation effect, have been studied. The plate temperature is assumed to follow a power law of the distance from the leading edge. The resulting two-dimensional governing equations are transformed using suitable transformations and then solved numerically using a fifth-order Runge-Kutta-Fehlberg scheme with a modified version of the Newton-Raphson shooting method. The effects of the various parameters, such as the variable viscosity parameter β₁, the thermal diffusivity parameter β₂, the heat generation parameter c, and the Prandtl number Pr, on the velocity and temperature profiles, as well as on the local skin-friction coefficient and the local Nusselt number, are presented in tabular form. Our results suggest that internal heat generation increases the flow compared with the case without the exponentially decaying heat generation term.
Keywords: free convection, heat generation, thermal diffusivity, variable viscosity
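The shooting approach described here, integrating with an adaptive Runge-Kutta scheme while iterating on the unknown wall value until the far-field boundary condition is met, can be sketched on a stand-in problem. The classical Blasius boundary-layer equation f''' + 0.5 f f'' = 0 with f(0) = f'(0) = 0 and f'(∞) = 1 is used below because its answer is well known; the paper's own equations are not reproduced. SciPy's RK45 plays the role of the Runge-Kutta-Fehlberg integrator, and a bracketing root finder replaces the Newton-Raphson iteration.

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def rhs(eta, y):                      # y = [f, f', f'']
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def residual(s, eta_max=10.0):
    # integrate with guessed wall curvature f''(0) = s,
    # then measure how far f'(eta_max) misses the target value 1
    sol = solve_ivp(rhs, [0.0, eta_max], [0.0, 0.0, s],
                    method="RK45", rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

fpp0 = brentq(residual, 0.1, 1.0)     # shoot for the missing wall value f''(0)
```

The converged `fpp0` recovers the textbook Blasius value f''(0) ≈ 0.33206; the same pattern extends to the coupled momentum and energy equations of the abstract, with one shooting parameter per missing wall condition.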
Procedia PDF Downloads 353
25172 Effect of Radiation on MHD Mixed Convection Stagnation Point Flow towards a Vertical Plate in a Porous Medium with Convective Boundary Condition
Authors: H. Niranjan, S. Sivasankaran, Zailan Siri
Abstract:
This study investigates mixed convection heat transfer about a thin vertical plate in a porous medium in the presence of magnetohydrodynamic (MHD) and heat transfer effects. The fluid is assumed to be steady, laminar, incompressible, and in two-dimensional flow. The nonlinear coupled parabolic partial differential equations governing the flow are transformed into non-similar boundary layer equations, which are then solved numerically using the shooting method. The effects of the conjugate heat transfer parameter, the porous medium parameter, the permeability parameter, the mixed convection parameter, the magnetic parameter, and thermal radiation on the velocity and temperature profiles, as well as on the local skin friction and local heat transfer, are presented and analyzed. The validity of the methodology and analysis is checked by comparing the results obtained for some specific cases with those available in the literature. The effects of the various parameters on the local skin friction and the heat and mass transfer rates are presented in tabular form.
Keywords: MHD, porous medium, Soret/Dufour, stagnation-point
Procedia PDF Downloads 375
25171 Movement of the Viscous Elastic Fixed Vertically Located Cylinder in Liquid with the Free Surface Under the Influence of Waves
Authors: T. J. Hasanova, C. N. Imamalieva
Abstract:
The problem of the motion of a rigid cylinder that maintains a vertical position under the influence of traveling surface waves in a liquid is considered. The perturbation of the incident wave caused by the presence of the moving cylinder is taken into account, using a special decomposition of the incident harmonic wave. The problem is solved by an operational method. To find the original of the solution, given that the denominator of the transform is a tabulated function, a Volterra integral equation of the first kind is obtained and solved numerically. The motion of the cylinder in a continuous medium under the influence of waves is thus considered in this work. In the operational method, the originals of the required functions are typically sought by numerically locating the poles of combinations of transcendental functions and evaluating improper integrals. Exploiting the specifics of the present problem, the solution is instead constructed via the numerical solution of a Volterra integral equation of the first kind, which avoids the computational difficulties associated with searching for the complex roots of transcendental functions.
Keywords: rigid cylinder, linear interpolation, fluctuations, Volterra integral equation, harmonic wave
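The numerical device at the heart of this approach, solving a first-kind Volterra equation ∫₀ᵗ K(t, s) f(s) ds = g(t), can be sketched with a midpoint-rule discretization: collocating at t_i = i·h produces a lower-triangular system solved by forward substitution. This is a generic standard scheme, not the paper's code, and the kernel below is a trivial test case with known solution f ≡ 1.

```python
import numpy as np

def volterra1(K, g, t_max, n):
    """Solve int_0^t K(t,s) f(s) ds = g(t) on [0, t_max] by midpoint rule."""
    h = t_max / n
    s = (np.arange(n) + 0.5) * h          # quadrature midpoints
    t = (np.arange(n) + 1.0) * h          # collocation points t_i = i*h
    f = np.zeros(n)
    for i in range(n):
        # sum over already-known values, then solve for f[i] (forward substitution)
        acc = sum(K(t[i], s[j]) * f[j] for j in range(i)) * h
        f[i] = (g(t[i]) - acc) / (h * K(t[i], s[i]))
    return s, f

# verify on K(t,s) = 1, g(t) = t, whose exact solution is f(s) = 1
s, f = volterra1(lambda t, s: 1.0, lambda t: t, 1.0, 50)
```

First-kind Volterra equations are mildly ill-posed, so the step size h trades discretization error against noise amplification; for a tabulated kernel, as in the abstract, K(t, s) would be obtained by interpolation of the table (hence the 'linear interpolation' keyword).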
Procedia PDF Downloads 319
25170 Recommendations for Teaching Word Formation for Students of Linguistics Using Computer Terminology as an Example
Authors: Svetlana Kostrubina, Anastasia Prokopeva
Abstract:
This research presents a comprehensive study of word formation processes in computer terminology in English and Russian and provides learners with a system of exercises for training these skills. Its originality lies in the comparative approach, which reveals both general patterns and specific features of English and Russian computer term formation. The key contribution is the development of a system of exercises for training computer terminology based on Bloom's taxonomy. The data contain 486 units (228 English terms from the Glossary of Computer Terms and 258 Russian terms from the Terminological Dictionary-Reference Book). The objective is to identify the main affixation models in English and Russian computer term formation and to develop exercises. To achieve this goal, the authors employed Bloom's taxonomy as a methodological framework to create a systematic exercise program aimed at enhancing students' cognitive skills in analyzing, applying, and evaluating computer terms. The exercises are appropriate for various levels of learning, from basic recall of definitions to higher-order thinking skills such as synthesizing new terms and critically assessing their usage in different contexts. The methodology also includes: a method of scientific and theoretical analysis for systematizing linguistic concepts and clarifying the conceptual and terminological apparatus; a method of nominative and derivative analysis for identifying word-formation types; a method of word-formation analysis for organizing linguistic units; a classification method for determining structural types of abbreviations applicable to the field of computer communication; a quantitative analysis technique for determining the productivity of methods for forming abbreviations of computer vocabulary based on the English and Russian computer terms; a technique of tabular data processing for visual presentation of the results obtained; and a technique of interlingual comparison for identifying common and distinct features of abbreviations of computer terms in Russian and English. The research shows that affixation retains its productivity in English and Russian computer term formation. Bloom's taxonomy allows us to plan a training program and predict the effectiveness of the compiled program based on an assessment of the teaching methods used.
Keywords: word formation, affixation, computer terms, Bloom's taxonomy
Procedia PDF Downloads 14