Search results for: computer modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4080

3660 Exploring Public Opinions Toward the Use of Generative Artificial Intelligence Chatbot in Higher Education: An Insight from Topic Modelling and Sentiment Analysis

Authors: Samer Muthana Sarsam, Abdul Samad Shibghatullah, Chit Su Mon, Abd Aziz Alias, Hosam Al-Samarraie

Abstract:

Generative Artificial Intelligence chatbots (GAI chatbots) have emerged as promising tools in various domains, including higher education. However, their specific role within the educational context and the level of legal support for their implementation remain unclear. Therefore, this study aims to investigate the role of Bard, a newly developed GAI chatbot, in higher education. To achieve this objective, English tweets were collected from Twitter's free streaming Application Programming Interface (API). The Latent Dirichlet Allocation (LDA) algorithm was applied to extract latent topics from the collected tweets. User sentiments, including disgust, surprise, sadness, anger, fear, joy, anticipation, and trust, as well as positive and negative sentiments, were extracted using the NRC Affect Intensity Lexicon and SentiStrength tools. This study explored the benefits, challenges, and future implications of integrating GAI chatbots in higher education. The findings shed light on the potential power of such tools, exemplified by Bard, in enhancing the learning process and providing support to students throughout their educational journey.
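
To make the analysis pipeline concrete, here is a minimal sketch of LDA topic extraction on tweet-like texts with a toy sentiment lookup; the corpus, topic count and the tiny lexicon are illustrative placeholders, whereas the study itself used the NRC Affect Intensity Lexicon and SentiStrength.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "Bard helped me summarise my lecture notes in minutes",
    "Worried that chatbot answers encourage plagiarism in assignments",
    "Bard gives good study tips but sometimes invents references",
    "Universities need clear policies before adopting AI chatbots",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {top}")

# Toy lexicon standing in for NRC / SentiStrength scoring.
lexicon = {"helped": 1, "good": 1, "worried": -1, "plagiarism": -1, "invents": -1}
for t in tweets:
    score = sum(lexicon.get(w, 0) for w in t.lower().split())
    print(f"{score:+d}  {t}")
```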

Keywords: generative artificial intelligence chatbots, bard, higher education, topic modelling, sentiment analysis

Procedia PDF Downloads 83
3659 Construction of QSAR Models to Predict Potency on a Series of Substituted Imidazole Derivatives as Anti-fungal Agents

Authors: Sara El Mansouria Beghdadi

Abstract:

Quantitative structure–activity relationship (QSAR) modelling is one of the main computational tools used in medicinal chemistry. Over the past two decades, the incidence of fungal infections has increased due to the development of resistance. In this study, QSAR modelling was performed on a series of esters of 2-carboxamido-3-(1H-imidazole-1-yl) propanoic acid derivatives. These compounds have shown moderate to very good antifungal activity. Multiple linear regression (MLR) was used to generate the linear 2D-QSAR models. The dataset consists of 115 compounds with their antifungal activity (log MIC) against Candida albicans (ATCC SC5314). Descriptors were calculated, and different models were generated using the ChemOffice, Avogadro, and GaussView software. The selected model was validated. The study suggests that an increase in lipophilicity and a reduction in the electronic character of the substituent in R1, as well as a reduction in the steric hindrance of the substituent in R2 and in its aromatic character, support the potentiation of the antifungal effect. The results of the QSAR could help scientists propose new compounds with higher antifungal activities intended for immunocompromised patients susceptible to multi-resistant nosocomial infections.
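
To illustrate the kind of 2D-QSAR workflow described above, the following minimal Python sketch fits a multiple linear regression from a hypothetical descriptor table to log MIC values; the descriptor names, values and column layout are illustrative assumptions, not data from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical descriptor matrix: columns = [logP (lipophilicity),
# sigma (electronic parameter of R1), Es (steric parameter of R2)]
X = np.array([
    [2.1, 0.23, -0.55],
    [2.8, 0.12, -0.47],
    [3.4, 0.06, -0.38],
    [1.9, 0.31, -0.61],
    [3.0, 0.10, -0.40],
    [2.5, 0.18, -0.52],
])
y = np.array([1.92, 1.60, 1.35, 2.10, 1.48, 1.75])   # stand-in log MIC values

mlr = LinearRegression().fit(X, y)
print("coefficients:", mlr.coef_, "intercept:", mlr.intercept_)
print("R^2 (training):", mlr.score(X, y))
print("Q^2 (cross-validated):", cross_val_score(mlr, X, y, cv=3, scoring="r2").mean())
```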

Keywords: quantitative structure–activity relationship, imidazole, antifungal, candida albicans (ATCC SC5314)

Procedia PDF Downloads 84
3658 Analysis of Public Space Usage Characteristics Based on Computer Vision Technology - Taking Shaping Park as an Example

Authors: Guantao Bai

Abstract:

Public space is an indispensable component of the urban built environment, and evaluating its usage characteristics more accurately can help improve its spatial quality. Compared to traditional survey methods, computer vision technology based on deep learning has advantages such as dynamic observation and low cost. This study takes the public space of Shaping Park as an example and, based on deep learning computer vision technology, processes and analyzes image data of the public space to obtain its spatial usage and spatiotemporal characteristics. The study found that spontaneous activities in the public space occur at relatively random times and have a relatively short average duration, while social activities occur at relatively stable times and have a longer average duration. Computer vision technology based on deep learning can effectively describe the spatial usage characteristics of the research area, making up for the shortcomings of traditional research methods and providing support for creating good public spaces.
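
As a minimal sketch of the kind of deep-learning pipeline such a study might use (not the authors' implementation), the snippet below counts pedestrians per video frame with a pretrained Faster R-CNN detector from torchvision; the confidence threshold and the frame source are illustrative assumptions.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Pretrained COCO detector; class id 1 corresponds to "person".
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def count_people(frame_rgb, score_threshold=0.7):
    """Count detected pedestrians in an HxWx3 uint8 RGB frame."""
    with torch.no_grad():
        detections = model([to_tensor(frame_rgb)])[0]
    keep = (detections["labels"] == 1) & (detections["scores"] > score_threshold)
    return int(keep.sum())

# Aggregating such counts over time-stamped frames yields the spatiotemporal usage curves.
```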

Keywords: computer vision, deep learning, public spaces, usage features

Procedia PDF Downloads 70
3657 Document Analysis for Modelling iTV Advertising towards Impulse Purchase

Authors: Azizah Che Omar

Abstract:

The study provides a systematic literature review that analyzed the literature for concepts, theories, approaches and guidelines in order to propose a conceptual design model of interactive television advertising towards impulse purchase (iTVAdIP). An extensive review of the literature was carried out to understand the concepts of interactive television (iTV). Accordingly, several elements (iTV guidelines, advertising theories, persuasive approaches, and impulse purchase elements) were analyzed within the scope of this work. The review was also necessary to achieve the objective of this study, which was to determine the concept of the iTVAdIP design model. Through the systematic review, this study found that none of the previous models emphasized a conceptual design model of interactive television advertising. As a result, the findings showed that the proposed model should contain iTV guidelines, advertising theory, a persuasive approach and impulse purchase elements. In addition, a summary diagram for the development of the proposed model is depicted to provide a clearer understanding of the conceptual design model of iTVAdIP.

Keywords: impulse purchase, interactive television advertising, human computer interaction, advertising theories

Procedia PDF Downloads 369
3656 Human Motion Capture: New Innovations in the Field of Computer Vision

Authors: Najm Alotaibi

Abstract:

Human motion capture has become one of the major areas of interest in the field of computer vision. Some of the rapidly evolving application areas include advanced human interfaces, virtual reality and security/surveillance systems. This study provides a brief overview of the techniques and applications used for markerless human motion capture, which deals with analyzing human motion in the form of mathematical formulations. The major contribution of this research is that it classifies the computer-vision-based techniques of human motion capture according to a taxonomy and then breaks them down into four systematically different categories: tracking, initialization, pose estimation and recognition. Detailed descriptions and the relationships between them are given for the tracking and pose estimation techniques, and the subcategories of each process are further described. The various hypotheses used by researchers in this domain are surveyed, and the evolution of these techniques is explained. The survey concludes that most researchers have focused on using mathematical body models for markerless motion capture.

Keywords: human motion capture, computer vision, vision-based, tracking

Procedia PDF Downloads 319
3655 Urban Energy Demand Modelling: Spatial Analysis Approach

Authors: Hung-Chu Chen, Han Qi, Bauke de Vries

Abstract:

Energy consumption in the urban environment has attracted numerous studies in recent decades. However, it is comparatively rare to find works which investigate 3D spatial analysis in urban energy demand modelling. In order to analyze the spatial correlation between urban morphology and energy demand comprehensively, this paper investigates their relation using spatial regression tools, namely the ordinary least squares (OLS) regression and geographically weighted regression (GWR) models. The Normalized Difference Built-up Index (NDBI), the Normalized Difference Vegetation Index (NDVI), and building volume describe urban morphology and act as independent variables of the Energy-Land use (E-L) model. NDBI and NDVI are used as indices to describe five types of land use: urban area (U), open space (O), artificial green area (G), natural green area (V), and water body (W). Accordingly, annual electricity demand, gas demand and total energy demand are the dependent variables of the E-L model. The analysis of the E-L model revealed that energy demand and urban morphology are closely connected; possible causes and practical uses are discussed. In addition, the spatial analysis methods of OLS and GWR are compared.
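
As an illustrative sketch of the global (OLS) step of such an E-L model (not the authors' data or code), the snippet below regresses an annual energy demand variable on NDBI, NDVI and building volume with statsmodels; the values are synthetic placeholders, and the local GWR counterpart could be fitted analogously with a package such as mgwr.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
ndbi = rng.uniform(-0.3, 0.6, n)           # built-up index per cell
ndvi = rng.uniform(0.0, 0.8, n)            # vegetation index per cell
volume = rng.uniform(1e3, 5e4, n)          # building volume (m^3) per cell
energy = 120 + 300 * ndbi - 150 * ndvi + 0.002 * volume + rng.normal(0, 20, n)

X = sm.add_constant(np.column_stack([ndbi, ndvi, volume]))
ols = sm.OLS(energy, X).fit()
print(ols.summary())
# A GWR model would estimate these coefficients locally, one set per location,
# revealing how the morphology-demand relation varies across the city.
```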

Keywords: energy demand model, geographically weighted regression, normalized difference built-up index, normalized difference vegetation index, spatial statistics

Procedia PDF Downloads 148
3654 An Evaluation of Neural Network Efficacies for Image Recognition on Edge-AI Computer Vision Platform

Authors: Jie Zhao, Meng Su

Abstract:

Image recognition, one of the most critical technologies in computer vision, helps machines and robots understand a scene and, if deployed appropriately, will drive major advances in remote sensing and industrial automation. With the development of AI technologies, many sophisticated neural networks are available for image recognition. However, the computer vision platforms, i.e., the hardware supporting these neural networks, are as crucial as the network architectures themselves and have received less attention as research subjects, even though the platform largely determines how well a given network performs. In this paper, three computer vision platforms are explored: a Jetson Nano (4 GB), a standalone laptop (with an RTX 3000-series GPU, using CUDA), and Google Colab (web-based, using a GPU). Four prominent neural network architectures (AlexNet, VGG-16/19, GoogLeNet, and ResNet-18/34/50) are investigated. Performance is evaluated for each pairing of platform and network in terms of recognition accuracy and time efficiency. In a case study using public ImageNet data, our findings provide a nuanced perspective on optimizing image recognition tasks across Edge-AI platforms, offering guidance on selecting appropriate neural network structures to maximize performance under hardware constraints.
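
A minimal benchmarking sketch along these lines is shown below (an assumption of how such a comparison could be scripted, not the paper's code): it loads pretrained torchvision versions of the four architecture families and times single-image inference on whatever device is available; the batch size, repetition count and random input tensor are illustrative choices.

```python
import time
import torch
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"
nets = {
    "AlexNet": models.alexnet(weights="DEFAULT"),
    "VGG-16": models.vgg16(weights="DEFAULT"),
    "GoogLeNet": models.googlenet(weights="DEFAULT"),
    "ResNet-50": models.resnet50(weights="DEFAULT"),
}
x = torch.randn(1, 3, 224, 224, device=device)   # stand-in for a preprocessed ImageNet image

for name, net in nets.items():
    net = net.to(device).eval()
    with torch.no_grad():
        net(x)                                    # warm-up pass
        start = time.perf_counter()
        for _ in range(20):
            net(x)
    print(f"{name}: {(time.perf_counter() - start) / 20 * 1e3:.1f} ms per image on {device}")
```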

Keywords: AlexNet, VGG, GoogLeNet, ResNet, Jetson Nano, CUDA, COCO-NET, CIFAR-10, ImageNet Large Scale Visual Recognition Challenge (ILSVRC), Google Colab

Procedia PDF Downloads 90
3653 Modeling Aeration of Sharp Crested Weirs by Using Support Vector Machines

Authors: Arun Goel

Abstract:

The present paper investigates the prediction of the air entrainment rate and aeration efficiency of free over-fall jets issuing from a triangular sharp-crested weir by using regression-based modelling. Empirical equations, support vector machine models (with polynomial and radial basis function kernels) and linear regression techniques were applied to the triangular sharp-crested weirs, relating the air entrainment rate and the aeration efficiency to the input parameters, namely drop height, discharge, and vertex angle. Good agreement was observed between the measured values and the values obtained using the empirical equations, the support vector machine (polynomial and RBF) models, and the linear regression techniques. The test results demonstrated that the SVM-based (polynomial and RBF) models, along with the empirical equations and linear regression techniques, provided acceptable predictions of the measured air entrainment rate and aeration efficiency with reasonable accuracy. A sensitivity analysis was also performed to study the impact of the input parameters on the outputs, i.e., the air entrainment rate and aeration efficiency.
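
To make the regression set-up concrete, the sketch below fits polynomial- and RBF-kernel support vector regressors to a synthetic table of drop height, discharge and vertex angle against aeration efficiency; the variable ranges and the toy response function are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Columns: drop height (m), discharge (l/s), vertex angle (deg) -- illustrative ranges
X = rng.uniform([0.2, 1.0, 30.0], [1.5, 10.0, 150.0], size=(80, 3))
# Toy stand-in for measured aeration efficiency
y = 0.35 * X[:, 0] + 0.02 * X[:, 1] - 0.001 * X[:, 2] + rng.normal(0, 0.02, 80)

for kernel in ("poly", "rbf"):
    model = make_pipeline(StandardScaler(), SVR(kernel=kernel, C=10.0, epsilon=0.01))
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"SVR ({kernel}): mean cross-validated R^2 = {r2:.3f}")
```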

Keywords: air entrainment rate, dissolved oxygen, weir, SVM, regression

Procedia PDF Downloads 436
3652 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network

Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Sadiye Didem Boztepe Erkis, Ismail Saritas, Selma Tasdemir

Abstract:

Artificial intelligence applications are commonly used in many fields of industry, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and experiments on polished materials were carried out with this set-up. Using the experimental data, an artificial neural network (ANN) was modelled in order to evaluate the final cross-sections of the wooden samples remaining after the fire. In the modelling, experimental data obtained from the fire room were used. In the developed system, the initial weight of the samples (ws, g), the preliminary cross-section (pcs, mm²), the fire time (ft, min) and the fire temperature (t, °C) were taken as input parameters, and the final cross-section (fcs, mm²) as the output parameter. When the results obtained from the ANN and the experimental data were compared through statistical analyses, the two groups were found to be coherent, with no significant difference between them. As a result, the ANN can be safely used to determine the cross-sections of wooden materials after fire, and it avoids many of the disadvantages of experimental testing.
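
A minimal sketch of such an ANN is given below; the network size, data ranges and toy target relation are illustrative assumptions, not the authors' experimental data. It maps the four fire-room inputs to the final cross-section with an MLP regressor.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# Columns: initial weight (g), preliminary cross-section (mm^2), fire time (min), temperature (°C)
X = rng.uniform([200, 2000, 5, 300], [800, 6000, 60, 900], size=(150, 4))
# Toy target: cross-section shrinks with longer, hotter exposure (placeholder relation)
y = X[:, 1] * np.exp(-0.004 * X[:, 2] * X[:, 3] / 900) + rng.normal(0, 50, 150)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0))
ann.fit(X_tr, y_tr)
print("test R^2:", ann.score(X_te, y_te))
```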

Keywords: artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance

Procedia PDF Downloads 385
3651 Object-Based Flow Physics for Aerodynamic Modelling in Real-Time Environments

Authors: William J. Crowther, Conor Marsh

Abstract:

Object-based flow simulation allows fast computation of arbitrarily complex aerodynamic models made up of simple objects with limited flow interactions. The proposed approach is universally applicable to objects made from arbitrarily scaled ellipsoid primitives at arbitrary aerodynamic attitude and angular rate. The use of a component-based aerodynamic modelling approach increases efficiency by allowing selective inclusion of different physics models at run-time and allows extensibility through the development of new models. Insight into the numerical stability of the model under first-order fixed-time-step integration schemes is provided by a stability analysis of the drag component. The compute cost of model components and functions is evaluated and compared against numerical benchmarks. Static model outputs are verified against theoretical expectations, and dynamic behaviour is verified using falling-plate data from the literature. The model is applied to a range of case studies to demonstrate its extensibility, ease of use, and low computational cost. Dynamically complex multi-body systems can be implemented in a transparent and efficient manner, and we successfully demonstrate large scenes with hundreds of objects interacting with diverse flow fields.
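
The kind of stability constraint referred to for the drag component can be illustrated with a toy example (a sketch of my own, not the paper's model): for a linear drag term m dv/dt = -c v, a forward-Euler update v_{n+1} = v_n (1 - c Δt / m) stays bounded only when Δt < 2m/c.

```python
# Forward-Euler integration of m*dv/dt = -c*v with two fixed time steps.
m, c = 1.0, 5.0                 # illustrative mass (kg) and drag coefficient (kg/s)
for dt in (0.1, 0.5):           # stability limit here is dt < 2*m/c = 0.4 s
    v = 10.0
    for _ in range(50):
        v += dt * (-c * v / m)  # bounded decay for dt=0.1, divergence for dt=0.5
    print(f"dt = {dt:.1f} s -> v after 50 steps = {v:.3e} m/s")
```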

Keywords: aerodynamics, real-time simulation, low-order model, flight dynamics

Procedia PDF Downloads 102
3650 Tide Contribution in the Flood Event of Jeddah City: Mathematical Modelling and Different Field Measurements of the Groundwater Rise

Authors: Aïssa Rezzoug

Abstract:

This paper aims to present new evidence that the tide causes the groundwater to rise in the shoreline band on which urban areas are located, especially in the western coastal cities of the Kingdom of Saudi Arabia, such as Jeddah. The recent inundation events in Jeddah were caused by a groundwater rise in the city coupled with a strong precipitation event. This paper illustrates the significant contribution of the tide to the rise of the groundwater level. It shows that internal groundwater recharge within the urban area is not only due to the excess water supply coming from surrounding areas as a result of human activity and the lack of a sufficient and efficient sewage system, but also due to the tide effect. The research follows a quantitative method to assess groundwater level rise risks through many in-situ measurements and mathematical modelling. The proposed approach shows that the groundwater level in the urban areas of the city, on the shoreline band, reaches the high tide level without considering any input from precipitation. Despite the small tide in the Red Sea compared to other oceanic coasts, the groundwater level is considerably enhanced by the tide from the seaside and by the freshwater table from the landside of the city. Under these conditions, the groundwater level in the city becomes high and prevents the soil from evacuating quickly enough the surface flow caused by a storm event, as was observed in the historical flood catastrophe of Jeddah in 2009.

Keywords: flood, groundwater rise, Jeddah, tide

Procedia PDF Downloads 114
3649 Human Computer Interaction Using Computer Vision and Speech Processing

Authors: Shreyansh Jain Jeetmal, Shobith P. Chadaga, Shreyas H. Srinivas

Abstract:

The Internet of Things (IoT) is seen as the next major step in the ongoing revolution of the Information Age. It is predicted that in the near future billions of embedded devices will be communicating with each other to perform a plethora of tasks with or without human intervention. One of the major hotbeds of research activity in IoT is Human Computer Interaction (HCI). HCI is used to facilitate communication between an intelligent system and a user. An intelligent system typically consists of various sensors, actuators and embedded controllers which communicate with each other to monitor data collected from the environment. Communication from the user to the system is typically done using voice. One of the major ongoing applications of HCI is in home automation as a personal assistant. The prime objective of our project is to implement a use case of HCI for home automation. Our system is designed to detect and recognize users and personalize the appliances in the house according to their individual preferences. Our HCI system is also capable of speaking to the user when certain commands are given, such as searching the web for information and controlling appliances. The system can also monitor the environment in the house, such as air quality and gas leakages, for added safety.

Keywords: human computer interaction, internet of things, computer vision, sensor networks, speech to text, text to speech, android

Procedia PDF Downloads 362
3648 Structural Testing and the Finite Element Modelling of Anchors Loaded Against Partially Confined Surfaces

Authors: Ali Karrech, Alberto Puccini, Ben Galvin, Davide Galli

Abstract:

This paper summarises the laboratory tests, numerical models and statistical approach developed to investigate the behaviour of concrete blocks loaded in shear through metallic anchors. The research bridges a gap in the state of the art and practice related to anchors loaded against partially confined concrete surfaces. Eight concrete blocks (420 mm x 500 mm x 1000 mm) with 150 mm and/or 250 mm deep anchors were tested. The stainless-steel anchors, 16 mm in diameter, were bonded with HIT-RE 500 V4 injection epoxy resin and were subjected to shear loading against partially supported edges. In addition, finite element models were constructed to validate the laboratory tests and explore the influence of key parameters such as anchor depth, anchor distance from the edge, and compressive strength on the stability of the block. Once validated experimentally, the numerical results were used to populate, develop and interpret a systematic parametric study based on the Design of Experiments approach, through the Box-Behnken design and Response Surface Methodology. An empirical model has been derived from this approach, which predicts the load capacity within the desired confidence intervals.

Keywords: finite element modelling, design of experiment, response surface methodology, Box-Behnken design, empirical model, interval of confidence, load capacity

Procedia PDF Downloads 24
3647 Concussion Prediction for Speed Skater Impacting on Crash Mats by Computer Simulation Modeling

Authors: Yilin Liao, Hewen Li, Paula McConvey

Abstract:

Concussion in speed skaters often occurs when skaters fall on the ice and impact the crash mats during practices and competition races. Gaining insight into these impact interactions is of essential interest, as they are directly related to skaters' potential health risks and injuries. Precise concussion measurements are very difficult to obtain, making computer simulation the only reliable way to analyze such accidents. This research aims to create multi-body models of the crash mat and skater using SolidWorks, develop a computer simulation model of the skater-mat impact using ANSYS software, and predict the skater's concussion severity by evaluating the head injury criterion (HIC) from the resulting accelerations. The developed method and results help in understanding the relationship between impact parameters and concussion risk for speed skaters and inform the design of crash mats and skating rink layouts by taking athletes' health risks into account.
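
For reference, the head injury criterion mentioned above is commonly evaluated as HIC = max over windows [t1, t2] of (t2 - t1) * [(1/(t2 - t1)) ∫ a(t) dt]^2.5, with acceleration in g and the window capped at 15 or 36 ms. The sketch below implements this definition directly on a sampled acceleration history; it is a generic illustration, not the authors' post-processing code, and the toy pulse is an assumption.

```python
import numpy as np

def hic(t, a_g, max_window=0.015):
    """Head injury criterion for an acceleration history a_g (in g) sampled at times t (s)."""
    # Running trapezoidal integral of a(t) dt, so each window average is a difference of two values.
    cum = np.concatenate(([0.0], np.cumsum(0.5 * (a_g[1:] + a_g[:-1]) * np.diff(t))))
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            window = t[j] - t[i]
            if window > max_window:
                break
            avg = (cum[j] - cum[i]) / window
            best = max(best, window * avg ** 2.5)
    return best

# Toy half-sine impact pulse: 80 g peak over 10 ms.
t = np.linspace(0.0, 0.02, 400)
a = 80.0 * np.sin(np.pi * t / 0.01) * (t <= 0.01)
print(f"HIC15 = {hic(t, a):.0f}")
```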

Keywords: computer simulation modeling, concussion, impact, speed skater

Procedia PDF Downloads 140
3646 Investigating the Shear Behaviour of Fouled Ballast Using Discrete Element Modelling

Authors: Ngoc Trung Ngo, Buddhima Indraratna, Cholachat Rujikiathmakjornr

Abstract:

For nearly two centuries, the design of railway tracks has remained practically unchanged. Traditionally, rail tracks are placed on a ballast layer for several reasons, including economy, rapid drainage, and high load-bearing capacity. The primary function of ballast is to distribute dynamic track loads to the sub-ballast and subgrade layers, while also providing lateral resistance and allowing rapid drainage. Under repeated train loads, the ballast becomes fouled due to ballast degradation and the intrusion of fines, which adversely affects the strength and deformation behaviour of the ballast. This paper presents the use of the three-dimensional discrete element method (DEM) in studying the shear behaviour of fouled ballast subjected to direct shear loading. Irregularly shaped ballast particles were modelled by grouping many spherical balls together in appropriate sizes to simulate representative ballast aggregates. Fouled ballast was modelled by injecting a specified number of miniature spherical particles into the void spaces. The DEM simulation highlights that the peak shear stress of the ballast assembly decreases and the dilation of the fouled ballast increases with an increasing level of fouling. Additionally, the distributions of contact force chains and particle displacement vectors were captured during the shearing process, explaining the formation of the shear band and the evolution of the volumetric change of the fouled ballast.

Keywords: railway ballast, coal fouling, discrete element modelling, discrete element method

Procedia PDF Downloads 451
3645 Modern Scotland Yard: Improving Surveillance Policies Using Adversarial Agent-Based Modelling and Reinforcement Learning

Authors: Olaf Visker, Arnout De Vries, Lambert Schomaker

Abstract:

Predictive policing refers to the usage of analytical techniques to identify potential criminal activity and has been widely implemented by various police departments. Being a relatively new area of research, there are, to the authors' knowledge, no tried-and-true methods, and existing approaches still exhibit a variety of potential problems. One of those problems is closely related to the lack of understanding of how acting on these predictions influences crime itself. The goal of law enforcement is ultimately crime reduction; as such, a policy needs to be established that best facilitates this goal. This research aims to find such a policy by using adversarial agent-based modelling in combination with modern reinforcement learning techniques. We present baseline models for both law enforcement and criminal agents and compare their performance to their respective reinforcement learning models. The experiments show that our smart law enforcement model is capable of reducing crime by making more deliberate choices regarding the locations of potential criminal activity. Furthermore, it is shown that the smart criminal model exhibits behaviour consistent with popular crime theories and outperforms the baseline model in terms of crimes committed and time to capture. It does, however, still suffer from the difficulties of capturing long-term rewards and learning how to handle multiple opposing goals.
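
As a bare-bones illustration of the reinforcement learning ingredient (an assumed toy set-up, unrelated to the authors' environment), the sketch below lets a patrol agent learn which of a handful of hotspots to visit given a fixed, hidden incident probability per location; with a single state, the tabular Q-learning update reduces to a bandit-style value update.

```python
import numpy as np

rng = np.random.default_rng(3)
crime_prob = np.array([0.1, 0.4, 0.25, 0.05])    # hidden chance of an incident at each hotspot
n_actions = len(crime_prob)
q = np.zeros(n_actions)                           # value of patrolling each hotspot
alpha, epsilon = 0.1, 0.1                         # learning rate and exploration rate

for episode in range(5000):
    a = rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(q))
    reward = 1.0 if rng.random() < crime_prob[a] else 0.0   # crime intercepted at patrolled spot
    q[a] += alpha * (reward - q[a])               # one-step update (no successor state in this toy case)

print("learned patrol values:", np.round(q, 2), "-> preferred hotspot:", int(np.argmax(q)))
```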

Keywords: adversarial, agent based modelling, predictive policing, reinforcement learning

Procedia PDF Downloads 148
3644 Stimulating Young Children's Social Interaction Behaviour through Computer Play Activities: The Role of Teachers and Parents Support

Authors: Mahani Razali, Nordin Mamat

Abstract:

The purpose of the study is to explore how computer technology is integrated into pre-school activities and its relationship with children's social interaction behaviour in the pre-school classroom. The major question of interest in the present study is the social interaction behaviour of children when using computers in the Malaysian pre-school classroom. This research is based on three main objectives: to identify children's social interaction during computer play activities, the teacher's role, and the parents' participation in developing children's social interaction. This qualitative study was carried out among 25 pre-school children, three teachers and three parents as the research sample. Parents' support was captured through their discussions, supervision and communication at home. The data collection procedures involved structured observation, which was used to identify social interaction behaviour among the pre-school children during computer play activities, and semi-structured interviews, which were conducted to study the perceptions of the teachers and parents on the social interaction behaviour acquired by the children. In addition, documentation analysis was used to triangulate the acquired information with the observations and interviews. In this study, the qualitative data analysis was tabulated in a descriptive manner with frequencies and percentages. The study primarily focused on the elements of social interaction behaviour among the pre-school children. The findings revealed that the children showed positive outcomes in social interaction behaviour during their computer play. This research concludes that the teacher's role and parents' support can improve children's social interaction behaviour through computer play activities. As a whole, this research highlights the significance of computer play activities in stimulating social interaction behaviour among pre-school children.

Keywords: early childhood, emotional development, parent support, play

Procedia PDF Downloads 366
3643 3-Dimensional (3D) Assessment of the Hippocampus in Alzheimer’s Disease

Authors: Mehmet Bulent Ozdemir, Sultan Çagirici, Sahika Pinar Akyer, Fikri Turk

Abstract:

Neuroanatomical appearance can be correlated with clinical or other characteristics of an illness. With the introduction of diagnostic imaging machines producing 3D images of anatomic structures, calculating the correlations between subjects and the patterns of the structures has become possible. The aim of this study is to examine the 3D structure of the hippocampus in cases of Alzheimer disease at different dementia severities. For this purpose, MR scans of 62 female and 38 male patients (age range between 52 and 88) were imported to the computer. 3D models of each right and left hippocampus were developed with a computer-aided programme, Surf Driver 3.5. Every reconstruction was performed by the same investigator. The hippocampi showed a range of appearances, from normal to abnormal. In conclusion, these results might improve the understanding of the correlation between the morphological changes in the hippocampus and the clinical staging of Alzheimer disease.

Keywords: Alzheimer disease, hippocampus, computer-assisted anatomy, 3D

Procedia PDF Downloads 481
3642 A Finite Element Model to Study the Behaviour of Corroded Reinforced Concrete Beams Repaired with Near Surface Mounted Technique

Authors: B. Almassri, F. Almahmoud, R. Francois

Abstract:

Near surface mounted (NSM) reinforcement is one of the promising techniques used nowadays to strengthen reinforced concrete (RC) structures. In the NSM technique, Carbon Fibre Reinforced Polymer (CFRP) rods are placed inside pre-cut grooves and are bonded to the concrete with epoxy adhesive. This paper studies the non-classical mode of failure, the separation of the concrete cover, using experimental and numerical FE modelling results. Experimental results and numerical results from a 3D finite element (FE) model built in the commercial software Abaqus and a 2D FE model built in FEMIX were obtained for two beams, one corroded (after a 25-year corrosion procedure) and one control beam (A1CL3-R and A1T-R), each repaired in bending using an NSM CFRP rod and then tested up to failure. The results showed that the NSM technique increased the overall capacity of both the control and corroded beams, despite a non-classical mode of failure, with separation of the concrete cover occurring in the corroded beam due to the damage induced by corrosion. Another FE model used external steel stirrups around the repaired corroded beam A1CL3-R, which had failed by separation of the concrete cover; this model showed a change in the mode of failure from the non-classical separation of the concrete cover to the same mode of failure as the repaired control beam, i.e., crushing of the compressed concrete.

Keywords: corrosion, repair, reinforced concrete, FEM, CFRP, FEMIX

Procedia PDF Downloads 164
3641 An Anthropometric and Postural Risk Assessment of Students in Computer Laboratories of a State University

Authors: Sarah Louise Cruz, Jemille Venturina

Abstract:

Ergonomics considers the capabilities and limitations of a person as they interact with tools, equipment, facilities and tasks in their work environment. The workplace, be it a workbench or a desk, is one example of a physical work environment. In school laboratories, sitting is the most common working posture of students, and they maintain a static sitting posture as they perform different computer-aided activities. The College of Engineering and the College of Information and Communication Technology of a State University have twenty-two computer laboratories between them. Students are not usually aware of the importance of sustaining proper sitting posture during their long computer laboratory activities. The study evaluates the perceived discomfort and working postures of students as they are exposed to the current workplace design of the computer laboratories. It utilizes the Rapid Upper Limb Assessment (RULA), a Body Discomfort Chart using Borg's CR-10 scale rating, and the Quick Exposure Checklist in order to assess posture and the current working conditions. The results of the study may help minimize the body discomfort experienced by the students. The researchers redesigned the individual workstations, including the working desk, sitting stool and other workplace design components. The economic viability of each alternative was also considered, given that the study focused on the improvement of the facilities of a state university.

Keywords: computer workstation, ergonomics, posture, students, workplace

Procedia PDF Downloads 310
3640 Study of Polychlorinated Dibenzo-P-Dioxins and Dibenzofurans Dispersion in the Environment of a Municipal Solid Waste Incinerator

Authors: Gómez R. Marta, Martín M. Jesús María

Abstract:

The general aim of this paper is to identify the areas of highest concentration of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) around the incinerator through the use of dispersion models. Atmospheric dispersion models are useful tools for estimating and preventing the impact of emissions from a particular source on air quality. These models allow different factors that influence air pollution to be considered, namely source characteristics, the topography of the receiving environment and weather conditions, in order to predict pollutant concentrations. After their emission into the atmosphere, PCDD/Fs are deposited on water or land, near or far from the emission source, depending on the size of the associated particles and on the climatology. In this way, they are transferred and mobilized through environmental compartments. The modelling of PCDD/Fs was carried out with the following tools: the Atmospheric Dispersion Modelling System (ADMS) and Surfer. ADMS is a Gaussian plume dispersion model used to model the air quality impact of industrial facilities, and Surfer is a surface-mapping program used to represent the dispersion of pollutants on a map. For the modelling of emissions, the ADMS software mainly requires the following input parameters: characterization of the emission sources (source type, height, diameter, release temperature, flow rate, etc.) and meteorological and topographical data (coordinate system). The study area was set at 5 km around the incinerator, and the nearest population centre to the PCDD/F emission source is approximately 2.5 km away. Data on both the PCDD/F emissions of the incinerator and the meteorology in the study area were collected during one year (2013). The study was carried out over the averaging periods established by legislation; that is to say, the output parameters take the current legislation into account. Once all the data required by the ADMS software, described previously, were entered, the modelling was carried out in order to represent the spatial distribution of PCDD/F concentrations and the areas affected by them. In general, the dispersion plume follows the direction of the predominant winds (southwest and northeast). Total levels of PCDD/Fs usually found in air samples are below 2 pg/m³ in remote rural areas, 2-15 pg/m³ in urban areas and 15-200 pg/m³ in areas near important sources, such as an incinerator. The dispersion maps show that the maximum concentrations are of the order of 10⁻⁸ ng/m³, well below the values considered typical for areas close to an incinerator, as in this case.
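
To show what a Gaussian plume calculation of the kind ADMS performs looks like in its simplest textbook form (a sketch with illustrative Briggs-type dispersion coefficients for neutral conditions, not the ADMS implementation or the study's parameters):

```python
import numpy as np

def plume_concentration(x, y, z, Q, u, H):
    """Ground-reflected Gaussian plume concentration (g/m^3) at downwind distance x (m),
    crosswind offset y (m) and height z (m), for emission rate Q (g/s), wind speed u (m/s)
    and effective stack height H (m). Sigma values: Briggs rural fits for stability class D."""
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2 * sigma_z**2)))      # reflection at the ground
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative numbers only: 1 mg/s source, 3 m/s wind, 40 m effective stack height,
# evaluated 2.5 km downwind at breathing height.
print(plume_concentration(x=2500.0, y=0.0, z=1.5, Q=1e-3, u=3.0, H=40.0), "g/m^3")
```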

Keywords: atmospheric dispersion, dioxin, furan, incinerator

Procedia PDF Downloads 216
3639 A Study on the Impacts of Computer Aided Design on the Architectural Design Process

Authors: Halleh Nejadriahi, Kamyar Arab

Abstract:

Computer-aided design (CAD) tools have been extensively used by architects for several decades. CAD has evolved from a simple drafting tool into intelligent architectural software and a powerful means of communication for architects. It plays an essential role in the profession of architecture and is a basic tool for any architectural firm; given the high demand and competition in the architectural industry, it is not possible for a firm to compete without taking advantage of computer software. The aim of this study is to evaluate the impacts of CAD on the architectural design process from the conceptual level to the final product, particularly in architectural practice. It examines the range of benefits of integrating CAD into the industry and discusses the possible drawbacks limiting architects. The method of this study is qualitative, based on data collected from professionals' perspectives. The identified benefits and limitations of CAD in the architectural design process will raise professionals' awareness of the potential of CAD and its proper utilization in the industry, which would result in higher productivity along with better quality in architectural offices.

Keywords: architecture, architectural practice, computer aided design (CAD), design process

Procedia PDF Downloads 360
3638 A Comparative Study of Virus Detection Techniques

Authors: Sulaiman Al amro, Ali Alkhalifah

Abstract:

The growing number of computer viruses and the detection of zero-day malware have been a concern for security researchers for a long period of time. Existing antivirus products (AVs) rely on detecting virus signatures, which does not provide a full solution to the problems associated with these viruses. The use of logic formulae to model the behaviour of viruses is one of the most encouraging recent developments in virus research, providing alternatives to classic virus detection methods. In this paper, we propose a comparative study of different virus detection techniques. The paper presents the advantages and drawbacks of the different detection techniques and discusses which technique is more effective at detecting computer viruses.

Keywords: computer viruses, virus detection, signature-based, behaviour-based, heuristic-based

Procedia PDF Downloads 484
3637 2D and 3D Unsteady Simulation of the Heat Transfer in the Sample during Heat Treatment by Moving Heat Source

Authors: Zdeněk Veselý, Milan Honner, Jiří Mach

Abstract:

The aim of this work is to establish 2D and 3D models of the direct unsteady problem of sample heat treatment by a moving heat source, employing a computer model based on the finite element method. The complex boundary condition on the heat-loaded sample surface is the essential feature of the problem. The computer model describes the heat treatment of the sample as the heat source moves over the sample surface. The 2D problem of the sample cross-section is taken as the basic model, and the possibilities of extending it from a 2D to a 3D problem are discussed. The effect of adding the third model dimension on the temperature distribution in the sample is shown, and the effects of various model parameters on the sample temperatures are compared. The influence of the heat source motion on the depth of material heat treatment is shown for several velocities of the movement. The presented computer model is prepared for use in the laser treatment of machine parts.

Keywords: computer simulation, unsteady model, heat treatment, complex boundary condition, moving heat source

Procedia PDF Downloads 393
3636 Information Technology Approaches to Literature Text Analysis

Authors: Ayse Tarhan, Mustafa Ilkan, Mohammad Karimzadeh

Abstract:

Science was considered part of philosophy in ancient Greece. By the nineteenth century, it was understood that philosophy was very inclusive and that social and human sciences such as literature, history, and psychology should be separated and perceived as autonomous branches of science. The computer was also first seen as a tool of mathematical science. Over time, computer science has grown to encompass every area in which technology exists, and its growth compelled the division of computer science into different disciplines, just as philosophy had been divided into different branches of science. Now there is almost no branch of science in which computers are not used. One of the newer autonomous disciplines of computer science is digital humanities, and one of its areas is literature. The material of literature is words, and thanks to software tools created with computer programming languages, analyses that a literature researcher would need months to complete can be carried out quickly and objectively. In this article, three different tools that literary researchers can use in their work are introduced. These tools were created with the computer programming languages Python and R and brought to the world of literature. The purpose of introducing them is to set an example for the development of specialised tools or programs for Ottoman language and literature in the future and to support such initiatives. The first example is the stylometry tool developed with the R language. The second is The Metrical Tool, which is used to measure data in poems and was developed with Python. The last literature analysis tool in this article is Voyant Tools, which is a multifunctional and easy-to-use tool.
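
As a small illustration of what a stylometric comparison involves (a generic Burrows'-Delta-style sketch in Python, not the R tool described in the article), the snippet below compares a disputed text to two candidate authors using z-scored frequencies of the most common words; the texts are placeholders.

```python
import re
from collections import Counter
import numpy as np

def frequencies(text, vocab):
    words = re.findall(r"\w+", text.lower())
    counts = Counter(words)
    total = sum(counts.values()) or 1
    return np.array([counts[w] / total for w in vocab])

texts = {
    "author_A": "the sea was calm and the night was long and the sailors slept",
    "author_B": "a storm rose quickly over a dark horizon while a captain watched",
    "disputed": "the night was dark and the sea rose while the sailors watched",
}
corpus_counts = Counter(re.findall(r"\w+", " ".join(texts.values()).lower()))
vocab = [w for w, _ in corpus_counts.most_common(20)]        # most frequent (function) words

F = np.array([frequencies(t, vocab) for t in texts.values()])
Z = (F - F.mean(axis=0)) / (F.std(axis=0) + 1e-12)           # z-scores across the corpus
delta_to_A = np.mean(np.abs(Z[2] - Z[0]))                    # Burrows' Delta distances
delta_to_B = np.mean(np.abs(Z[2] - Z[1]))
print(f"Delta to author_A: {delta_to_A:.3f}, to author_B: {delta_to_B:.3f}")
```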

Keywords: DH, literature, information technologies, stylometry, the metrical tool, voyant tools

Procedia PDF Downloads 151
3635 Management and Evaluation of Developing Medical Device Software in Compliance with Rules

Authors: Arash Sepehri bonab

Abstract:

One of the areas of critical development in medical devices has been software: as an integral component of a medical device, as a standalone device and, more recently, as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for its qualification or not as a medical device. It is, therefore, essential to clarify some criteria for the qualification of standalone software as a medical device. The number of software products and medical apps is continually increasing, as is their use in health institutions (e.g., in clinics and doctors' surgeries) for diagnosis and treatment. In the last decade, the use of information technology in healthcare has played a growing role. Indeed, the adoption of an increasing number of computing devices has brought several benefits related to the process of patient care and has allowed easier access to social and healthcare resources. At the same time, this trend has given rise to new challenges related to the use of these new technologies. Software used in healthcare can be classified as a medical device depending on the way it is used and on its functional characteristics, and if it is classified as a medical device, it must fulfil specific regulations. The aim of this work is to present a software development framework that enables the production of safe, high-quality medical device software and to highlight the correspondence between each software development stage and the appropriate standard and/or regulation.

Keywords: medical devices, regulation, software, development, healthcare

Procedia PDF Downloads 108
3634 System-Wide Impact of Energy Efficiency in the Industry Sector: A Comparative Study between Canada and Denmark

Authors: M. Baldini, H. K. Jacobsen, M. Jaccard

Abstract:

In light of the international efforts to comply with the Paris Agreement and with emission targets for future energy systems, Denmark and Canada are among the front-runner countries dealing with climate change. Their experiences in the energy sector have seen both countries coping with trade-offs between investments in renewable energy technologies and energy efficiency, thus tackling the climate issue from the supply side and the demand side, respectively. On the demand side, the industrial sector is going through a remarkable transformation, with the implementation of energy efficiency measures, changes of input fuel for end-use processes and forecasted electrification as the main features under the spotlight. By looking at the experiences of Canada and Denmark as pathfinders on the demand-side and supply-side approaches to climate change, it is possible to obtain valuable lessons that may be applied to other countries aiming at the same goal. This paper presents a comparative study on industrial energy efficiency between Canada and Denmark. The study focuses on technologies and system options, policy design and implementation, and modelling methodologies for representing industrial energy savings in optimization models in comparison to simulation models. The study identifies gaps and junctures in the approaches towards climate change actions and, by learning from each other, lessens the differences in order to further foster the adoption of energy efficiency measures in the industrial sector, aiming at reducing energy consumption and, consequently, CO₂ emissions.

Keywords: industrial energy efficiency, comparative study, CO₂ reduction, energy system modelling

Procedia PDF Downloads 172
3633 Modelling Fluidization by Data-Based Recurrence Computational Fluid Dynamics

Authors: Varun Dongre, Stefan Pirker, Stefan Heinrich

Abstract:

Over the last decades, the numerical modelling of fluidized bed processes has become feasible even for industrial processes. Commonly, continuous two-fluid models are applied to describe large-scale fluidization. In order to allow for coarse grids, novel two-fluid models account for unresolved sub-grid heterogeneities. However, computational efforts remain high, in the order of several hours of compute time for a few seconds of real time, thus preventing the representation of long-term phenomena such as heating or particle conversion processes. In order to overcome this limitation, data-based recurrence computational fluid dynamics (rCFD) has been put forward in recent years. rCFD can be regarded as a data-based method that relies on the numerical predictions of a conventional short-term simulation. These data are stored in a database and then used by rCFD to efficiently time-extrapolate the flow behaviour at high spatial resolution. This study compares the numerical predictions of rCFD simulations with those of corresponding full CFD reference simulations for lab-scale and pilot-scale fluidized beds. In assessing the predictive capabilities of the rCFD simulations, we focus on solid mixing and secondary gas holdup. We observed that predictions made by rCFD simulations are highly sensitive to numerical parameters such as the diffusivity associated with face swaps. We achieved a computational speed-up of four orders of magnitude (10,000 times faster than a classical TFM simulation), eventually allowing for real-time simulations of fluidized beds. In the next step, we apply a checkerboarding technique by introducing gas tracers subjected to convection and diffusion. We then analyze the concentration profiles by observing the mixing and transport of the gas tracers, gaining insight into their convective and diffusive patterns, and extending the approach towards heat and mass transfer methods. Finally, we run rCFD simulations and calibrate them with numerical and physical parameters compared with a conventional two-fluid model (full CFD) simulation. As a result, this study gives a clear indication of the applicability, predictive capabilities, and existing limitations of rCFD in the realm of fluidization modelling.

Keywords: multiphase flow, recurrence CFD, two-fluid model, industrial processes

Procedia PDF Downloads 75
3632 Emerging Cyber Threats and Cognitive Vulnerabilities: Cyberterrorism

Authors: Oludare Isaac Abiodun, Esther Omolara Abiodun

Abstract:

The purpose of this paper is to demonstrate that cyberterrorism exists and poses a threat to computer security and national security. Nowadays, people have become increasingly dependent upon computers, phones, the Internet, and Internet of Things systems to share information, communicate, conduct searches, etc. However, these network systems are at risk from different sources, both known and unknown. These risks are created by malicious individuals, groups, organizations, or governments who take advantage of vulnerabilities in computer systems to obtain sensitive information from people, organizations, or governments. In doing so, they engage in computer threats, crime, and terrorism, thereby making the use of computers insecure for others. The threat of cyberterrorism takes various forms and ranges from one country to another. These threats include disrupting communications and information, stealing data, destroying data, leaking and breaching data, interfering with messages and networks, and in some cases, demanding financial rewards for stolen data. This study therefore identifies many ways in which cyberterrorists utilize the Internet as a tool to advance their malicious mission, which negatively affects computer security and safety. It also identifies causes of disparate anomalous behaviours and the theoretical, ideological, and current forms of the likelihood of cyberterrorism. As a countermeasure, this paper proposes the use of previous and current computer security models found in the literature to help counter cyberterrorism.

Keywords: cyberterrorism, computer security, information, internet, terrorism, threat, digital forensic solution

Procedia PDF Downloads 96
3631 Modelling and Optimization of Geothermal Energy in the Gulf of Suez

Authors: Amira Abdelhafez, Rufus Brunt

Abstract:

Geothermal energy in Egypt represents a significant untapped renewable resource that can reduce reliance on conventional power generation. Exploiting these geothermal resources depends on depth, temperature range, and geological characteristics. The intracontinental rift setting of the Gulf of Suez (GoS)-Red Sea rift is a favourable tectonic setting for convection-dominated geothermal plays. The geothermal gradient across the GoS ranges from 24.9 to 86.66 °C/km, with a heat flow of 31-127.2 mW/m². Surface expressions of convective heat loss emerge along the gulf flanks as hot springs (e.g., Hammam Faraun) accompanying deeper geothermal resources. These thermal anomalies are driven mainly by the local tectonic configuration. Characterizing the structural framework of major faults and their control on reservoir properties and subsurface hydrothermal fluid circulation is vital for geothermal applications in the gulf. The geothermal play systems of the GoS depend on structural and lithological properties that contribute to heat storage and vertical transport. Potential geothermal reservoirs include the Nubia sandstones, which, due to their thickness, continuity, and contact with hot basement rocks at a mean depth of 3 km, create an extensive reservoir for geothermal fluids. To develop these geothermal resources for energy production, defining the permeability anisotropy of the reservoir due to faults and facies variation is a crucial step in our study, particularly for evaluating its influence on thermal breakthrough and production rates.

Keywords: geothermal, October field, site specific study, reservoir modelling

Procedia PDF Downloads 11