Search results for: Cloud Computing
272 [Keynote Talk]: The Challenges and Solutions for Developing Mobile Apps in a Small University
Authors: Greg Turner, Bin Lu, Cheer-Sun Yang
Abstract:
As computing technology advances, smartphone applications can assist student learning in a pervasive way. For example, a mobile app for PA Common Trees, Pests, and Pathogens serves as an in-field reference tool that allows middle school students to learn about trees and associated pests/pathogens without carrying a textbook. In the past, some researchers have studied the Mobile Application Development Life Cycle (MADLC), including traditional models such as the waterfall model or more recent Agile methods. Others have studied issues related to the software development process. Very little research addresses the simultaneous development of three heterogeneous mobile systems in a small university where the availability of developers is an issue. In this paper, we propose a hybrid of the Waterfall Model and the Agile Model, known in practice as the Relay Race Methodology (RRM), to reflect the concept of racing and relaying for scheduling. Based on the development project, we observe that the modeling of the transition between any two phases is manifested naturally. Thus, we claim that the RRM model can provide a de facto rather than a de jure basis for the core concept in the MADLC. In this paper, the background of the project is introduced first. Then, the challenges are pointed out, followed by our solutions. Finally, the lessons learned and future work are presented.
Keywords: agile methods, mobile apps, software process model, waterfall model
Procedia PDF Downloads 409
271 Integrative Analysis of Urban Transportation Network and Land Use Using GIS: A Case Study of Siddipet City
Authors: P. Priya Madhuri, J. Kamini, S. C. Jayanthi
Abstract:
Assessment of land use and transportation networks is essential for sustainable urban growth, urban planning, efficient public transportation systems, and reducing traffic congestion. The study focuses on land use, population density, and their correlation with the road network for future development. The scope of the study covers the inventory and assessment of the road network dataset (line features) at the city, zonal, or ward level, extracted from very high-resolution satellite data (spatial resolution < 0.5 m) at 1:4000 map scale with ground truth verification. The road network assessment is carried out by computing various indices that measure road coverage and connectivity. In this study, the road network is assessed at the municipal and ward levels. In order to identify gaps, road coverage and connectivity were associated with urban land use, built-up area, and population density in the study area. Ward-wise road connectivity and coverage maps have been prepared. To assess the relationships between road network metrics, correlation analysis is applied. The study's conclusions are extremely beneficial for effective road network planning and for detecting gaps in the road network at the ward level in association with urban land use, the existing built-up area, and population.
Keywords: road connectivity, road coverage, road network, urban land use, transportation analysis
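The abstract does not name the specific indices used; two common choices in transport geography are the beta index (edges per node) and the gamma index (observed edges over the maximum possible in a planar graph). A minimal sketch with networkx, on an invented toy ward graph:

```python
# Hypothetical sketch: per-ward road-network connectivity indices with
# networkx. Beta and gamma are common choices in transport geography; the
# paper's exact indices are not specified in the abstract.
import networkx as nx

def connectivity_indices(G: nx.Graph) -> dict:
    v = G.number_of_nodes()   # road junctions
    e = G.number_of_edges()   # road segments
    beta = e / v if v else 0.0                   # edges per node
    gamma = e / (3 * (v - 2)) if v > 2 else 0.0  # observed / max planar edges
    return {"beta": beta, "gamma": gamma}

# Toy ward graph: nodes are junctions, edges are digitized road segments
G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)])
print(connectivity_indices(G))  # {'beta': 1.25, 'gamma': 0.833...}
```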
Procedia PDF Downloads 33
270 Overview of Multi-Chip Alternatives for 2.5 and 3D Integrated Circuit Packagings
Authors: Ching-Feng Chen, Ching-Chih Tsai
Abstract:
With transistor sizes gradually approaching the physical limit, the persistence of Moore's Law is challenged by issues such as the difficulty of developing high numerical aperture (high-NA) lithography equipment and short-channel effects. In the context of the ever-increasing technical requirements of portable devices and high-performance computing, relying on the law's continuation to enhance chip density will no longer support the prospects of the electronics industry. Weighing a chip's power consumption-performance-area-cost-cycle time to market (PPACC) is an updated benchmark to drive the evolution of advanced nanometer (nm) wafer processes. The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through Silicon Via (TSV) technology has updated traditional die assembly methods and provided a solution. This overview investigates up-to-date and cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on updated transistor structures and technology nodes. The authors conclude that multi-chip solutions for 2.5D and 3D IC packaging are feasible to prolong Moore's Law.
Keywords: Moore's law, high numerical aperture, power consumption-performance-area-cost-cycle time to market, 2.5 and 3D very-large-scale integration, packaging, through silicon via
Procedia PDF Downloads 114
269 Outdoor Visible Light Communication Channel Modeling under Fog and Smoke Conditions
Authors: Véronique Georlette, Sebastien Bette, Sylvain Brohez, Nicolas Point, Veronique Moeyaert
Abstract:
Visible light communication (VLC) is a communication technology that is part of the optical wireless communication (OWC) family. It uses the visible and infrared spectrums to send data. Until now, this technology has widely been studied for indoor use-cases, but it is sufficiently mature nowadays to consider its outdoor potential. The main outdoor challenges are meteorological conditions and the presence of smoke due to fires or pollutants in urban areas. This paper proposes a methodology to assess the robustness of an outdoor VLC system given the outdoor conditions. The methodology is put into practice in two realistic scenarios: a VLC bus stop and a VLC streetlight. It consists of computing the power margin available in the system, given all the characteristics of the VLC system and its surroundings. This is done using an outdoor VLC communication channel simulator developed in Python. The simulator is able to quantify the effects of fog and smoke, using models taken from the environmental and fire engineering literature, and to compute the optical power reaching the receiver. These two phenomena impair communication by increasing the total attenuation of the medium. The main conclusion drawn in this paper is that the attenuation levels due to fog and smoke are of the same order of magnitude, with fog attenuation being the highest when visibility falls below 1 km. This gives a promising prospect for the deployment of outdoor VLC use-cases in the near future.
Keywords: channel modeling, fog modeling, meteorological conditions, optical wireless communication, smoke modeling, visible light communication
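The paper's simulator itself is not reproduced here; for intuition, a widely cited visibility-based fog attenuation model from the optical wireless literature (the Kim model) can be sketched as follows. The wavelength default and the piecewise q(V) are the Kim model's, not necessarily the models the paper uses.

```python
# Minimal sketch of the Kim visibility-based fog attenuation model, common
# in the OWC literature; shown only to illustrate the kind of computation
# an outdoor VLC channel simulator performs.
def kim_attenuation_db_per_km(visibility_km: float, wavelength_nm: float = 550.0) -> float:
    v = visibility_km
    if v > 50:      q = 1.6
    elif v > 6:     q = 1.3
    elif v > 1:     q = 0.16 * v + 0.34
    elif v > 0.5:   q = v - 0.5
    else:           q = 0.0
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)

print(kim_attenuation_db_per_km(1.0))  # ~3.91 dB/km at 550 nm, V = 1 km
```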
Procedia PDF Downloads 150
268 Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition
Authors: Aishwarya Ravindra Fursule, Shruti Kshirsagar
Abstract:
In the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN), logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests, applied to SER. The research employs four datasets: CREMA-D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features like Zero Crossing Rate (ZCR), chroma_stft, Mel Frequency Cepstral Coefficients (MFCC), root mean square (RMS) value, and Mel spectrogram. These features are used to train and evaluate the models' ability to recognize eight types of emotions from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the Random Forest algorithm demonstrated superior performance, achieving approximately 79% accuracy. This suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques. The findings hold promise for the development of more precise emotion recognition systems in the future. This abstract provides a succinct overview of the paper's content, methods, and results.
Keywords: comparison, ML classifiers, KNN, decision tree, SVM, random forest, logistic regression, ensemble classifiers
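A minimal sketch of the feature-extraction step the abstract lists (ZCR, chroma, MFCC, RMS, Mel spectrogram), using librosa with mean pooling per clip and a random forest; the pooling choice, file handling, and hyperparameters are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch: pool each clip's ZCR, RMS, chroma, MFCC and mel-spectrogram
# frames into one feature vector, then train a random forest on the result.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def extract_features(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=None)
    feats = [
        np.mean(librosa.feature.zero_crossing_rate(y)),
        np.mean(librosa.feature.rms(y=y)),
    ]
    feats += list(np.mean(librosa.feature.chroma_stft(y=y, sr=sr), axis=1))
    feats += list(np.mean(librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13), axis=1))
    feats += list(np.mean(librosa.feature.melspectrogram(y=y, sr=sr), axis=1))
    return np.array(feats)

# With wav_paths and labels from one of the four corpora:
# X = np.stack([extract_features(p) for p in wav_paths])
# clf = RandomForestClassifier(n_estimators=300).fit(X, labels)
```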
Procedia PDF Downloads 45
267 Weighted Data Replication Strategy for Data Grid Considering Economic Approach
Authors: N. Mansouri, A. Asadi
Abstract:
Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which is determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage, and storage resource usage.
Keywords: data grid, data replication, simulation, replica selection, replica placement
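For intuition, the response-time-based replica selection could look like the following sketch; the field names, per-unit weights, and toy sites are assumptions, and the paper's actual evaluation runs inside OptorSim.

```python
# Illustrative sketch: score each site holding a replica by an estimated
# response time and pick the minimum, mirroring the factors the abstract
# lists (transfer time, storage latency, queued requests, distance).
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    transfer_time_s: float    # file size / available bandwidth
    storage_latency_s: float  # storage access latency
    queued_requests: int      # requests waiting in the storage queue
    hops: int                 # network distance to the requester

def response_time(s: Site, per_request_s: float = 0.5, per_hop_s: float = 0.1) -> float:
    return (s.transfer_time_s + s.storage_latency_s
            + s.queued_requests * per_request_s + s.hops * per_hop_s)

sites = [Site("A", 4.0, 0.2, 3, 2), Site("B", 5.0, 0.1, 0, 1)]
print(min(sites, key=response_time).name)  # "B": 5.2 s vs A's 5.9 s
```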
Procedia PDF Downloads 260
266 Challenges and Opportunities in Computing Logistics Cost in E-Commerce Supply Chain
Authors: Pramod Ghadge, Swadesh Srivastava
Abstract:
Revenue generation of a logistics company depends on how the logistics cost of a shipment is calculated. The logistics cost of a shipment is a function of the distance and speed of its travel in a particular network, its volumetric size, and its dead weight. Logistics billing is based mainly on the consumption of the scarce resource (the space or weight-carrying capacity of a carrier). A shipment's size or dead weight is a function of product and packaging weight, dimensions, and flexibility. Hence, to arrive at a standard methodology for computing an accurate cost to bill the customer, the interplay among the above-mentioned physical attributes, along with their measurement, plays a key role. This becomes even more complex for an e-commerce company, like Flipkart, which caters to shipments from both warehouses and marketplaces in an unorganized, non-standard market like India. In this paper, we explore various methodologies to define a standard way of billing non-standard shipments across a wide range of sizes, shapes, and dead weights: using historical volumetric/dead weight data to arrive at a factor that can be used to compute the logistics cost of a shipment, and calculating the real/contour volume of a shipment to address the problem of irregular shipment shapes, which cannot be solved by conventional bounding-box volume measurements. We also discuss certain key business practices and operational quality considerations needed to bring standardization and drive appropriate ownership in the ecosystem.
Keywords: contour volume, logistics, real volume, volumetric weight
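The interplay between volumetric size and dead weight reduces, in its simplest form, to the industry-standard billable-weight rule sketched below; the 5000 cm³/kg divisor varies by carrier and is an assumption here.

```python
# Minimal sketch of the standard volumetric-weight billing rule the abstract
# discusses: billable weight is the larger of dead weight and dimensional
# (volumetric) weight computed from bounding-box dimensions.
def billable_weight_kg(l_cm: float, w_cm: float, h_cm: float,
                       dead_weight_kg: float, divisor: float = 5000.0) -> float:
    volumetric_kg = (l_cm * w_cm * h_cm) / divisor
    return max(volumetric_kg, dead_weight_kg)

# A light but bulky parcel is billed on volume, not weight:
print(billable_weight_kg(60, 40, 30, dead_weight_kg=2.0))  # 14.4 kg
```

Contour-volume measurement replaces the l*w*h bounding box above with the shipment's real occupied volume, which is exactly the irregular-shape problem the paper targets.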
Procedia PDF Downloads 269
265 The Impact of Artificial Intelligence on Digital Factory
Authors: Mona Awad Wanis Gad
Abstract:
The process of factory planning has changed considerably, particularly when it comes to planning the factory building itself. Factory planning covers the design of products, plants, processes, organization, areas, and the construction of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Regulations in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity), lead to more frequent restructuring measures within a factory. A digital factory model is the planning foundation for rebuilding measures and becomes a critical tool. Furthermore, digital building models are increasingly being used in factories to support facility management and production processes. First, different types of digital factory models are investigated, and their properties and usability for use cases are analyzed. Within the scope of the research, point cloud models, building information models, photogrammetry models, and models enriched with sensor data are examined. It is investigated which digital models permit a simple integration of sensor data and where the differences lie. Then, possible application areas of digital factory models are determined by a survey, and the respective digital factory models are assigned to the application areas. Finally, an application case from maintenance is selected and implemented with the help of the most suitable digital factory model. It is shown how a fully digitalized maintenance process can be supported by a digital factory model through the provision of data. Among other functions, the digital factory model is used for indoor navigation, data provision, and the display of sensor data. In summary, the paper presents a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. On this basis, the systematic selection of digital factory models with the corresponding application cases is evaluated.
Keywords: augmented reality, digital factory model, factory planning, restructuring, photogrammetry, building information modeling, maintenance
Procedia PDF Downloads 37
264 Pushing the Boundary of Parallel Tractability for Ontology Materialization via Boolean Circuits
Authors: Zhangquan Zhou, Guilin Qi
Abstract:
Materialization is an important reasoning service for applications built on the Web Ontology Language (OWL). To make materialization efficient in practice, current research focuses on deciding the tractability of an ontology language and designing parallel reasoning algorithms. However, some well-known large-scale ontologies, such as YAGO, have been shown to perform well under parallel reasoning even though they are expressed in ontology languages that are not parallelly tractable, i.e., the reasoning is inherently sequential in the worst case. This motivates us to study the problem of parallel tractability of ontology materialization from a theoretical perspective. That is, we aim to identify the ontologies for which materialization is parallelly tractable, i.e., in the complexity class NC. Since NC is defined in terms of Boolean circuits, which are widely used to investigate parallel computing problems, we first transform the materialization problem into the evaluation of Boolean circuits, and then study parallel tractability based on circuits. In this work, we focus on datalog rewritable ontology languages. We use Boolean circuits to identify two classes of datalog rewritable ontologies (called parallelly tractable classes) such that materialization over them is parallelly tractable. We further investigate the parallel tractability of materialization for a datalog rewritable OWL fragment, DHL (Description Horn Logic). Based on the above results, we analyze real-world datasets and show that many ontologies expressed in DHL belong to the parallelly tractable classes.
Keywords: ontology materialization, parallel reasoning, datalog, Boolean circuit
Procedia PDF Downloads 271
263 Segmentation of the Liver and Spleen From Abdominal CT Images Using Watershed Approach
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The segmentation phase is an important step in the processing and interpretation of medical images. In this paper, we focus on the segmentation of the liver and spleen from abdominal computed tomography (CT) images. The importance of our study comes from the fact that segmenting regions of interest (ROIs) from CT images is usually a difficult task: their grey levels are similar to those of other organs, and the ROIs are connected to the ribs, heart, kidneys, etc. Our proposed method is based on anatomical information and mathematical morphology tools used in the image processing field. First, we remove the surrounding and connected organs and tissues by applying morphological filters; this first step makes the extraction of the regions of interest easier. The second step consists of improving the quality of the image gradient; here, we propose a method that reduces gradient deficiencies by applying spatial filters followed by morphological filters. Thereafter, we proceed to the segmentation of the liver and spleen. To validate the proposed segmentation technique, we tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert. The experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented (liver and spleen) contours and the contours manually traced by radiological experts.
Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm
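The evaluation step the abstract names (sensitivity and specificity against expert contours) reduces to pixel-wise counts over binary masks, as in this minimal sketch with toy masks.

```python
# Sketch of the evaluation described in the abstract: sensitivity and
# specificity between the semi-automatic mask and the expert's manual mask,
# computed pixel-wise on binary arrays. Masks here are toy examples.
import numpy as np

def sensitivity_specificity(pred: np.ndarray, truth: np.ndarray):
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.sum(pred & truth)    # organ pixels found by both
    tn = np.sum(~pred & ~truth)  # background agreed by both
    fp = np.sum(pred & ~truth)   # over-segmentation
    fn = np.sum(~pred & truth)   # missed organ pixels
    return tp / (tp + fn), tn / (tn + fp)

pred  = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 1, 0], [1, 1, 0]])
print(sensitivity_specificity(pred, truth))  # (0.75, 1.0)
```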
Procedia PDF Downloads 495
262 Role of mHealth in Effective Response to Disaster
Authors: Mohammad H. Yarmohamadian, Reza Safdari, Nahid Tavakoli
Abstract:
In recent years, many countries have suffered various natural disasters. Disaster response continues to face challenges in the health care sector in all countries. Information and communication management is a significant challenge at a disaster scene. During the last decades, rapid advances in information technology have made it possible to manage information effectively and improve communication in health care settings. Information technology is a vital solution for effective response to disasters and emergencies, so that if an efficient ICT-based health information system is available, it will be highly valuable in such situations. In particular, mobile technology represents a nearly ubiquitous computing infrastructure that is accessible, convenient, inexpensive, and easy to use. Most projects have not yet reached the deployment stage, but evaluation exercises show that mHealth should allow faster processing and transport of patients, improved accuracy of triage, and better monitoring of unattended patients at a disaster scene. Since there is a high prevalence of cell phones among the world population, health care providers and managers are expected to take measures to apply this technology to improve patient safety and public health in disasters. At present, there are challenges to the utilization of mHealth in disasters, such as structural and financial issues in our country. In this paper, we discuss the benefits and challenges of mHealth technology in disaster settings, considering connectivity, usability, intelligibility, communication, and teaching, with a view to implementing this technology for disaster response.
Keywords: information technology, mhealth, disaster, effective response
Procedia PDF Downloads 440
261 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction
Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili
Abstract:
Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, automatically extracting structured fields such as sender name, amount, and VAT rate from them is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that is able to reliably extract all required fields for up to 70% of all documents in the data set, more than any other previously reported method. Approaches are described for 1) detecting, through visual features, which template a given document belongs to, 2) automatically generating extraction rules for a given new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates with as little as one training sample and only requires the ground truth field values instead of detailed annotations such as bounding boxes that are hard to obtain. The system is deployed and used inside commercial accounting software.
Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software
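A toy illustration of composing an extraction regex from reusable components, in the spirit of point 2; the component patterns and the amount format are invented, not the paper's actual rules.

```python
# Illustrative sketch: build a field-extraction regex by joining named
# component patterns, so each new template only recombines known parts.
import re

COMPONENTS = {
    "label":    r"(?:Total|Amount\s+due)",
    "currency": r"(?:EUR|€)",
    "number":   r"\d{1,3}(?:[.,]\d{3})*[.,]\d{2}",
}

def compose(*parts: str) -> re.Pattern:
    return re.compile(r"\s*".join(COMPONENTS[p] for p in parts))

amount_rule = compose("label", "currency", "number")
m = amount_rule.search("Invoice 4711 ... Total EUR 1.234,56")
print(m.group(0))  # "Total EUR 1.234,56"
```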
Procedia PDF Downloads 130
260 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT
Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez
Abstract:
Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved by using state-of-the-art deep learning methods. Another major challenge that ecologists face is counting a single animal multiple times when it reappears in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps so that they can share the captured images along with timestamps, cumulative counts, and dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. This method has been validated in the field and can be easily extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.
Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management
Procedia PDF Downloads 138
259 Effects of Front Porch and Loft on Indoor Ventilation in the Renewal of Beijing Courtyard
Authors: Zhongzhong Zeng, Zichen Liang
Abstract:
In recent years, Beijing courtyards have been facing the problem of renewal and renovation, and residents are confronted with small house areas, large household sizes, and old and dangerous houses. Among the many renovation methods, the authors note two common practices: using the front porch to expand the floor area and adding a loft. Residents and architects, however, gave little consideration to indoor ventilation performance before beginning the remodeling. The aim of this article is to explore the positive or negative impacts of both front porch and loft structures on indoor ventilation in the courtyard; ventilation, in turn, is crucial to the indoor environmental quality of a home. The major method utilized in this study is comparative analysis: the authors create four alternative house models, with or without a front porch and a loft as two variables, and examine indoor ventilation using the CFD (Computational Fluid Dynamics) technique. The results obtained from the analysis of the sectional airflow and the contour map at 1.5 m height show that the loft, to a certain extent, disrupts the airflow organization of the building and makes the high windows on the rear wall less effective. Converting the front porch into indoor floor area has no significant effect on ventilation, but occupying the front porch and adding a loft at the same time should be avoided in building renovations. The findings of this study led to the following recommendations: strive to preserve the courtyard building's original architectural design and adjust only the inappropriate elements or constructions. Ventilation in the loft portion is inadequate, and inhabitants typically use the loft as a living area; this may lead the building to rely more on air conditioning in the summer, which would raise energy demand. The front porch serves as a transition space as well as a source of shade, weather protection, and indoor ventilation. In conclusion, future examinations of interior environments should concentrate on cross-disciplinary, multi-angle, and multi-level research topics.
Keywords: Beijing courtyard renewal, CFD, indoor environment, ventilation analysis
Procedia PDF Downloads 81
258 Neuron Efficiency in Fluid Dynamics and Prediction of Groundwater Reservoirs' Properties Using Pattern Recognition
Authors: J. K. Adedeji, S. T. Ijatuyi
Abstract:
The application of a neural network using pattern recognition to study fluid dynamics and predict the properties of groundwater reservoirs has been used in this research. Conventional manual geophysical survey methods have proven inadequate in basement environments; hence the need for intelligent computing, such as neural network prediction, is inevitable. A non-linear neural network with an XOR (exclusive OR) output in an 8-bit configuration has been used in this research to predict the nature of groundwater reservoirs and the fluid dynamics of a typical basement crystalline rock. The control variables are the apparent resistivity of the weathered layer (p1), the fractured layer (p2), and the depth (h), while the dependent variable is the flow parameter (F=λ). The algorithm used in training the neural network is back-propagation, coded in C++ with 300 epoch runs. The neural network was very effective at mapping out the flow channels and detecting how they behave to form viable storage within the strata. The neural network model showed that an important variable, gr (gravitational resistance), can be deduced from the elevation and the apparent resistivity pa. The model results from SPSS showed that the coefficients a, b, and c are statistically significant, with reduced standard error at the 5% level.
Keywords: gravitational resistance, neural network, non-linear, pattern recognition
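The paper's network was coded in C++ with 300 epoch runs; the following numpy sketch of back-propagation on the XOR problem only illustrates the training loop, with layer sizes, learning rate, and epoch count chosen so the toy case converges.

```python
# Toy back-propagation network on XOR, echoing the abstract's training setup;
# all hyperparameters here are illustrative, not the paper's configuration.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)            # hidden layer
    y = sigmoid(h @ W2 + b2)            # output layer
    d2 = (y - t) * y * (1 - y)          # output delta
    d1 = (d2 @ W2.T) * h * (1 - h)      # hidden delta (back-propagated)
    W2 -= 0.5 * h.T @ d2; b2 -= 0.5 * d2.sum(0, keepdims=True)
    W1 -= 0.5 * X.T @ d1; b1 -= 0.5 * d1.sum(0, keepdims=True)

print(np.round(y.ravel(), 2))  # typically approaches [0, 1, 1, 0]
```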
Procedia PDF Downloads 212
257 A Psychophysiological Evaluation of an Affective Recognition Technique Using Interactive Dynamic Virtual Environments
Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein
Abstract:
Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impacts on human engagement, 'immersion', and related emotional or 'affective' states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR, and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows with 28 different timing lengths (e.g., 2, 3, 5, etc. seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined in a 3-dimensional space of valence, arousal, and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process based on affective clusters or emotion labels.
Keywords: virtual reality, affective computing, affective VR, emotion-based affective physiological database
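The classification and cross-validation stage can be sketched with scikit-learn as below; the data is synthetic, and the fold count, neighbour count, and SVM kernel are assumptions rather than the study's settings.

```python
# Sketch of KNN and SVM with k-fold cross-validation over 30 selected
# psychophysiological features; everything here is synthetic/illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 30))      # 30 features after feature selection
y = rng.integers(0, 4, size=300)    # four affective clusters

for clf in (KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf")):
    scores = cross_val_score(clf, X, y, cv=5)
    print(type(clf).__name__, round(scores.mean(), 3), "+/-", round(scores.std(), 3))
```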
Procedia PDF Downloads 233
256 Understanding Hydrodynamic in Lake Victoria Basin in a Catchment Scale: A Literature Review
Authors: Seema Paul, John Mango Magero, Prosun Bhattacharya, Zahra Kalantari, Steve W. Lyon
Abstract:
The purpose of this review paper is to develop an understanding of lake hydrodynamics and the potential climate impact at the Lake Victoria (LV) catchment scale. The paper briefly discusses the main problems of lake hydrodynamics and their solutions, which are related to quality assessment and climate effects. An empirical methodology in modeling and mapping was considered for understanding lake hydrodynamics and for visualizing long-term observational daily, monthly, and yearly mean dataset results using geographical information system (GIS) and COMSOL techniques. Data were obtained for the whole lake and five different meteorological stations, and several geoprocessing tools with spatial analysis were used to produce the results. Linear regression analyses were developed to build climate scenarios and a linear trend on lake rainfall data over a long period. The potential evapotranspiration rate has been described by MODIS and the Thornthwaite method. The rainfall effect on lake water level was observed through partial differential equations (PDE), and water quality was manifested by a few nutrient parameters. The study revealed that monthly and yearly rainfall varies with monthly and yearly maximum and minimum temperatures, that rainfall is high during cool years, and that high temperature is associated with below-average rainfall patterns. Rising temperatures are likely to accelerate evapotranspiration rates, and more evapotranspiration is likely to lead to more rainfall; drought is more correlated with temperature, and cloud cover is more correlated with rainfall. There is a trend in lake rainfall, and long-term rainfall on the lake water surface has affected the lake level. Onshore and offshore conditions have been characterized using initial nutrient data from the literature. The study recommends that further work consider full lake bathymetry development with flow analysis and water balance, hydro-meteorological processes, solute transport, wind hydrodynamics, and pollution and eutrophication; these are crucial for lake water quality, climate impact assessment, and water sustainability.
Keywords: climograph, climate scenarios, evapotranspiration, linear trend flow, rainfall event on LV, concentration
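The Thornthwaite method mentioned above estimates monthly potential evapotranspiration from mean monthly air temperature alone; a sketch of the unadjusted formula follows (the day-length correction factor is omitted for brevity, and the example temperatures are invented, roughly Lake Victoria-like values).

```python
# Sketch of the unadjusted Thornthwaite potential-evapotranspiration formula:
# PET_m = 16 * (10 * T_m / I)^a  [mm/month], with I the annual heat index.
def thornthwaite_pet(monthly_temps_c):
    I = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_temps_c)  # heat index
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return [16.0 * (10.0 * max(t, 0.0) / I) ** a for t in monthly_temps_c]

temps = [22.5, 23.0, 23.5, 22.8, 22.0, 21.5, 21.0, 21.3, 21.8, 22.2, 22.4, 22.6]
print([round(p, 1) for p in thornthwaite_pet(temps)])  # mm/month, ~80-95
```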
Procedia PDF Downloads 99
255 Integral Form Solutions of the Linearized Navier-Stokes Equations without Deviatoric Stress Tensor Term in the Forward Modeling for FWI
Authors: Anyeres N. Atehortua Jimenez, J. David Lambraño, Juan Carlos Muñoz
Abstract:
Navier-Stokes equations (NSE), which describe the dynamics of a fluid, have an important application in modeling the waves used for data inversion techniques such as full waveform inversion (FWI). In this work, a linearized version of the NSE and its variables, neglecting the deviatoric terms of the stress tensor, is presented. In order to obtain a theoretical model of the pressure p(x,t) and the wave velocity profile c(x,t), a wave equation for a visco-acoustic medium (VAE) is written. A change of variables, p(x,t)=q(x,t)h(ρ), is made in the VAE equation, leading to the well-known Klein-Gordon equation (KGE), which describes waves propagating in a variable-density medium (ρ) with a dispersive term α²(x). The KGE is reduced to a Poisson equation and solved by proposing a specific function for α²(x) that accounts for energy dissipation and dispersion. Finally, an integral form solution is derived for p(x,t), c(x,t), and kinematic variables such as the particle velocity v(x,t), the displacement u(x,t), and the bulk modulus function k_b(x,t). Furthermore, this visco-acoustic formulation is compared with another form broadly used in geophysics; it is argued that this formalism is more general and, given its integral form, may offer several advantages from the modern parallel computing point of view. Applications to minimizing modeling errors in FWI applied to oil resources in geophysics are discussed.
Keywords: Navier-Stokes equations, modeling, visco-acoustic, inversion FWI
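A hedged reconstruction of the equation chain the abstract outlines; the exact forms of h(ρ) and α²(x) are the paper's own and are not reproduced here, and the final time-harmonic reduction shown (which the authors take further to a Poisson equation through their choice of α²(x)) is only the standard route.

```latex
% Hedged sketch of the substitution chain described in the abstract.
\begin{align}
  p(x,t) &= q(x,t)\,h(\rho)
    && \text{change of variables} \\
  \frac{\partial^2 q}{\partial t^2} - c^2(x)\,\nabla^2 q + \alpha^2(x)\,q &= 0
    && \text{Klein--Gordon form, dispersive term } \alpha^2(x) \\
  \nabla^2 \hat{q} + \frac{\omega^2 - \alpha^2(x)}{c^2(x)}\,\hat{q} &= 0
    && \text{time-harmonic ansatz } q = \hat{q}(x)\,e^{-i\omega t}
\end{align}
```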
Procedia PDF Downloads 520
254 Detection of Defects in CFRP by Ultrasonic IR Thermographic Method
Authors: W. Swiderski
Abstract:
In this paper, a diagnostic technique is introduced that makes possible the inspection of internal structures in fibre-reinforced composite materials used in different applications. The main reason for damage in structures made of these materials is the changing distribution of load over a construction's lifetime. Emerging defects are largely complicated by discontinuities in the reinforcing fibres, binder cracks, and loss of fibre adhesion to the binder. Defects in composite materials are usually more complicated than in metals. At present, infrared thermography is the most effective method for the non-destructive testing of composites. One IR thermography method used in non-destructive evaluation is vibrothermography. Vibrothermography is not a new non-destructive method, but the new element in this test is the use of ultrasonic waves for the thermal stimulation of materials. In this paper, both modelling and experimental results which illustrate the advantages and limitations of ultrasonic IR thermography in inspecting composite materials are presented. The ThermoSon computer program for computing 3D dynamic temperature distributions in anisotropic layered solids with subsurface defects subject to ultrasonic stimulation was used to optimise heating parameters for the detection of subsurface defects in composite materials. The program allows for the analysis of transient heat conduction and ultrasonic wave propagation phenomena in solids. The experiments at MIAT were carried out by means of a FLIR SC 7600 IR camera. Ultrasonic stimulation was performed at frequencies from 15 kHz to 30 kHz with maximum power up to 2 kW.
Keywords: composite material, ultrasonic, infrared thermography, non-destructive testing
Procedia PDF Downloads 295
253 Effective Supply Chain Coordination with Hybrid Demand Forecasting Techniques
Authors: Gurmail Singh
Abstract:
An effective supply chain is the main priority of every organization and is the outcome of strategic corporate investments with deliberate management action. A value-driven supply chain is defined through development, procurement, and the configuration of appropriate resources, metrics, and processes. However, the responsiveness of the supply chain can be improved by proper coordination. The Bullwhip effect (BWE) and Net stock amplification (NSAmp) values were therefore anticipated and used for the control of inventory in organizations by both a discrete wavelet transform-artificial neural network (DWT-ANN) and an adaptive network-based fuzzy inference system (ANFIS). This work presents a comparative methodology for forecasting customer demand, which is nonlinear in nature, for a multilevel supply chain structure, using hybrid techniques: artificial intelligence techniques, including artificial neural networks (ANN) and the adaptive network-based fuzzy inference system (ANFIS), and discrete wavelet theory (DWT). The effectiveness of these forecasting models is shown by computing the bullwhip effect and net stock amplification on data from real-world problems. The results showed that these parameters were comparatively lower for the discrete wavelet transform-artificial neural network (DWT-ANN) model than for the adaptive network-based fuzzy inference system (ANFIS).
Keywords: bullwhip effect, hybrid techniques, net stock amplification, supply chain flexibility
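Both coordination metrics are variance ratios against customer demand, as in this minimal sketch on synthetic series; the definitions shown are the common ones in the bullwhip literature rather than quotes from the paper.

```python
# Minimal sketch of the two metrics the study forecasts: bullwhip effect
# (order-variance amplification) and net-stock amplification, both as
# variance ratios against customer demand. The series are synthetic.
import numpy as np

def bullwhip(orders, demand):
    return np.var(orders) / np.var(demand)

def ns_amp(net_stock, demand):
    return np.var(net_stock) / np.var(demand)

rng = np.random.default_rng(1)
demand = 100 + rng.normal(0, 5, 200)
orders = 100 + rng.normal(0, 9, 200)     # upstream orders vary more
net_stock = np.cumsum(orders - demand)   # inventory position over time
print(round(bullwhip(orders, demand), 2), round(ns_amp(net_stock, demand), 2))
```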
Procedia PDF Downloads 127
252 Restoration of Digital Design Using Row and Column Major Parsing Technique from the Old/Used Jacquard Punched Cards
Authors: R. Kumaravelu, S. Poornima, Sunil Kumar Kashyap
Abstract:
The optimized and digitalized restoration of information from old and used manual jacquard punched cards in the textile industry is referred to as a Jacquard Punch Card (JPC) reader. In this paper, we present a novel design and development of a photoelectronics-based system for reading old and used punched cards and storing their binary information for transformation into an effective image file format. In our textile industry, jacquard punched card holes have diameters of 3 mm, 5 mm, and 5.5 mm pitch. Before the adoption of computing systems in the textile industry, those punched cards were prepared manually without a digital design source, yet they carry rich woven designs. The idea is to retrieve the binary information from the jacquard punched cards and store it in a digital (non-graphics) format before processing. After processing, the digital (non-graphics) format is converted into an effective image file format through either a row-major or a column-major parsing technique. To accomplish these activities, an embedded system based device and software integration was developed. As part of the test and trial activity, the device was tested and installed for industrial service at the Weavers Service Centre, Kanchipuram, Tamil Nadu, India.
Keywords: file system, SPI, UART, ARM controller, jacquard, punched card, photo LED, photo diode
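The row-major versus column-major parsing choice amounts to the order in which the hole matrix is flattened and reshaped, as this toy sketch shows; the matrix values are invented.

```python
# Illustrative sketch: the same punched-hole matrix read in two scan orders
# yields two flat bitstreams, either of which reshapes back into an image.
import numpy as np

holes = np.array([[1, 0, 1],
                  [0, 1, 0]], dtype=np.uint8)   # 1 = punched hole

row_major = holes.flatten(order="C")   # read the card row by row
col_major = holes.flatten(order="F")   # read the card column by column
print(row_major)  # [1 0 1 0 1 0]
print(col_major)  # [1 0 0 1 1 0]

# Reshape a bitstream back into a (height, width) design image, e.g. for PNG
design = row_major.reshape(2, 3) * 255
```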
Procedia PDF Downloads 167
251 Digital Design and Fabrication: A Review of Trend and Its Impact in the African Context
Authors: Mohamed Al Araby, Amany Salman, Mostafa Amin, Mohamed Madbully, Dalia Keraa, Mariam Ali, Marah Abdelfatah, Mariam Ahmed, Ahmed Hassab
Abstract:
In recent years, the architecture, engineering, and construction (A.E.C.) industry has been exposed to important innovations, most notably the global integration of digital design and fabrication (D.D.F.) processes into the industry's workflow. Despite this evolution in the sector, Africa has been excluded from examinations of this development. The reason behind this exclusion is the preconceived view of it as a developing region that still employs traditional methods of construction. The primary objective of this review is to investigate the trend of digital construction (D.C.) in the African environment and the difficulties in its regular utilization. This objective can be attained by recognizing the notion of digital construction in Africa and evaluating the impact of projects deploying this technology on both their immediate and broader contexts. The paper's methodology begins with the collection of data from 224 initiatives throughout Africa. Then, 50 of these projects were selected based on the criteria of project recency, typology variety, and location diversity. After that, a literature-based comparative analysis was undertaken. This study's findings reveal a pattern of motivations for applying digital fabrication processes. Moreover, it is essential to evaluate the socio-economic effects of these projects on the populations living near the analyzed subjects. The last step in this study is identifying the influence on neighboring nations.
Keywords: Africa, digital construction, digital design, fabrication
Procedia PDF Downloads 177
250 Mean Field Model Interaction for Computer and Communication Systems: Modeling and Analysis of Wireless Sensor Networks
Authors: Irina A. Gudkova, Yousra Demigha
Abstract:
Scientific research is moving more and more towards the study of complex systems in several areas of economics, biology, physics, and computer science. In this paper, we work on complex systems in communication networks, specifically Wireless Sensor Networks (WSN), which are considered stochastic systems composed of interacting entities. The current advancement of sensing in computing and communication systems is fertile ground for research in several tracks. A detailed presentation is made of WSNs: their use, modeling, the different problems that can occur in their application, and some solutions. The main goal of this work is to reintroduce the mean field method, since it is a powerful technique for solving this type of model, especially systems that evolve according to a continuous-time Markov chain (CTMC). We focus on the modeling of a CTMC and obtain a large system of interacting continuous-time Markov chains with population entities. The main idea is to work on one entity and replace the others with an average or effective interaction. In this context, to make the solution easier, we consider a wireless sensor network as a multi-body problem and reduce it to a one-body problem. The method is applied to a system of WSNs modeled as a Markovian queue, and the results of the technique are shown.
Keywords: continuous-time Markov chain, hidden Markov chain, mean field method, wireless sensor networks
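A sketch of the mean-field reduction for a population CTMC: the N-body system is replaced by the occupancy-measure ODE dx/dt = xQ, where x is the fraction of nodes in each state and Q the generator. The 3-state generator below (idle/sensing/sleep) is invented for illustration.

```python
# Hedged sketch of the mean-field (one-body) approximation for a population
# of sensor nodes evolving as a CTMC; the generator matrix is a toy example.
import numpy as np
from scipy.integrate import solve_ivp

Q = np.array([[-0.5, 0.4, 0.1],    # idle    -> sensing / sleep
              [ 0.3, -0.4, 0.1],   # sensing -> idle / sleep
              [ 0.2, 0.0, -0.2]])  # sleep   -> idle

def mean_field(t, x):
    return x @ Q   # drift of the fraction of nodes in each state

sol = solve_ivp(mean_field, (0, 50), [1.0, 0.0, 0.0])
print(np.round(sol.y[:, -1], 3))  # approaches the stationary distribution
```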
Procedia PDF Downloads 165
249 Predicting Daily Patient Hospital Visits Using Machine Learning
Authors: Shreya Goyal
Abstract:
The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility in a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources: if there is a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of using this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud, because the software can read data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure: keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers. This can help maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum; it stores the values of the local minima after each iteration and, at the end, compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better planning of personnel and medical resources.
Keywords: machine learning, SVM, HIPAA, data
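A minimal sketch of the described setup: an SVM regressor (SVR, since the target is a count) on calendar features over locally held data, with no cloud upload. The column names, synthetic data, and RBF kernel are assumptions, not the study's actual configuration.

```python
# Hedged sketch: SVR on calendar features for patient-arrival counts,
# trained entirely on local, synthetic data.
import numpy as np
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hour": rng.integers(0, 24, 500),
    "weekday": rng.integers(0, 7, 500),
    "month": rng.integers(1, 13, 500),
    "local_event": rng.integers(0, 2, 500),
})
df["patient_count"] = 20 + 5 * df["local_event"] + rng.poisson(5, 500)

X, y = df.drop(columns="patient_count"), df["patient_count"]
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)).fit(X, y)
print(model.predict(X.tail(1)))  # projected arrivals for one upcoming slot
```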
Procedia PDF Downloads 65
248 3D Steady and Transient Centrifugal Pump Flow within Ansys CFX and OpenFOAM
Authors: Clement Leroy, Guillaume Boitel
Abstract:
This paper presents a comparative benchmarking review of steady and transient three-dimensional (3D) flow computations in a centrifugal pump using commercial (Ansys CFX) and open source (OpenFOAM) computational fluid dynamics (CFD) software. In a centrifugal rotor-dynamic pump, the fluid enters the impeller along the rotating axis and is accelerated in order to increase the pressure, flowing radially outward into another stage, a vaned diffuser, or a volute casing, from where it finally exits into a downstream pipe. Simulations are carried out at the best efficiency point (BEP) and at part load, for single-phase flow with several turbulence models. The results are compared with an overall performance report from experimental data. The use of CFD technology in industry is still limited by the high computational costs, and even more by the high cost of commercial CFD software and high-performance computing (HPC) licenses. The main objectives of the present study are to define an OpenFOAM methodology for high-quality 3D steady and transient turbomachinery CFD simulation in order to conduct a thorough time-accurate performance analysis. In addition, detailed comparisons between computational methods, including features of the latest Ansys release 18 and OpenFOAM, are investigated to assess the accuracy and industrial applicability of those solvers. Finally, an automated connected workflow (IoT) for turbine blade applications is presented.
Keywords: benchmarking, CFX, internet of things, openFOAM, time-accurate, turbomachinery
Procedia PDF Downloads 204
247 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses
Authors: Matthew Baucum
Abstract:
With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analysis, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g., "social", "reward") across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their "proximity" to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel's proximity to a given term of interest (e.g., "vision", "decision making") or collection of terms (e.g., "theory of mind", "social", "agent"), as measured by the cosine similarity between the voxel's vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word-cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations and relies on state-of-the-art "open vocabulary" methods that go beyond mere word counts. An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method's utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
Keywords: fMRI, machine learning, meta-analysis, text analysis
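The core computation reduces to a normalized vector sum and a cosine similarity, as in this sketch; the word vectors here are random stand-ins for the semantic space actually learned from study texts.

```python
# Sketch: a voxel's vector is the normalized sum of the word vectors of the
# studies activating it; its association with a term is cosine similarity.
import numpy as np

rng = np.random.default_rng(2)
word_vecs = {w: rng.normal(size=50) for w in ["social", "reward", "vision"]}

def voxel_vector(words_from_activating_studies):
    v = np.sum([word_vecs[w] for w in words_from_activating_studies], axis=0)
    return v / np.linalg.norm(v)

def proximity(voxel_vec, term):
    t = word_vecs[term]
    return float(voxel_vec @ t / np.linalg.norm(t))  # cosine similarity

vox = voxel_vector(["social", "social", "reward"])
print(round(proximity(vox, "social"), 3))  # high for socially loaded voxels
```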
Procedia PDF Downloads 448
246 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers
Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang
Abstract:
Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the major way to find PM disease, which is not only labor-intensive but also makes it almost impossible to monitor disease severity. To reduce the losses caused by PM disease and achieve faster automatic detection, this paper proposes an approach for detecting the disease based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing and is composed of five main steps: image acquisition, pre-processing, segmentation, feature extraction, and classification. Two strawberry fields were used in this study, providing images of healthy leaves and of leaves infected with PM (Sphaerotheca macularis) under artificial cloud lighting conditions. Colour thresholding was utilized to segment all images before textural analysis. The colour co-occurrence matrix (CCM) was introduced for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from the CCMs of National Television System Committee (NTSC) luminance and hue, saturation, and intensity (HSI) images. The normalized feature data were utilized for training and validation, respectively, using the developed classifiers. The classifiers were experimented with internal, external, and cross-validations, and the best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier showed 98.33%, 85.33%, 87.33%, 93.33%, and 95.0% accuracy on internal, external-I, external-II, 4-fold cross, and 5-fold cross-validation, respectively, whereas the kNN results represented 90.0%, 72.00%, 74.66%, 89.33%, and 90.3% classification accuracy, respectively. The outcome of this study demonstrated that SVMs classified PM disease with the highest overall accuracy of 91.86% and 1.1211 seconds of processing time. Therefore, the overall results concluded that the proposed study can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors
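A hedged sketch of the co-occurrence texture step using scikit-image's grey-level co-occurrence matrix on one channel; the channel choice, distances, angles, and quantization levels are assumptions, and only four of the forty features are shown.

```python
# Sketch: build a co-occurrence matrix on one quantized HSI/NTSC channel and
# read out a few Haralick-style statistics for the SVM/kNN feature vector.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

channel = np.random.default_rng(3).integers(0, 8, size=(64, 64)).astype(np.uint8)
glcm = graycomatrix(channel, distances=[1], angles=[0], levels=8,
                    symmetric=True, normed=True)

feats = {p: float(graycoprops(glcm, p)[0, 0])
         for p in ("contrast", "homogeneity", "energy", "correlation")}
print(feats)  # a few entries of the texture feature vector per image channel
```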
Procedia PDF Downloads 120
245 Blended Intensive Programmes: A Way Forward to Promote Internationalization in Higher Education
Authors: Sonja Gögele, Petra Kletzenbauer
Abstract:
International strategies are ranked as one of the core activities in the development plans of Austrian universities. This has led to numerous promising activities in terms of internationalization (i.e., the development of international degree programmes, increased staff and student mobility, and blended international projects). The latest innovative approach within Erasmus+ is the so-called Blended Intensive Programme (BIP), which combines jointly delivered teaching and learning elements of at least three participating ERASMUS universities in a virtual and short-term mobility setup. Students who participate in a BIP can maintain their study plans at their home institution and include the BIP as a parallel activity. This paper presents the experiences of such a programme on the topic of sustainable computing, hosted by the University of Applied Sciences FH JOANNEUM. By means of an online survey and face-to-face interviews with all stakeholders (20 students, 8 professors), the empirical study addresses the challenges of hosting an international blended learning programme (i.e., a virtual phase and an on-site intensive phase) and discusses the impact of such activities in terms of internationalization and Englishization. In this context, key roles are assigned to the development of future transnational and transdisciplinary curricula that consider innovative aspects of learning and teaching (i.e., virtual collaboration and research-based learning).
Keywords: internationalization, englishization, short-term mobility, international teaching and learning
Procedia PDF Downloads 120
244 Smart Construction Sites in KSA: Challenges and Prospects
Authors: Ahmad Mohammad Sharqi, Mohamed Hechmi El Ouni, Saleh Alsulamy
Abstract:
Due to the worldwide revolution in emerging technologies, the need to exploit and employ innovative technologies for other functions and purposes in different aspects has become a remarkable matter. Saudi Arabia is considered one of the most powerful economies in the world, where the construction sector participates effectively in its economy. Thus, the construction sector in KSA should keep pace with the rapid digital revolution and transformation and implement smart devices on sites. A Smart Construction Site (SCS) includes smart devices, artificial intelligence, the internet of things, augmented reality, building information modeling, geographical information systems, and cloud information. This paper aims to study the level of implementation of SCS in KSA, analyze the obstacles and challenges of adopting SCS, and find critical success factors for its implementation. A survey of close-ended questions (scale and multiple-choice) was conducted among professionals in the construction sector of Saudi Arabia. A total of twenty-nine questions was prepared for respondents. Twenty-four scale questions were established, categorized into several themes: quality, scheduling, cost, occupational safety and health, technologies and applications, and general perception; consequently, the 5-point Likert scale (very low to very high) was adopted for this survey. In addition, five close-ended multiple-choice questions were also prepared; these questions were derived from a previous study implemented in the United Kingdom (UK) and the Dominican Republic (DR) and were rearranged and organized to fit the structured survey, in order to compare the Kingdom of Saudi Arabia with the UK and the DR. A total of one hundred respondents participated in this survey from all regions of the Kingdom of Saudi Arabia: southern, central, western, eastern, and northern. The drivers, obstacles, and success factors for implementing smart devices and technologies in KSA's construction sector were investigated and analyzed. Moreover, it was concluded that KSA is on the right path toward adopting smart construction sites, with attractive results comparable to, and in some factors even better than, the UK.
Keywords: artificial intelligence, construction projects management, internet of things, smart construction sites, smart devices
Procedia PDF Downloads 155
243 Ontology Expansion via Synthetic Dataset Generation and Transformer-Based Concept Extraction
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches to extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
Keywords: ontology expansion, synthetic dataset, transformer fine-tuning, concept extraction, DOLCE, BERT, taxonomy, LLM, NER
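The entity-extraction step can be sketched with a Hugging Face token-classification pipeline; the checkpoint name below is a hypothetical placeholder standing in for the fine-tuned DeBERTa-v3-large model the paper describes, and the example text is invented.

```python
# Hedged sketch of transformer-based entity extraction for ITSM text using
# the Hugging Face pipeline API; the model checkpoint is hypothetical.
from transformers import pipeline

ner = pipeline("token-classification",
               model="your-org/deberta-v3-large-itsm-ner",  # placeholder
               aggregation_strategy="simple")

text = "Incident INC0042: the Exchange cluster in DC-2 failed over at 03:14."
for ent in ner(text):
    print(ent["entity_group"], ent["word"], round(ent["score"], 2))
```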
Procedia PDF Downloads 14