Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28307

27677 Big Brain: A Single Database System for a Federated Data Warehouse Architecture

Authors: X. Gumara Rigol, I. Martínez de Apellaniz Anzuola, A. Garcia Serrano, A. Franzi Cros, O. Vidal Calbet, A. Al Maruf

Abstract:

Traditional federated architectures for data warehousing work well when corporations have existing regional data warehouses and there is a need to aggregate data at a global level. Schibsted Media Group has been maturing from a decentralised organisation into a more globalised one and needed to build regional data warehouses for some brands at the same time as the global one. In this paper, we present the architectural alternatives studied and why a custom federated approach was the recommendation taken forward to implementation. Although the data warehouses are logically federated, the implementation uses a single database system, which brought several advantages: reduced cost, improved data access for global users, a common data model for detailed analysis across different geographies, and a flexible layer for local-specific needs in the same place.

Keywords: data integration, data warehousing, federated architecture, Online Analytical Processing (OLAP)

Procedia PDF Downloads 231
27676 Flow Field Analysis of a Liquid Ejector Pump Using Embedded Large Eddy Simulation Methodology

Authors: Qasim Zaheer, Jehanzeb Masud

Abstract:

The understanding of the entrainment and mixing phenomena in an ejector pump is of pivotal importance for design and performance estimation. In this paper, the turbulent vortical structures arising from the Kelvin-Helmholtz instability at the free surface between the motive and entrained fluid streams are simulated using the Embedded LES methodology. The efficacy of Embedded LES for simulating the complex flow field of an ejector pump is evaluated using ANSYS Fluent®. The enhanced mixing and entrainment process due to the breakdown of larger eddies into smaller ones, as a consequence of the vortex stretching phenomenon, is captured in this study. Moreover, flow field characteristics of the ejector pump, such as the pressure and velocity fields and mass flow rates, are analyzed and validated against experimental results.

Keywords: Kelvin Helmholtz instability, embedded LES, complex flow field, ejector pump

Procedia PDF Downloads 291
27675 Urban Planning Compilation Problems in China and the Corresponding Optimization Ideas under the Vision of the Hyper-Cycle Theory

Authors: Hong Dongchen, Chen Qiuxiao, Wu Shuang

Abstract:

Systematic science reveals the complex nonlinear mechanisms of behaviour in urban systems. In China, however, most city planners still apply simple linear thinking when dealing with this open, complex giant system. This paper introduces the hyper-cycle theory, one of the basic theories of systematic science. Based on an analysis of why current urban planning fails, it proposes that urban planning compilation should change from controlling quantities to managing relationships, from blueprint planning to progressive planning based on nonlinear characteristics, and from management control to dynamic monitoring and feedback.

Keywords: systematic science, hyper-cycle theory, urban planning, urban management

Procedia PDF Downloads 398
27674 Feedback Matrix Approach for Relativistic Runaway Electron Avalanches Dynamics in Complex Electric Field Structures

Authors: Egor Stadnichuk

Abstract:

Relativistic runaway electron avalanches (RREA) are a widely accepted source of thunderstorm gamma-radiation. In regions with huge electric field strength, RREA can multiply via relativistic feedback. The relativistic feedback is caused both by positron production and by runaway electron bremsstrahlung gamma-rays reversal. In complex multilayer thunderstorm electric field structures, an additional reactor feedback mechanism appears due to gamma-ray exchange between separate strong electric field regions with different electric field directions. The study of this reactor mechanism in conjunction with the relativistic feedback with Monte Carlo simulations or by direct solution of the kinetic Boltzmann equation requires a significant amount of computational time. In this work, a theoretical approach to study feedback mechanisms in RREA physics is developed. It is based on the matrix of feedback operators construction. With the feedback matrix, the problem of the dynamics of avalanches in complex electric structures is reduced to the problem of finding eigenvectors and eigenvalues. A method of matrix elements calculation is proposed. The proposed concept was used to study the dynamics of RREAs in multilayer thunderclouds.
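
To illustrate the eigenvalue criterion behind the feedback-matrix idea, the minimal sketch below builds a small matrix of feedback coefficients between two strong-field regions and checks whether its dominant eigenvalue exceeds one, which would correspond to a self-sustaining avalanche. The 2x2 structure and all numbers are hypothetical, not values from the paper.

```python
import numpy as np

# Hypothetical feedback matrix for two strong-field regions.
# Entry F[i, j] is the assumed mean number of secondary avalanches seeded in
# region i per avalanche in region j, combining local relativistic feedback
# (diagonal) and reactor-type gamma-ray exchange between regions (off-diagonal).
F = np.array([
    [0.30, 0.45],
    [0.50, 0.25],
])

eigvals, eigvecs = np.linalg.eig(F)
k = np.max(np.abs(eigvals))  # dominant multiplication factor per feedback generation

print(f"dominant eigenvalue k = {k:.3f}")
if k >= 1.0:
    print("k >= 1: feedback is self-sustaining, avalanche number grows without bound")
else:
    print(f"k < 1: avalanches decay; total multiplication ~ {1.0 / (1.0 - k):.2f}")
```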

Keywords: terrestrial Gamma-ray flashes, thunderstorm ground enhancement, relativistic runaway electron avalanches, gamma-rays, high-energy atmospheric physics, TGF, TGE, thunderstorm, relativistic feedback, reactor feedback, reactor model

Procedia PDF Downloads 163
27673 Synthetic Access to Complex Metal Carbonates and Hydroxycarbonates via Sol-Gel Chemistry

Authors: Schirin Hanf, Carlos Lizandara-Pueyo, Timmo P. Emmert, Ivana Jevtovikj, Roger Gläser, Stephan A. Schunk

Abstract:

Metal alkoxides are very versatile precursors for a broad array of complex functional materials. However, metal alkoxides, especially transition metal alkoxides, tend to form oligomeric structures due to the very strong M–O–M binding motif. This fact hinders their facile application in sol-gel processes and complicates access to complex carbonate or oxidic compounds after hydrolysis of the precursors. Therefore, the development of a synthetic alternative that grants access to carbonates and hydroxycarbonates from simple metal alkoxide precursors via hydrolysis is key to this project. Our approach involves the reaction of metal alkoxides with unsaturated isoelectronic molecules, such as carbon dioxide. Subsequently, a stoichiometric insertion of the CO₂ into the alkoxide M–O bond takes place and leads to the formation of soluble metal alkyl carbonates. This strategy is a very elegant approach to solubilizing metal alkoxide precursors to make them accessible for sol-gel chemistry. After hydrolysis of the metal alkyl carbonates, crystalline metal carbonates and hydroxycarbonates can be obtained, which were then utilized for the synthesis of Cu/Zn-based bulk catalysts for methanol synthesis. Using these catalysts, a catalytic activity comparable to commercially available MeOH catalysts could be reached. Based on these results, a complement to the traditional precipitation techniques usually employed for the synthesis of bulk methanol catalysts has been found in this alternative solubilization strategy.

Keywords: metal alkoxides, metal carbonates, metal hydroxycarbonates, CO₂ insertion, solubilization

Procedia PDF Downloads 182
27672 Management of Medical Equipment Maintenance

Authors: Gholamreza Madad

Abstract:

The role of medical equipment in modern advanced hospitals is irrefutable. Despite limited financial resources, developing countries have purchased complex and expensive equipment in an uncontrolled manner, without providing the maintenance needed to protect these large capital investments. In our country, the limited studies available indicate that irregularities exist in the management of medical equipment maintenance. Research method: This cross-sectional study used a questionnaire to collect data in 10 hospitals. After distributing and collecting the questionnaires in person, the collected data were analyzed using descriptive statistics and SPSS software. Research findings: According to the results obtained for the four dimensions of medical equipment maintenance management, only maintenance planning was at a moderate level; the other components, with scores below 50%, were at a low level. There was a direct relationship between the total maintenance management score and the scores for guidance and coordination of medical equipment maintenance, as well as the age of hospital managers. Discussion and conclusion: In sum, problems such as a lack of skilled staff in hospital medical engineering departments, a lack of funds, and the unawareness of medical engineering unit managers of their duties have left medical equipment maintenance in poor condition (near average). The inexperience of the unit managers has also contributed to this problem.

Keywords: equipment, maintenance, medical equipment, hospitals

Procedia PDF Downloads 158
27671 A Review Paper on Data Mining and Genetic Algorithm

Authors: Sikander Singh Cheema, Jasmeen Kaur

Abstract:

In this paper, the concept of data mining is summarized, together with one of its important processes, knowledge discovery in databases (KDD). Data mining based on the genetic algorithm is reviewed, and ways to implement genetic-algorithm-based data mining are surveyed. The paper also reviews data mining tasks and applications of the genetic algorithm in various fields.
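
As a deliberately simple illustration of genetic-algorithm-based data mining, the sketch below uses a small genetic algorithm for feature selection on a synthetic classification dataset. The fitness function, population size and operators are generic textbook choices, not taken from the reviewed literature.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

def fitness(mask):
    # Fitness = cross-validated accuracy of a classifier on the selected features.
    if mask.sum() == 0:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]))          # random initial population of feature masks
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]               # selection: keep the best half
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, X.shape[1])                  # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.05               # mutation
        child[flip] = 1 - child[flip]
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```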

Keywords: data mining, KDD, genetic algorithm, descriptive mining, predictive mining

Procedia PDF Downloads 586
27670 Time Dependent Biodistribution Modeling of 177Lu-DOTATOC Using Compartmental Analysis

Authors: M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani, S. Zolghadri

Abstract:

In this study, 177Lu-DOTATOC was prepared under optimized conditions (radiochemical purity: > 99%, radionuclidic purity: > 99%). The percentage of injected dose per gram (%ID/g) was calculated for organs up to 168 h post injection. A compartmental model was applied to mathematically describe the behaviour of the drug in tissue at different times. The biodistribution data showed significant excretion of the radioactivity through the kidneys. The adrenal and pancreas, as major expression sites for the somatostatin receptor (SSTR), had significant uptake. A pharmacokinetic model of 177Lu-DOTATOC obtained by compartmental analysis is presented, which describes the behavior of the complex.
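
To illustrate the kind of compartmental description used here, the sketch below integrates a simple two-compartment model (blood and a receptor-rich tissue such as adrenal/pancreas, with renal excretion from blood) over 168 h. The rate constants are entirely hypothetical and are not the fitted values of the study.

```python
import numpy as np
from scipy.integrate import odeint

# Hypothetical first-order rate constants (1/h); not the fitted values from the study.
k_blood_to_tissue = 0.15   # uptake into SSTR-expressing tissue
k_tissue_to_blood = 0.02   # washout back to blood
k_renal = 0.25             # excretion from blood via the kidneys

def model(y, t):
    blood, tissue = y
    d_blood = -(k_blood_to_tissue + k_renal) * blood + k_tissue_to_blood * tissue
    d_tissue = k_blood_to_tissue * blood - k_tissue_to_blood * tissue
    return [d_blood, d_tissue]

t = np.linspace(0, 168, 200)                 # hours post injection
sol = odeint(model, y0=[100.0, 0.0], t=t)    # start with 100 %ID in blood

for hour in (1, 4, 24, 48, 168):
    i = np.argmin(np.abs(t - hour))
    print(f"t = {hour:3d} h: blood = {sol[i, 0]:5.1f} %ID, tissue = {sol[i, 1]:5.1f} %ID")
```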

Keywords: biodistribution, compartmental modeling, ¹⁷⁷Lu, Octreotide

Procedia PDF Downloads 215
27669 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring

Authors: Seung-Lock Seo

Abstract:

This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data from good operations are relatively easy to gather, but during unusual events or faults it is generally difficult to collect process information, and the noisy data of industrial processes are almost impossible to analyze directly. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data helps eliminate unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated for discrete batch process data. The results showed that monitoring performance improved significantly in terms of the monitoring success rate for the given process faults.
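
A minimal sketch of the kind of empirical monitoring scheme described here (noise filtering, preprocessing, then a statistical monitor) is shown below; it applies a moving-average filter and a PCA-based Hotelling T² chart to simulated batch data. These are generic stand-ins, not the exact methods of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Simulated process data: 200 "good" samples followed by 20 faulty samples with a mean shift.
normal = rng.normal(0, 1, size=(200, 5))
fault = rng.normal(0, 1, size=(20, 5)) + np.array([2.5, 0.0, 0.0, 1.5, 0.0])
raw = np.vstack([normal, fault])

def moving_average(x, window=5):
    # Simple column-wise noise filter.
    kernel = np.ones(window) / window
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, x)

filtered = moving_average(raw)
mean, std = filtered[:200].mean(0), filtered[:200].std(0)
scaled = (filtered - mean) / std              # preprocessing: remove unwanted variation

pca = PCA(n_components=2).fit(scaled[:200])   # monitoring model built on good operation only
scores = pca.transform(scaled)
t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)   # Hotelling T^2 statistic

train_t2 = np.sum(pca.transform(scaled[:200])**2 / pca.explained_variance_, axis=1)
limit = np.percentile(train_t2, 99)           # empirical control limit
detected = np.flatnonzero(t2[200:] > limit)
print(f"control limit = {limit:.2f}, fault samples flagged: {len(detected)} / 20")
```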

Keywords: data mining, process data, monitoring, safety, industrial processes

Procedia PDF Downloads 393
27668 New Active Dioxin Response Element Sites in Regulatory Region of Human and Viral Genes

Authors: Ilya B. Tsyrlov, Dmitry Y. Oshchepkov

Abstract:

A computational search for dioxin response elements (DREs) in genes of proteins comprising the Ah receptor (AhR) cytosolic core complex was performed with the highly efficient tool SITECON. The following numbers of new DREs in the 5’ flanking region were detected by SITECON: one in the AHR gene, five in XAP2, eight in HSP90AA1, and three in HSP90AB1. The numerous DREs found in genes of AhR and of the AhR cytosolic complex members shed light on potential mechanisms of expression, the stoichiometry of the unliganded AhR core complex, and its degradation vs biosynthesis dynamics resulting from treatment of target cells with the most potent AhR ligand, 2,3,7,8-TCDD. With human viruses, the reduced susceptibility to TCDD of the gene encoding HIV-1 P24 was justified by the only potential DRE determined in the gag gene encoding the HIV-1 P24 protein, whereas the regulatory region of the CMV gene encoding IE gp/UL37 has five potential DREs, 1.65 kb/UL36 has six, pp65 and pp71 each have seven, and pp150 has ten. Also, six to eight DREs were determined with SITECON in the regulatory regions of the HSV-1 IE genes encoding the tegument proteins UL36 and UL37, and of the UL19 gene encoding the binding glycoprotein C (gC). Thus, TCDD in the low picomolar range may activate the AhR:Arnt transcription pathway in human cells, triggering CMV and HSV-1 reactivation by binding to numerous promoter DREs within the immediate-early (IE) genes UL37 and UL36 and thereby committing the virus to the lytic cycle.
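
SITECON itself is a specialised recognition tool; as a much simpler illustration of scanning a regulatory region for candidate DREs, the sketch below searches a made-up promoter fragment for the invariant DRE core 5'-GCGTG-3' and its reverse complement. The sequence is hypothetical and only the core motif (a well-established AhR binding element) is assumed.

```python
import re

# Hypothetical promoter fragment (not a real UL36/UL37 sequence).
promoter = "ATGCTAGCGTGCTTACCACGCAGTTTGCGTGATATCACGCGTGGT"

CORE = "GCGTG"      # invariant core of the dioxin response element
REVCOMP = "CACGC"   # reverse complement of the core (minus-strand hit)

hits = [(m.start(), m.group(), "+") for m in re.finditer(CORE, promoter)]
hits += [(m.start(), m.group(), "-") for m in re.finditer(REVCOMP, promoter)]

for pos, motif, strand in sorted(hits):
    print(f"candidate DRE core at position {pos:3d} ({strand} strand): {motif}")
```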

Keywords: dioxin response elements, Ah receptor, AhR: Arnt transcription pathway, human and viral genes

Procedia PDF Downloads 102
27667 Geographic Information System Cloud for Sustainable Digital Water Management: A Case Study

Authors: Mohamed H. Khalil

Abstract:

Water is one of the most crucial elements influencing human lives and development. Notably, over the last few years, GIS has played a significant role in optimizing water management systems, especially after the exponential development of this sector. In this context, the Egyptian government initiated an advanced ‘GIS-Web Based System’. This system is designed to tangibly assist and optimize the complementarity and integration of data between the Call Center, Operation and Maintenance, and Laboratory departments. The core of this system is a unified ‘Data Model’ for all the spatial and tabular data of the corresponding departments. The system is professionally built to provide advanced functionalities such as interactive data collection, dynamic monitoring, multi-user editing capabilities, enhanced data retrieval, integrated work-flow, different access levels, and correlated information record/tracking. Notably, this cost-effective system contributes significantly not only to the completeness of the base-map (93%) and of the water network (87%) in a highly detailed GIS format and to enhanced customer service performance, but also to reducing day-to-day operating costs (~5-10%). In addition, the proposed system facilitates data exchange between the different departments (Call Center, Operation and Maintenance, and Laboratory), which allows better understanding and analysis of complex situations. Furthermore, this system reflected tangibly on: (i) dynamic environmental monitoring of water quality indicators (ammonia, turbidity, TDS, sulfate, iron, pH, etc.), (ii) improved effectiveness of the different water departments, (iii) efficient in-depth analysis, (iv) advanced web-reporting tools (daily, weekly, monthly, quarterly, and annually), (v) tangible planning synthesizing spatial and tabular data; and finally, (vi) a scalable decision support system. It is worth highlighting that the proposed second phase of this system will extend its scalability to include integration with the Billing and SCADA departments. This scalability will add advanced functionalities to the existing ones to allow further sustainable contributions.

Keywords: GIS Web-Based, base-map, water network, decision support system

Procedia PDF Downloads 85
27666 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms

Authors: Sagri Sharma

Abstract:

Analysis of diseases integrating multiple factors increases the complexity of the problem, and therefore the development of frameworks for the analysis of diseases is currently a topic of intense research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective. Consequently, newer methodologies are being sought to deal with the problem. Supervised learning algorithms are commonly used for performing predictions on previously unseen data. These algorithms are applied in fields ranging from image analysis to protein structure and function prediction, and they are trained on a known dataset to produce a predictor model that generates reasonable predictions for new data. Gene expression profiles generated by DNA analysis experiments can be quite complex, since these experiments can involve hypotheses spanning entire genomes. The well-known machine learning algorithm Support Vector Machine (SVM) is therefore applied to analyze the expression levels of thousands of genes simultaneously in a timely, automated and cost-effective way. The objectives of the presented work are to develop a methodology to identify genes relevant to Hepatocellular Carcinoma (HCC) from gene expression datasets using supervised learning algorithms and statistical evaluation, and to develop a predictive framework that can perform classification tasks on new, unseen data.
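
A minimal sketch of the SVM classification step, using a synthetic stand-in for a gene expression matrix (samples x genes); the data, kernel and train/test split are illustrative assumptions, not the study's dataset or settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# Synthetic stand-in for expression levels of 2000 genes in 100 samples
# (e.g. tumour vs non-tumour); only a handful of genes are informative.
X, y = make_classification(n_samples=100, n_features=2000, n_informative=20,
                           n_redundant=50, random_state=42)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    stratify=y, random_state=42)

model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))

# Genes with the largest absolute weights in a linear SVM are candidate markers.
weights = np.abs(model.named_steps["svc"].coef_).ravel()
print("top candidate genes (indices):", np.argsort(weights)[::-1][:10])
```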

Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine

Procedia PDF Downloads 423
27665 Competition Between the Effects of Pesticides and Immune-activation on the Expression of Toll Pathway Genes

Authors: Dani Sukkar, Ali Kanso, Philippe Laval-Gilly, Jairo Falla-Angel

Abstract:

The honeybee immune system is challenged by different risk factors that induce various responses. However, complex scenarios in which bees are exposed to different pesticides simultaneously with immune activation are not well evaluated. The Toll pathway is one of the main signaling pathways studied in invertebrate immune responses, and it is a good indicator of the effect of such complex interactions, alongside key signaling elements of other pathways such as Relish of the immune deficiency (IMD) pathway, Eater, the phagocytosis receptor, and vitellogenin levels. Honeybee hemocytes extracted from 5th instar larvae were exposed to imidacloprid and/or amitraz, with or without zymosan A as an immune activator. The expression of multiple immune-related genes, including spaetzle, Toll, myD88, relish, eater and vitellogenin, was studied by real-time polymerase chain reaction after RNA extraction. The results demonstrated that the Toll pathway is mainly affected by the pesticides imidacloprid and amitraz, especially by their different combinations. Furthermore, immune activation by zymosan A, a fungal cell-wall component, mitigates to some extent the effect of the pesticides on the different levels of the Toll pathway. In addition, imidacloprid, amitraz, and zymosan A have complex and context-specific interactions depending on the level of immune activation and the pathway evaluated, affecting immune-gene expression differently.
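
The abstract does not state the quantification method, but relative expression from real-time PCR data is commonly computed with the 2^-ΔΔCt approach; the sketch below shows that calculation on made-up Ct values for one target gene and one reference gene, purely as an illustration.

```python
import numpy as np

# Hypothetical Ct values (cycles) from qPCR triplicates; smaller Ct means more transcript.
ct_target_treated = np.array([24.1, 24.3, 23.9])   # e.g. spaetzle, pesticide-exposed hemocytes
ct_ref_treated    = np.array([18.0, 18.2, 17.9])   # reference/housekeeping gene
ct_target_control = np.array([22.5, 22.7, 22.4])
ct_ref_control    = np.array([18.1, 18.0, 18.2])

delta_ct_treated = ct_target_treated.mean() - ct_ref_treated.mean()
delta_ct_control = ct_target_control.mean() - ct_ref_control.mean()
delta_delta_ct = delta_ct_treated - delta_ct_control

fold_change = 2.0 ** (-delta_delta_ct)   # relative expression vs control
print(f"ddCt = {delta_delta_ct:.2f}, relative expression = {fold_change:.2f}-fold vs control")
```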

Keywords: toll pathway, immune modulation, β-glucan, imidacloprid, amitraz, honeybees, immune genes

Procedia PDF Downloads 76
27664 Impact of Applying Bag House Filter Technology in Cement Industry on Ambient Air Quality - Case Study: Alexandria Cement Company

Authors: Haggag H. Mohamed, Ghatass F. Zekry, Shalaby A. Elsayed

Abstract:

Most sources of air pollution in Egypt are of anthropogenic origin. Alexandria Governorate is located in the north of Egypt. The main sectors contributing to air pollution in Alexandria are industry, transportation and area sources due to human activities. Alexandria hosts more than 40% of the industrial activities in Egypt, and cement manufacture contributes a significant amount to the particulate pollution load. The surroundings of Alexandria Portland Cement Company (APCC) were selected as the study area. Continuous monitoring data of Total Suspended Particulate (TSP) from the APCC main kiln stack were collected to assess the dust emission control technology. An electrostatic precipitator (ESP) had been installed on the cement kiln since 2002. The TSP data collected for the first quarter of 2012 were compared with those for the first quarter of 2013, after installation of the new bag house filter. In the present study, based on these monitoring data and meteorological data, a detailed air dispersion modeling investigation was carried out using the Industrial Source Complex Short Term model (ISC3-ST) to determine the impact of applying the new bag house filter control technology on the neighborhood's ambient air quality. The model results show a drastic reduction of the ambient TSP hourly average concentration from 44.94 μg/m³ to 5.78 μg/m³, which confirms the strongly positive impact of applying bag house filter technology to the APCC cement kiln on ambient air quality.
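
The ISC3-ST model adds many refinements (stack-tip downwash, terrain adjustments, averaging times), but its core is the Gaussian plume equation; the sketch below evaluates the ground-level centreline concentration for a single stack with entirely hypothetical emission and dispersion parameters, only to illustrate the form of the calculation.

```python
import numpy as np

def ground_level_centreline(Q, u, H, sigma_y, sigma_z):
    """Gaussian plume ground-level centreline concentration (g/m^3).

    Q: emission rate (g/s), u: wind speed at stack height (m/s),
    H: effective stack height (m), sigma_y/sigma_z: dispersion parameters (m).
    """
    return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2.0 * sigma_z**2))

# Hypothetical values, not the APCC stack parameters.
Q = 5.0                          # g/s of TSP after control
u = 4.0                          # m/s wind speed
H = 80.0                         # m effective release height
sigma_y, sigma_z = 120.0, 60.0   # dispersion coefficients at some downwind distance

c = ground_level_centreline(Q, u, H, sigma_y, sigma_z)
print(f"centreline ground-level concentration = {c * 1e6:.2f} ug/m^3")
```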

Keywords: air pollution modeling, ambient air quality, baghouse filter, cement industry

Procedia PDF Downloads 265
27663 Evaluating the Validity of CFD Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements

Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck

Abstract:

This research presents the validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from the previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The numerical model showed maximum average percent errors of approximately 53% and 37% for winds incident from the North and Northwest, respectively, while good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and the grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the geometric mean bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
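
The validation metrics named here have standard definitions in the air-quality model evaluation literature; the sketch below implements fractional bias, geometric mean bias and normalized mean square error for paired observed/predicted concentrations. The sample numbers are invented, not data from the study.

```python
import numpy as np

def fractional_bias(obs, pred):
    return (obs.mean() - pred.mean()) / (0.5 * (obs.mean() + pred.mean()))

def geometric_mean_bias(obs, pred):
    # Requires strictly positive concentrations.
    return np.exp(np.mean(np.log(obs)) - np.mean(np.log(pred)))

def nmse(obs, pred):
    return np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())

# Invented paired samples of observed and CFD-predicted concentrations.
obs = np.array([1.2, 0.8, 2.5, 1.9, 0.4, 3.1])
pred = np.array([1.0, 0.9, 2.1, 2.3, 0.5, 2.6])

print(f"FB   = {fractional_bias(obs, pred):+.3f}   (0 is perfect; |FB| < 0.3 is often taken as acceptable)")
print(f"MG   = {geometric_mean_bias(obs, pred):.3f}   (1 is perfect)")
print(f"NMSE = {nmse(obs, pred):.3f}   (0 is perfect)")
```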

Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow

Procedia PDF Downloads 129
27662 Patterns of Gear Substitution in Norwegian Trawl Fishery

Authors: Tannaz Alizadeh Ashrafi

Abstract:

Seasonal variability in biological and ecological factors, together with relevant socio-economic determinants, affects the choice of fishing gear, the frequency of its usage and decisions about gear conversion in a multi-species situation. To deal with the complex dynamics of fisheries, fishers constantly have to make decisions about how long to fish, when to go fishing, what species to target, and which gear to deploy. In this regard, the purpose of this study is to examine the dynamics of gear/species combinations in the Norwegian fishery. A comprehensive vessel-level dataset for the main economically important species, including cod, haddock, saithe, shrimp and mixed catch, was obtained from the Norwegian Directorate of Fisheries, covering daily data for 2010. The study further analyzes the level of flexibility and rationality of the fishers operating in the trawl fishery. The results show a discrepancy between the trawl fishers' intention to maximize the profitability of each fishing trip and their actual harvesting behavior. The discussion is based on so-called maximizing behavior.

Keywords: trawl fishery, gear substitution, rationality, profit maximizing behavior

Procedia PDF Downloads 271
27661 Applying Critical Realism to Qualitative Social Work Research: A Critical Realist Approach for Social Work Thematic Analysis Method

Authors: Lynne Soon-Chean Park

Abstract:

Critical Realism (CR) has emerged as an alternative to both the positivist and constructivist perspectives that have long dominated social work research. By unpacking the epistemic weaknesses of these two dogmatic perspectives, CR provides a useful philosophical approach that incorporates both ontological objectivist and subjectivist stances. The CR perspective suggests an alternative approach for social work researchers who have long sought to engage with the complex interplay between perceived reality at the empirical level and the objective reality that lies behind empirical events as a causal mechanism. However, despite the usefulness of CR in informing social work research, little practical guidance is available about how CR can inform methodological considerations in social work research studies. This presentation aims to provide a detailed description of CR-informed thematic analysis by drawing examples from doctoral social work research on Korean migrants' experiences and understanding of trust associated with their settlement experience in New Zealand. Because of its theoretical flexibility and accessibility as a qualitative analysis method, thematic analysis can be applied as a method that works both to search for demi-regularities in the collected data and to identify the causal mechanisms that lie behind the empirical data. In so doing, this presentation seeks to provide a concrete and detailed exemplar for social work researchers wishing to employ CR in their qualitative thematic analysis process.

Keywords: critical Realism, data analysis, epistemology, research methodology, social work research, thematic analysis

Procedia PDF Downloads 208
27660 Classification of Generative Adversarial Network Generated Multivariate Time Series Data Featuring Transformer-Based Deep Learning Architecture

Authors: Thrivikraman Aswathi, S. Advaith

Abstract:

In cases where the use of real data is limited, for example when it is hard to get access to a large volume of real data, synthetic data generation is needed. This produces high-quality synthetic data while maintaining the statistical properties of a specific dataset. In the present work, a generative adversarial network (GAN) is trained to produce multivariate time series (MTS) data, since MTS data are now being gathered more often in various real-world systems. Furthermore, the GAN-generated MTS data are fed into a transformer-based deep learning architecture that carries out the categorization of the data into predefined classes. The model is further evaluated across various distinct domains by generating the corresponding MTS data.
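
A compact sketch of a transformer-based classifier for multivariate time series in PyTorch, with random tensors standing in for GAN-generated MTS data; the layer sizes, sequence length and class count are arbitrary assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class MTSTransformerClassifier(nn.Module):
    def __init__(self, n_features, n_classes, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)   # embed each time step
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                                   dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                  # x: (batch, seq_len, n_features)
        # Positional encoding is omitted here for brevity.
        z = self.encoder(self.input_proj(x))
        return self.head(z.mean(dim=1))    # mean-pool over time, then classify

# Random tensors stand in for GAN-generated multivariate time series.
x = torch.randn(32, 50, 8)                 # 32 series, 50 time steps, 8 variables
y = torch.randint(0, 3, (32,))             # 3 predefined classes

model = MTSTransformerClassifier(n_features=8, n_classes=3)
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
print(f"logits shape: {tuple(model(x).shape)}, initial loss: {loss.item():.3f}")
```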

Keywords: GAN, transformer, classification, multivariate time series

Procedia PDF Downloads 123
27659 Encapsulation of Volatile Citronella Essential oil by Coacervation: Efficiency and Release Kinetic Study

Authors: Rafeqah Raslan, Mastura AbdManaf, Junaidah Jai, Istikamah Subuki, Ana Najwa Mustapa

Abstract:

Volatile citronella essential oil was encapsulated by simple coacervation and by complex coacervation using gum Arabic and gelatin as wall materials, with glutaraldehyde as the crosslinking agent. A citronella standard calibration graph was developed, with R² equal to 0.9523, for the accurate determination of encapsulation efficiency and for the release study. The release kinetics were analyzed based on Fick's law of diffusion for polymeric systems, and a linear graph of log fractional release against log time was constructed to determine the release rate constant, k, and the diffusion coefficient, n. Both coacervation methods in the present study produced encapsulation efficiencies of around 94%. Analysis of the capsule morphology supported the release kinetic mechanisms of the produced capsules for both coacervation processes.
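
The log-log fit described here is a straight line, log(Mt/M∞) = log k + n·log t; the minimal sketch below, with invented release data, shows how k and n are recovered from the slope and intercept.

```python
import numpy as np

# Invented cumulative fractional release data (Mt/Minf) at sampling times (h).
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0])
fraction_released = np.array([0.12, 0.18, 0.27, 0.40, 0.58, 0.70])

# Linear regression of log(Mt/Minf) against log(t): slope = n, intercept = log10(k).
slope, intercept = np.polyfit(np.log10(t), np.log10(fraction_released), 1)
n = slope
k = 10 ** intercept

print(f"release rate constant k = {k:.3f}, exponent n = {n:.2f}")
```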

Keywords: simple coacervation, complex coacervation, encapsulation efficiency, release kinetic study

Procedia PDF Downloads 312
27658 Generative AI: A Comparison of Conditional Tabular Generative Adversarial Networks and Conditional Tabular Generative Adversarial Networks with Gaussian Copula in Generating Synthetic Data with Synthetic Data Vault

Authors: Lakshmi Prayaga, Chandra Prayaga, Aaron Wade, Gopi Shankar Mallu, Harsha Satya Pola

Abstract:

Synthetic data generated by generative adversarial networks and autoencoders is becoming more common as a way to combat the problem of insufficient data for research purposes. However, generating synthetic data is a tedious task requiring an extensive mathematical and programming background. Open-source platforms such as the Synthetic Data Vault (SDV) and Mostly AI offer platforms that are user-friendly and accessible to non-technical professionals for generating synthetic data to augment existing data for further analysis. The SDV also provides additions to the generic GAN, such as the Gaussian copula. We present the results from two synthetic datasets (CTGAN data and CTGAN with Gaussian copula) generated by the SDV and report the findings. The results indicate that the ROC and AUC curves for the data generated by adding the Gaussian copula layer are much higher than those for the data generated by CTGAN alone.
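
The SDV API itself is not reproduced here; instead, the sketch below shows the kind of downstream comparison the abstract reports: train the same classifier on two different synthetic training sets (standing in for the CTGAN and CTGAN-with-Gaussian-copula outputs), evaluate both on held-out "real" data, and compare ROC AUC. All data in this sketch are simulated.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

# Held-out "real" data used only for evaluation.
X_real, y_real = make_classification(n_samples=500, n_features=10, random_state=7)

# Two synthetic training sets standing in for the two generators; the second is
# simulated as being closer to the real distribution (less added noise).
X_syn_a, y_syn_a = make_classification(n_samples=500, n_features=10, random_state=1)
X_syn_a = X_syn_a + rng.normal(0, 1.0, X_syn_a.shape)   # "CTGAN" stand-in, noisier
X_syn_b, y_syn_b = make_classification(n_samples=500, n_features=10, random_state=7)
X_syn_b = X_syn_b + rng.normal(0, 0.3, X_syn_b.shape)   # "CTGAN + Gaussian copula" stand-in

for name, (Xs, ys) in {"CTGAN-like": (X_syn_a, y_syn_a),
                       "CTGAN+copula-like": (X_syn_b, y_syn_b)}.items():
    clf = LogisticRegression(max_iter=1000).fit(Xs, ys)
    auc = roc_auc_score(y_real, clf.predict_proba(X_real)[:, 1])
    print(f"{name:20s} AUC on real data = {auc:.3f}")
```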

Keywords: synthetic data generation, generative adversarial networks, conditional tabular GAN, Gaussian copula

Procedia PDF Downloads 73
27657 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life

Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar

Abstract:

In recent years, owing to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. Through its continuous progress and the ongoing development of computing expertise, artificial intelligence has given rise to another branch, distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by the ability to take advantage of large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large datasets. Nowadays, distributed AI is widely used in military, medical, and everyday applications, bringing great convenience and efficient operation. In this paper, we discuss three application areas of distributed AI computing systems (vision processing, blockchain, and smart homes) to introduce the performance of distributed systems and the role of AI within them.

Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home

Procedia PDF Downloads 105
27656 Topography Effects on Wind Turbines Wake Flow

Authors: H. Daaou Nedjari, O. Guerri, M. Saighi

Abstract:

A numerical study was conducted to optimize the positioning of wind turbines over complex terrain. A two-dimensional disk model was used to calculate the flow velocity deficit in wind farms for both flat and complex configurations. The wind turbine wake was assessed using hybrid methods that combine CFD (Computational Fluid Dynamics) with the actuator disc model. The wind turbine rotor was represented by a thrust force, coupled with the Navier-Stokes equations, which were solved by an open-source computational code (Code_Saturne V3.0, developed by EDF). The simulations were conducted under atmospheric boundary layer conditions for a two-dimensional region located in the north of Algeria at 36.74°N latitude, 02.97°E longitude. The topography elevation values were collected along a longitudinal direction of 1 km downwind. The wind turbine sited over topography was simulated for different elevation variations. The main aim of this study is to determine the effect of topography on the behavior of wind farm wake flow. For this, the wake model applied to complex terrain first needs to isolate the singularity effects of topography on the vertical wind flow without the rotor disc. This step allows the existence of mixing scales and friction force zones near the ground to be determined. Depending on the ground relief, the wind flow was disturbed by turbulence and significant speed variation. The singularities of the velocity field were thus thoroughly collected, and the thrust coefficient Ct was calculated using the specific speed. In addition, to evaluate the effect of the land on the wake shape, the flow field was also simulated for different rotor hub heights. The distance between the ground and the turbine hub height (Hhub) was tested in flat terrain for Hhub = 1.125D, Hhub = 1.5D and Hhub = 2D (D is the rotor diameter), considering a roughness value of z0 = 0.01 m. This study demonstrated that topography induces a significant effect on wind turbine wakes compared to flat terrain.
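
The actuator disc approach represents the rotor by a thrust force; the sketch below shows the underlying one-dimensional momentum relations used to turn a thrust coefficient into an induction factor and an idealised far-wake velocity deficit. All numbers are illustrative flat-terrain values, not the Code_Saturne setup.

```python
import numpy as np

def induction_factor(ct):
    # 1-D momentum theory: Ct = 4 a (1 - a), valid for a below roughly 0.4.
    return 0.5 * (1.0 - np.sqrt(1.0 - ct))

U_inf = 8.0          # freestream wind speed (m/s), illustrative
rho = 1.225          # air density (kg/m^3)
D = 80.0             # rotor diameter (m)
ct = 0.75            # thrust coefficient, illustrative

a = induction_factor(ct)
area = np.pi * (D / 2) ** 2
thrust = 0.5 * rho * area * ct * U_inf ** 2       # force imposed by the actuator disc
u_disc = U_inf * (1 - a)                          # velocity at the disc
u_wake = U_inf * (1 - 2 * a)                      # idealised far-wake velocity

print(f"a = {a:.3f}, thrust = {thrust / 1e3:.1f} kN")
print(f"velocity: disc = {u_disc:.2f} m/s, idealised far wake = {u_wake:.2f} m/s "
      f"(deficit {100 * 2 * a:.1f}%)")
```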

Keywords: CFD, wind turbine wake, k-epsilon model, turbulence, complex topography

Procedia PDF Downloads 559
27655 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts. Thus, complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex shaped workpieces. Due to variations between the nominal model and the actual geometry, this can lead to changes in operations in computer-aided process planning (CAPP) to make CAPP manageable for an adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by finding objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, this kind of decision depends on the experience-based knowledge of humans (e.g. process planners) and results in subjective decisions, leading to variability in workpiece quality and potential failure in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining, and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes. Especially in applications like the aerospace industry, standardization and certification of processes are an important aspect. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, will be used for modeling and execution of inspection workflows. Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, will be carried out by function blocks. One advantage of this approach is its flexibility to design workflows and to adapt algorithms specific to the application domain. In general, it is checked whether a geometrical adaptation is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g. design data. Furthermore, for different product lifecycle phases, appropriate logics and decision criteria have to be considered. For example, tolerances for geometric deviations are different in type and size for new-part production compared to repair processes. In addition to function blocks, appropriate referencing systems are important. They need to support exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, encompassing new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data.
In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
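
A toy sketch of the workflow idea, with each analysis step as a small function block chained into an inspection pipeline that ends in an objective adapt/reject decision. The steps, alignment logic and tolerance value are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class InspectionContext:
    nominal: list          # nominal (design) coordinates
    measured: list         # measured preform coordinates
    tolerance: float       # allowed geometric deviation (illustrative value)
    log: list = field(default_factory=list)

def align_measurement(ctx):
    # Placeholder referencing step: here it simply removes a constant offset.
    offset = sum(m - n for n, m in zip(ctx.nominal, ctx.measured)) / len(ctx.nominal)
    ctx.measured = [m - offset for m in ctx.measured]
    ctx.log.append(f"aligned measurement, offset removed: {offset:.3f}")
    return ctx

def check_deviation(ctx):
    worst = max(abs(m - n) for n, m in zip(ctx.nominal, ctx.measured))
    ctx.adaptable = worst <= ctx.tolerance
    ctx.log.append(f"max deviation after alignment: {worst:.3f}")
    return ctx

def decide(ctx):
    ctx.log.append("decision: " + ("adaptive machining possible" if ctx.adaptable
                                   else "out of tolerance, reject or repair"))
    return ctx

workflow = [align_measurement, check_deviation, decide]   # function blocks in execution order

ctx = InspectionContext(nominal=[0.0, 1.0, 2.0, 3.0],
                        measured=[0.12, 1.11, 2.09, 3.13],
                        tolerance=0.05)
for block in workflow:
    ctx = block(ctx)
print("\n".join(ctx.log))
```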

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 294
27654 Audio-Visual Aids and the Secondary School Teaching

Authors: Shrikrishna Mishra, Badri Yadav

Abstract:

In today's complex society, where experiences are innumerable and varied, it is not possible to present every situation in its original colours, and so opportunities for learning through actual experience are not always available. It is only through the use of proper audio-visual aids that life situations can be brought into the classroom by an enlightened teacher in their simplest form, representing the original to the highest degree of similarity, something entirely absent from the verbal or lecture method. In the presence of audio-visual aids, attention is attracted, interest is aroused and a suitable atmosphere for proper understanding is created automatically, whereas in the traditional method much greater effort is needed to achieve these essential requisites. In spite of the best and most sincere efforts on the part of the teacher, the net effect on understanding or learning in general is then quite negligible.

Keywords: audio-visual aids, secondary school teaching, complex society, audio

Procedia PDF Downloads 477
27653 Neuropsychological Testing in a Multi-Lingual Society: Normative Data for South African Adults in More Than Eight Languages

Authors: Sharon Truter, Ann B. Shuttleworth-Edwards

Abstract:

South Africa is a developing country with significant diversity in languages spoken and quality of education available, creating challenges for fair and accurate neuropsychological assessments when most available neuropsychological tests are obtained from English-speaking developed countries. The aim of this research was to compare normative data on a spectrum of commonly used neuropsychological tests for English- and Afrikaans-speaking South Africans with relatively high quality of education and South Africans with relatively low quality of education who speak Afrikaans, Sesotho, Setswana, Sepedi, Tsonga, Venda, Xhosa or Zulu. The participants were all healthy adults aged 18-60 years, with 8-12 years of education. All the participants were tested in their first language on the following tests: two non-verbal tests (Rey Osterrieth Complex Figure Test and Bell Cancellation Test), four verbal fluency tests (category, phonemic, verb and 'any words'), one verbal learning test (Rey Auditory Verbal Learning Test) and three tests that have a verbal component (Trail Making Test A & B, Symbol Digit Modalities Test and Digit Span). Descriptive comparisons of mean scores and standard deviations across the language groups and between the groups with relatively high versus low quality of education highlight the importance of using normative data that takes into account language and quality of education.

Keywords: cross-cultural, language, multi-lingual, neuropsychological testing, quality of education

Procedia PDF Downloads 160
27652 A Privacy Protection Scheme Supporting Fuzzy Search for NDN Routing Cache Data Name

Authors: Feng Tao, Ma Jing, Guo Xian, Wang Jing

Abstract:

Named Data Networking (NDN) replaces the IP addresses of traditional networks with data names and adopts a dynamic cache mechanism. In the existing mechanism, however, only one-to-one search can be achieved, because every data item has a unique name corresponding to it. There is a certain mapping relationship between data content and data name, so if the data name is intercepted by an adversary, the privacy of the data content and of the user's interest can hardly be guaranteed. To solve this problem, this paper proposes a one-to-many fuzzy search scheme based on order-preserving encryption, which reduces the query overhead by optimizing the caching strategy. In this scheme, hash values are used to keep the user's query, as well as the privacy of the requested data content, safe from every node in the search process.

Keywords: NDN, order-preserving encryption, fuzzy search, privacy

Procedia PDF Downloads 478
27651 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data in an observation can occur due to unusual circumstances. Such data can provide important information that other data cannot, so their occurrence needs to be investigated further. One method for obtaining extreme data is the block maxima method, and the distribution of extreme data sets taken with the block maxima method is called the extreme value distribution. Here, the extreme value distribution is the Gumbel distribution with two parameters. Exact maximum likelihood (ML) estimates of the Gumbel parameters are difficult to determine analytically, so a numerical approach is needed. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the change in parameter values at each iteration; it is then modified by the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating an approximation of the second derivative at each iteration. The parameter estimation of the Gumbel distribution by the numerical quasi-Newton BFGS approach is done by calculating the parameter values that maximize the likelihood function. The method requires the gradient vector and the Hessian matrix. This research is a theoretical and applied study based on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and the estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the heavy rainfall that occurred in Purworejo District has decreased and that the range of rainfall has also decreased.
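
A minimal numerical illustration of the approach: maximise the Gumbel log-likelihood (equivalently, minimise its negative) with the BFGS quasi-Newton routine in SciPy. Simulated block maxima are used here instead of the Purworejo rainfall data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

# Simulated block maxima standing in for the rainfall data.
data = gumbel_r.rvs(loc=50.0, scale=12.0, size=200, random_state=0)

def neg_log_likelihood(params):
    mu, beta = params
    if beta <= 0:
        return np.inf
    z = (data - mu) / beta
    # Minus the Gumbel log-likelihood: sum of log(beta) + z + exp(-z).
    return np.sum(np.log(beta) + z + np.exp(-z))

start = np.array([data.mean(), data.std()])        # crude starting values
result = minimize(neg_log_likelihood, start, method="BFGS")

mu_hat, beta_hat = result.x
print(f"estimated location mu = {mu_hat:.2f}, scale beta = {beta_hat:.2f}")
print(f"converged: {result.success}, iterations: {result.nit}")
```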

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 318
27650 Critical Design Futures: A Foresight 3.0 Approach to Business Transformation and Innovation

Authors: Nadya Patel, Jawn Lim

Abstract:

Foresight 3.0 is a synergistic methodology that encompasses systems analysis, future studies, capacity building, and forward planning. These components are interconnected, fostering a collective anticipatory intelligence that promotes societal resilience (Ravetz, 2020). However, traditional applications of these strands can often fall short, leading to missed opportunities and narrow perspectives. Therefore, Foresight 3.0 champions a holistic approach to tackling complex issues, focusing on systemic transformations and power dynamics. Businesses are pivotal in preparing the workforce for an increasingly uncertain and complex world. This necessitates the adoption of innovative tools and methodologies, such as Foresight 3.0, that can better equip young employees to anticipate and navigate future challenges. Firstly, the incorporation of its methodology into workplace training can foster a holistic perspective among employees. This approach encourages employees to think beyond the present and consider wider social, economic, and environmental contexts, thereby enhancing their problem-solving skills and resilience. This paper discusses our research on integrating Foresight 3.0's transformative principles with a newly developed Critical Design Futures (CDF) framework to equip organisations with the ability to innovate for the world's most complex social problems. This approach is grounded in 'collective forward intelligence,' enabling mutual learning, co-innovation, and co-production among a diverse stakeholder community, where business transformation and innovation are achieved.

Keywords: business transformation, innovation, foresight, critical design

Procedia PDF Downloads 74
27649 Model Order Reduction Using Hybrid Genetic Algorithm and Simulated Annealing

Authors: Khaled Salah

Abstract:

Model order reduction has been one of the most challenging topics in recent years. In this paper, a hybrid of the genetic algorithm (GA) and the simulated annealing algorithm (SA) is used to approximate high-order transfer functions (TFs) by lower-order TFs. The hybrid algorithm is applied to model order reduction with two important issues in mind: improving accuracy and preserving the properties of the original model, both of which matter for improving the performance of simulation and computation and for maintaining the behavior of the original complex models being reduced. Compared to conventional mathematical methods used to obtain reduced-order models of high-order complex models, our proposed method provides better results in terms of reduced run-time. Thus, the proposed technique could be used in electronic design automation (EDA) tools.
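
The sketch below illustrates only the optimisation formulation: the coefficients of a reduced-order transfer function are searched stochastically (a plain simulated-annealing-style loop, not the authors' hybrid GA/SA) so that its frequency response matches a given higher-order model. The example transfer functions are invented.

```python
import numpy as np
from scipy.signal import freqs

# Example 4th-order transfer function: 24 / ((s+1)(s+2)(s+3)(s+4)).
num_full = [24.0]
den_full = [1.0, 10.0, 35.0, 50.0, 24.0]

w = np.logspace(-1, 2, 200)
_, h_full = freqs(num_full, den_full, worN=w)

def cost(params):
    # Reduced 2nd-order candidate: k / (s^2 + a1 s + a0).
    k, a1, a0 = params
    _, h_red = freqs([k], [1.0, a1, a0], worN=w)
    return np.mean(np.abs(h_full - h_red) ** 2)

rng = np.random.default_rng(3)
current = np.array([1.0, 1.0, 1.0])
current_cost = cost(current)
temperature = 1.0
for step in range(3000):                      # simulated-annealing-style search
    candidate = current + rng.normal(0, 0.05, size=3)
    c = cost(candidate)
    if c < current_cost or rng.random() < np.exp((current_cost - c) / temperature):
        current, current_cost = candidate, c
    temperature *= 0.999                      # simple cooling schedule

print(f"reduced model: {current[0]:.3f} / (s^2 + {current[1]:.3f} s + {current[2]:.3f})")
print(f"frequency-response MSE = {current_cost:.5f}")
```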

Keywords: genetic algorithm, simulated annealing, model reduction, transfer function

Procedia PDF Downloads 140
27648 Data Disorders in Healthcare Organizations: Symptoms, Diagnoses, and Treatments

Authors: Zakieh Piri, Shahla Damanabi, Peyman Rezaii Hachesoo

Abstract:

Introduction: Healthcare organizations, like other organizations, suffer from a number of disorders such as Business Sponsor Disorder, Business Acceptance Disorder, Cultural/Political Disorder, Data Disorder, etc. As quality in healthcare mostly depends on the quality of data, we aimed to identify data disorders and their symptoms in two teaching hospitals. Methods: Using a self-constructed questionnaire, we asked 20 questions related to the quality and usability of patient data stored in patient records. The research population consisted of 150 managers, physicians, nurses, and medical record staff who were working at the time of the study. We also asked their views about the symptoms of and treatments for any data disorders they mentioned in the questionnaire, and analyzed the answers using qualitative methods. Results: After classifying the answers, we found six main data disorders: incomplete data, missed data, late data, blurred data, manipulated data, and illegible data. The majority of participants believed they have important roles in treating data disorders, while others attributed the problems to the health system. Discussion: As clinicians have important roles in producing data, they can easily identify the symptoms and disorders of patient data. Health information managers can also play important roles in the early detection of data disorders through proactive monitoring and periodic check-ups of the data.

Keywords: data disorders, quality, healthcare, treatment

Procedia PDF Downloads 427