Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28304

28184 Reliability Assessment and Failure Detection in a Complex Human-Machine System Using Agent-Based and Human Decision-Making Modeling

Authors: Sanjal Gavande, Thomas Mazzuchi, Shahram Sarkani

Abstract:

In a complex aerospace operational environment, identifying failures in a procedure involving multiple human-machine interactions is difficult. These failures could lead to accidents causing loss of hardware or human life. The likelihood of failure further increases if operational procedures are tested for a novel system with multiple human-machine interfaces and with no prior performance data. The existing approach in the literature, reviewing complex operational tasks in flowchart or tabular form, provides no insight into potential system failures arising from human decision-making ability. To address these challenges, this research explores an agent-based simulation approach for reliability assessment and fault detection in complex human-machine systems while utilizing a human decision-making model. The simulation will predict the emergent behavior of the system due to the interaction between humans and their decision-making capability with the varying states of the machine, and vice versa. Overall system reliability will be evaluated based on a defined set of success-criteria conditions and the number of recorded failures over an assigned number of Monte Carlo runs. The study also aims to identify high-likelihood failure locations for the system. The research concludes that system reliability and failures can be effectively calculated when individual human and machine agent states are clearly defined. This research is limited to the operations phase of a system lifecycle process in an aerospace environment only. Further exploration of the proposed agent-based and human decision-making model will be required to allow for a greater understanding of this topic for application outside of the operations domain.
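
The reliability figure described above reduces, at its core, to counting failed runs against defined success criteria over many Monte Carlo repetitions. A minimal sketch of that counting step, with hypothetical per-step human-error and machine-fault probabilities standing in for the paper's agent and decision-making models:

```python
# Minimal sketch (not the authors' model): estimating system reliability from
# repeated simulated runs. Agent logic, probabilities and step count are
# placeholders for the human-decision and machine-state models in the paper.
import random

def run_procedure(p_human_error=0.02, p_machine_fault=0.01, steps=20):
    """One simulated run of a multi-step human-machine procedure."""
    for _ in range(steps):
        if random.random() < p_human_error:    # human agent makes a wrong decision
            return False
        if random.random() < p_machine_fault:  # machine agent enters a fault state
            return False
    return True                                # all success criteria met

def estimate_reliability(n_runs=10_000):
    failures = sum(not run_procedure() for _ in range(n_runs))
    return 1.0 - failures / n_runs             # reliability = 1 - failure fraction

print(f"Estimated reliability: {estimate_reliability():.4f}")
```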

Keywords: agent-based model, complex human-machine system, human decision-making model, system reliability assessment

Procedia PDF Downloads 164
28183 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the “bundle of rights” theory, this paper establishes specific three-level data rights. This paper analyzes the cases Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz. The paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will help establish the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 31
28182 A Fast Community Detection Algorithm

Authors: Chung-Yuan Huang, Yu-Hsiang Fu, Chuen-Tsai Sun

Abstract:

Community detection represents an important data-mining tool for analyzing and understanding real-world complex network structures and functions. We believe that at least four criteria determine the appropriateness of a community detection algorithm: (a) it produces usable normalized mutual information (NMI) and modularity results for social networks, (b) it overcomes resolution limitation problems associated with synthetic networks, (c) it produces good NMI results and performance efficiency for Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks, and (d) it produces good modularity and performance efficiency for large-scale real-world complex networks. To our knowledge, no existing community detection algorithm meets all four criteria. In this paper, we describe a simple hierarchical arc-merging (HAM) algorithm that uses network topologies and rule-based arc-merging strategies to identify community structures that satisfy the criteria. We used five well-studied social network datasets and eight sets of LFR benchmark networks to validate the correctness of HAM against ground-truth communities, eight large-scale real-world complex networks to measure its performance efficiency, and two synthetic networks to determine its susceptibility to resolution limitation problems. Our results indicate that the proposed HAM algorithm provides satisfactory performance efficiency and that HAM-identified communities are close to ground-truth communities in social and LFR benchmark networks while overcoming resolution limitation problems.
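
HAM itself is not specified in enough detail in the abstract to reproduce, but the two evaluation criteria it leans on, NMI against ground-truth labels and modularity, are standard and easy to sketch. A minimal example with a stock greedy-modularity method standing in for HAM:

```python
# Sketch of the evaluation criteria only; the HAM algorithm is not reproduced
# here, a built-in greedy-modularity method stands in for it.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity
from sklearn.metrics import normalized_mutual_info_score

G = nx.karate_club_graph()                        # a well-studied social network
truth = [G.nodes[n]["club"] for n in G]           # ground-truth community labels

communities = greedy_modularity_communities(G)    # placeholder for HAM
labels = [0] * G.number_of_nodes()
for cid, com in enumerate(communities):
    for n in com:
        labels[n] = cid

print("modularity:", modularity(G, communities))
print("NMI vs. ground truth:", normalized_mutual_info_score(truth, labels))
```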

Keywords: complex network, social network, community detection, network hierarchy

Procedia PDF Downloads 220
28181 Electron Impact Ionization Cross-Sections for e-C₅H₅N₅ Scattering

Authors: Manoj Kumar

Abstract:

Ionization cross sections of molecules due to electron impact play an important role in chemical processes in various branches of applied physics, such as radiation chemistry, gas discharges, plasma etching in semiconductors, planetary upper atmospheric physics, mass spectrometry, etc. In the present work, we have calculated the total ionization cross sections for Adenine (C₅H₅N₅), a biologically important molecule, by electron impact in the incident electron energy range from the ionization threshold to 2 keV, employing the well-known Jain-Khare semiempirical formulation based on Bethe and Møller cross sections. In the absence of experimental results, the present results are in good qualitative and quantitative agreement with available theoretical results. The present results strengthen our confidence for further investigation of complex bio-molecules with better accuracy. Moreover, the present method can deduce reliable cross-section data for complex targets with adequate accuracy and may facilitate the incorporation of calculated cross-sections into atomic and molecular cross-section data sets for modeling codes and other applications.

Keywords: electron impact ionization cross-sections, oscillator strength, Jain-Khare semiempirical approach

Procedia PDF Downloads 109
28180 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering

Authors: K. Umbleja, M. Ichino

Abstract:

Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry-specific details should remain hidden. Symbolic data mining is quickly gaining popularity as datasets in need of analysis become ever larger. One type of such symbolic data is a histogram, which makes it possible to store huge amounts of information in a single variable with a high level of granularity. Other types of symbolic data can also be described as histograms, making the histogram a very important and general symbolic data type: a method developed for histograms can also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method for comparing two histogram-valued variables and thereby finding the dissimilarity between two histograms. The proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which allows histograms with different numbers of bins and bin widths (so-called general histograms) to be compared. The proposed dissimilarity measure is then used as a measure for clustering. Furthermore, a linkage method based on weighted averages is proposed, together with the concept of cluster compactness to measure the quality of clustering. The method is then validated by application to real datasets. As a result, the proposed dissimilarity measure is found to produce adequate and comparable results for general histograms without loss of detail or the need to transform the data.
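
The paper's Ichino-Yaguchi-based measure is not given in the abstract, so the sketch below substitutes a simple CDF-based distance over a common bin refinement; it only illustrates the two ingredients described, comparing histograms with different bin edges and feeding the resulting dissimilarities into hierarchical clustering:

```python
# Sketch only: a stand-in dissimilarity (area between cumulative distributions
# on a common bin refinement), not the measure proposed in the paper.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def hist_dissimilarity(edges_a, freq_a, edges_b, freq_b):
    grid = np.union1d(edges_a, edges_b)               # common refinement of bin edges
    cdf_a = np.interp(grid, edges_a, np.concatenate(([0.0], np.cumsum(freq_a) / np.sum(freq_a))))
    cdf_b = np.interp(grid, edges_b, np.concatenate(([0.0], np.cumsum(freq_b) / np.sum(freq_b))))
    diff, gaps = np.abs(cdf_a - cdf_b), np.diff(grid)
    return float(np.sum(0.5 * (diff[:-1] + diff[1:]) * gaps))   # trapezoid rule

# three toy "general histograms" with different numbers of bins and bin widths
hists = [
    (np.array([0.0, 1.0, 2.0, 4.0]), np.array([5, 3, 2])),
    (np.array([0.0, 2.0, 3.0, 4.0]), np.array([6, 2, 2])),
    (np.array([2.0, 3.0, 5.0, 6.0]), np.array([1, 4, 5])),
]
n = len(hists)
condensed = [hist_dissimilarity(*hists[i], *hists[j])
             for i in range(n) for j in range(i + 1, n)]         # condensed distance matrix
Z = linkage(condensed, method="average")                         # the paper proposes its own linkage
print(fcluster(Z, t=2, criterion="maxclust"))                    # cluster assignments
```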

Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis

Procedia PDF Downloads 156
28179 Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform

Authors: Dogaru-Ulieru Valentin, Sălișteanu Ioan Corneliu, Ardeleanu Mihăiță Nicolae, Broscăreanu Ștefan, Sălișteanu Bogdan, Mihai Mihail

Abstract:

The SiaMOTO system is a communications and data processing platform for vehicle traffic. The human factor is the most important factor in the generation of these data, as the driver is the one who dictates the trajectory of the vehicle. Like any trajectory, it is described by specific parameters: position, speed and acceleration. Constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions for its occurrence are known. The human factor is predominant in deciding the trajectory parameters of the vehicle on the road, so monitoring it through the events reported by the DiaMOTO device over time will generate a guide for targeting potentially high-risk driving behavior and rewarding those who control the driving phenomenon well. In this paper, we have focused on detailing the communication infrastructure between the DiaMOTO device and the traffic data collection server, the infrastructure through which the database that will be used for complex AI/DLM analysis is built. The central element of this description is the data string in CODEC-8 format sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented at the experimental model stage, by installing DiaMOTO devices with unique codes, integrating ADAS and GPS functions, on 50 vehicles, through which vehicle trajectories can be monitored 24 hours a day.
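
The exact field layout of the DiaMOTO data string is not given in the abstract; the sketch below assumes the widely documented Teltonika-style Codec 8 framing purely as an illustration of how a collection server might decode the GPS element of one AVL record, so all offsets and scalings should be treated as assumptions:

```python
# Illustrative decoding of one AVL record, assuming a Teltonika-style Codec 8
# layout (timestamp, priority, GPS element). The actual DiaMOTO payload may
# differ; a full frame also carries a preamble, data length, record counts and
# a CRC, plus IO elements after the GPS block, none of which are parsed here.
import struct
from datetime import datetime, timezone

def parse_gps_record(record: bytes) -> dict:
    ts_ms, priority = struct.unpack_from(">QB", record, 0)            # 8 + 1 bytes
    lon, lat, alt, angle, sats, speed = struct.unpack_from(">iiHHBH", record, 9)
    return {
        "time": datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc),
        "priority": priority,
        "longitude": lon / 1e7,       # degrees, assumed 1e-7 scaling
        "latitude": lat / 1e7,        # degrees, assumed 1e-7 scaling
        "altitude_m": alt,
        "heading_deg": angle,
        "satellites": sats,
        "speed_kmh": speed,
    }

# hypothetical record built only to exercise the parser
demo = struct.pack(">QBiiHHBH", 1_700_000_000_000, 1, 1_204_000_000, 101_000_000, 35, 180, 9, 60)
print(parse_gps_record(demo))
```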

Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring

Procedia PDF Downloads 72
28178 Application of Universal Distribution Factors for Real-Time Complex Power Flow Calculation

Authors: Abdullah M. Alodhaiani, Yasir A. Alturki, Mohamed A. Elkady

Abstract:

Complex power flow distribution factors, which relate line complex power flows to the bus injected complex powers, have been widely used in various power system planning and analysis studies. In particular, AC distribution factors have been used extensively in recent power and energy pricing studies in the free electricity market field. As demonstrated in the existing literature, many of the electricity market related costing studies rely on the use of distribution factors. These known distribution factors, whether injection shift factors (ISFs) or power transfer distribution factors (PTDFs), are linear approximations of the first-order sensitivities of the active power flows with respect to various variables. This paper presents a novel model for evaluating universal distribution factors (UDFs), which are appropriate for an extensive range of power system analysis and free electricity market studies. These distribution factors are used for the calculation of line complex power flows, are independent of bus power injections, and are compact matrix-form expressions with total flexibility in determining the position on the line at which line flows are measured. The proposed approach was tested on the IEEE 9-bus system. Numerical results demonstrate that the proposed approach is very accurate compared with the exact method.
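
The proposed universal (complex) factors cannot be reconstructed from the abstract alone; the sketch below only illustrates the conventional DC-approximation ISF/PTDF computation that the paper takes as its starting point, on a toy three-bus system with bus 0 as the slack:

```python
# Conventional (not "universal") distribution factors in the DC approximation:
# line flows f = ISF @ P, with ISF = Bf * inv(Bbus) after removing the slack.
import numpy as np

lines = [(0, 1, 0.1), (1, 2, 0.2), (0, 2, 0.25)]    # (from, to, reactance), toy 3-bus system
n_bus = 3

Bf = np.zeros((len(lines), n_bus))                  # line flow = Bf @ theta
Bbus = np.zeros((n_bus, n_bus))                     # nodal susceptance matrix
for l, (i, j, x) in enumerate(lines):
    b = 1.0 / x
    Bf[l, i], Bf[l, j] = b, -b
    Bbus[[i, j], [i, j]] += b                       # diagonal terms
    Bbus[i, j] -= b
    Bbus[j, i] -= b

isf = np.zeros((len(lines), n_bus))                 # slack column stays zero
isf[:, 1:] = Bf[:, 1:] @ np.linalg.inv(Bbus[1:, 1:])

# PTDF for a 1 p.u. transfer from bus 2 to bus 0 (the slack):
ptdf_2_to_0 = isf[:, 2] - isf[:, 0]
print(ptdf_2_to_0)                                  # per-line share of the transfer
```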

Keywords: distribution factors, power system, sensitivity factors, electricity market

Procedia PDF Downloads 467
28177 Flow Prediction of Boundary Shear Stress with Enlarging Flood Plains

Authors: Spandan Sahu, Amiya Kumar Pati, Kishanjit Kumar Khatua

Abstract:

Rivers, our main source of water, are a form of open channel flow, and flow in open channels presents many complex phenomena that need to be tackled, such as critical flow conditions, boundary shear stress and depth-averaged velocity. During floods, part of a river is carried by the simple main channel and the rest is carried by the flood plains. For such compound asymmetric channels, the flow structure becomes complicated due to momentum exchange between the main channel and the adjoining flood plains. The distribution of boundary shear in subsections provides us with the concept of momentum transfer across the interface between the main channel and the flood plains. Obtaining accurate data experimentally is very difficult because of the complexity of the problem. Hence, the CES software has been used to tackle the complex processes and determine the shear stresses at different sections of an open channel having asymmetric flood plains on both sides of the main channel, and the results are compared with those for symmetric flood plains for various geometrical shapes and flow conditions. Error analysis is also performed to assess the degree of accuracy of the implemented model.

Keywords: depth average velocity, non prismatic compound channel, relative flow depth, velocity distribution

Procedia PDF Downloads 146
28176 A Network Approach to Analyzing Financial Markets

Authors: Yusuf Seedat

Abstract:

The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable approach to studying the way complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge an understanding of the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.
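
As an illustration of the construction implied here (not the paper's actual pipeline), one can build a graph whose nodes are stocks and whose edges are strong return correlations, then apply standard centrality and community-detection tools; the tickers, factor structure and 0.5 threshold below are arbitrary:

```python
# Toy correlation network over synthetic returns; real data and the paper's
# specific metrics are not used here.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
tickers = ["A", "B", "C", "D", "E"]
factor1 = rng.normal(size=(250, 1))
factor2 = rng.normal(size=(250, 1))
group1 = 0.8 * factor1 + 0.4 * rng.normal(size=(250, 3))   # A, B, C co-move
group2 = 0.8 * factor2 + 0.4 * rng.normal(size=(250, 2))   # D, E co-move
returns = np.hstack([group1, group2])                      # stand-in for daily returns
corr = np.corrcoef(returns, rowvar=False)

G = nx.Graph()
G.add_nodes_from(tickers)
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) > 0.5:                          # keep only strong co-movements
            G.add_edge(tickers[i], tickers[j], weight=corr[i, j])

print("degree centrality:", nx.degree_centrality(G))
print("communities:", [sorted(c) for c in greedy_modularity_communities(G)])
```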

Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks

Procedia PDF Downloads 187
28175 A Sociological Investigation on the Population and Public Spaces of Nguyen Cong Tru, a Soviet-Style Collective Housing Complex in Hanoi in Regards to Its New Community-Focused Architectural Design

Authors: Duy Nguyen Do, Bart Julien Dewancker

Abstract:

Many Soviet-style collective housing complexes (also known as KTT) have been built in Hanoi since the 1960s to support the post-war population growth. Those low-rise buildings have created well-knitted, robust communities, so much so that in most complexes, all families in one housing block know each other, interact occasionally and provide support when needed. To understand how the communities of collective housing complexes have developed and been maintained, so that their advantages can be adapted into modern housing designs, the study is carried out on the site of Nguyen Cong Tru KTT. This is one of the oldest KTT in Hanoi, completed in 1954. The complex also has a unique characteristic that is closely related to its community: the symbiotic relationship with Hom – a flea market that has been co-developing with Nguyen Cong Tru KTT since its beginning. The research consists of three phases: the first phase is a sociological investigation of Nguyen Cong Tru KTT's current residents and a site survey of the complex's economic and architectural characteristics. In the second phase, the collected data are analyzed to find out residents' opinions of the KTT concerning their satisfaction with the current housing status, floor plan organization, community, the relationship between the KTT's dedicated public spaces and the flea market, and their usage. Simultaneously, the master plan and the gathered information regarding the current architectural characteristics of the complex are also inspected. In the third phase, the results of the analyses will provide information regarding the issues, positive trends and significant historical features of the complex's architecture in order to generate suitable proposals for the redesigning project of Nguyen Cong Tru KTT, a design focused on vitalizing modern apartments' communities.

Keywords: collective house community, collective house public space, community-focused, redesigning Nguyen Cong Tru KTT, sociological investigation

Procedia PDF Downloads 354
28174 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

This abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” the employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. The present study shows that, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 63
28173 Synthesis and Use of Thiourea Derivative (1-Phenyl-3-Benzoyl-2-Thiourea) for Extraction of Cadmium Ion

Authors: Abdulfattah M. Alkherraz, Zaineb I. Lusta, Ahmed E. Zubi

Abstract:

Environmental pollution by heavy metals has become more problematic nowadays. To solve the problem of cadmium accumulation in human organs, which leads to dangerous effects on human health, and to determine its concentration, the organic ligand 1-phenyl-3-benzoyl-2-thiourea was used to extract the cadmium ions from solution. This ligand, one of the thiourea derivatives, was successfully synthesized. The ligand was characterized by NMR and CHN elemental analysis, and used to extract the cadmium from its solutions by formation of a stable complex at neutral pH. The complex was characterized by elemental analysis and melting point. The concentrations of cadmium ions before and after the extraction were determined by Atomic Absorption Spectrophotometer (AAS). The data show that more than 98.7% of the cadmium used in the study was extracted.

Keywords: thiourea derivatives, cadmium extraction, water, environment

Procedia PDF Downloads 344
28172 In vitro Comparison Study of Biologically Synthesized Copper-Disulfiram Nanoparticles with Their Free Corresponding Complex as Therapeutic Approach for Breast and Liver Cancer

Authors: Marwa M. Abu-Serie, Marwa M. Eltarahony

Abstract:

The search for reliable, effective, and safe nanoparticles (NPs) as a treatment for cancer is a pressing priority. In this study, Cu-NPs were fabricated by Streptomyces cyaneofuscatus through a simultaneous bioreduction strategy of copper nitrate salt. The as-prepared Cu-NPs were subjected to structural analysis: energy-dispersive X-ray spectroscopy, elemental mapping, X-ray diffraction, transmission electron microscopy, and ζ-potential. These biologically synthesized Cu-NPs were mixed with disulfiram (DS), forming a nanocomplex of Cu-DS with a size of ~135 nm. The prepared nanocomplex (nanoCu-DS) exhibited higher anticancer activity than the free DS-Cu complex, Cu-NPs, and DS alone. This was illustrated by the lowest IC50 of nanoCu-DS (< 4 µM) against human breast and liver cancer cell lines compared with DS-Cu, Cu-NPs, and DS (~8, 22.98-33.51 and 11.95-14.86 µM, respectively). Moreover, flow cytometric analysis confirmed a higher apoptosis percentage range in nanoCu-DS-treated MDA-MB 231, MCF-7, Huh-7, and HepG-2 cells (51.24-65.28%) than with the free Cu-DS complex (< 4.5%). Regarding the inhibition potency of liver and breast cancer cell migration, no significant difference was recorded between the free complex and the nanocomplex. Furthermore, nanoCu-DS suppressed gene expression of β-catenin, Akt, and NF-κB and upregulated p53 expression (> 3, > 15, > 5 and ≥ 3 folds, respectively) more efficiently than the free complex (all ~ 1 fold) in MDA-MB 231 and Huh-7 cells. Our findings show that the prepared nanocomplex has powerful anticancer activity relative to the free complex, thereby offering a promising cancer treatment.

Keywords: biologically prepared Cu-NPs, breast cancer cell lines, liver cancer cell lines, nanoCu-disulfiram

Procedia PDF Downloads 183
28171 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from the access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey, we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, which are grouped in four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing

Procedia PDF Downloads 254
28170 Predicting Costs in Construction Projects with Machine Learning: A Detailed Study Based on Activity-Level Data

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
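
A hedged sketch of the modelling step described, a Random Forest regressor on activity-level features whose feature importances serve as cost-driver indicators; the feature names and synthetic data below are hypothetical and do not come from the study:

```python
# Illustrative only: synthetic activity-level features, a Random Forest
# regressor, and feature importances as a rough cost-driver ranking.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n = 500
X = pd.DataFrame({
    "planned_cost":        rng.uniform(1e4, 5e5, n),
    "scope_changes":       rng.poisson(1.5, n),
    "material_delay_days": rng.poisson(3.0, n),
    "crew_size":           rng.integers(2, 20, n),
})
y = 0.02 * X["planned_cost"] + 4_000 * X["scope_changes"] \
    + 1_500 * X["material_delay_days"] + rng.normal(0, 5_000, n)   # synthetic overruns

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))
print(dict(zip(X.columns, model.feature_importances_.round(3))))   # cost-driver ranking
```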

Keywords: cost prediction, machine learning, project management, random forest, neural networks

Procedia PDF Downloads 41
28169 A Machine Learning Approach for Efficient Resource Management in Construction Projects

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.

Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management

Procedia PDF Downloads 29
28168 Complex Network Analysis of Seismicity and Applications to Short-Term Earthquake Forecasting

Authors: Kahlil Fredrick Cui, Marissa Pastor

Abstract:

Earthquakes are complex phenomena, exhibiting complex correlations in space, time, and magnitude. Recently, the concept of complex networks has been used to shed light on the statistical and dynamical characteristics of regional seismicity. In this work, we study the relationships and interactions of seismic regions in Chile, Japan, and the Philippines through weighted and directed complex network analysis. Geographical areas are digitized into cells of fixed dimensions, which in turn become the nodes of the network when an earthquake has occurred therein. Nodes are linked if a correlation exists between them, as determined and measured by a correlation metric. The networks are found to be scale-free, exhibiting power-law behavior in the distributions of their different centrality measures: the in- and out-degree and the in- and out-strength. Evidence is also found of preferential interaction between seismically active regions through their degree-degree correlations, suggesting that seismicity is dictated by the activity of a few active regions. The importance of a seismic region to the overall seismicity is measured using a generalized centrality metric, taken to be an indicator of its activity or passivity. The spatial distribution of earthquake activity indicates the areas where strong earthquakes have occurred in the past, while the passivity distribution points toward the likely locations where an earthquake would occur whenever another one happens elsewhere. Finally, we propose a method that projects the location of the next possible earthquake using the generalized centralities coupled with correlations calculated between the latest earthquakes and a geographical point in the future.
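
A toy illustration of the construction described (not the authors' correlation metric): grid cells become nodes, successive events define directed weighted links, and in/out strength is read off the resulting graph:

```python
# Illustrative sketch: simple temporal succession stands in for the paper's
# correlation metric, and the event list is invented.
import networkx as nx

# (latitude, longitude, magnitude) of a toy event sequence, in time order
events = [(10.1, 120.2, 4.5), (10.3, 120.4, 5.0), (10.1, 120.2, 4.2),
          (11.0, 121.1, 6.1), (10.3, 120.4, 4.8)]

def cell(lat, lon, size=0.5):
    return (round(lat // size), round(lon // size))        # fixed-size grid cell id

G = nx.DiGraph()
for a, b in zip(events, events[1:]):                       # link consecutive events
    u, v = cell(*a[:2]), cell(*b[:2])
    w = G.get_edge_data(u, v, {"weight": 0})["weight"]
    G.add_edge(u, v, weight=w + 1)

print("in-strength:", dict(G.in_degree(weight="weight")))   # how often a cell is "hit next"
print("out-strength:", dict(G.out_degree(weight="weight"))) # how often a cell "triggers" the next
```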

Keywords: complex networks, correlations, earthquake, hazard assessment

Procedia PDF Downloads 206
28167 Copper Complex Derivative of Chalcone: Synthesis, Characterization, Electrochemical Properties and XRD/Hirshfeld Surface

Authors: Salima Tabti, Amel Djedouani, Djouhra Aggoun, Ismail Warad

Abstract:

The reaction of copper(II) with 4-hydroxy-3-[(2E)-3-(1H-indol-3-yl)prop-2-enoyl]-6-methyl-2H-pyran-2-one (HL) leads to a new complex: Cu(L)₂(DMF)₂. The crystal structure of the Cu(L)₂(DMF)₂ complex has been determined by X-ray diffraction methods. The Cu(II), lying on an inversion centre, is coordinated to six oxygen atoms forming an elongated octahedron. Additionally, the electrochemical behavior of the metal complex was investigated by cyclic voltammetry at a glassy carbon electrode (GC) in CH₃CN solution, showing the quasi-reversible redox process ascribed to the reduction of the M(II)/M(I) couple. The X-ray single crystal structure data of the complex matched excellently with the optimized monomer structure of the desired compound; Hirshfeld surface analysis supported the 3D network of intermolecular forces in the packed crystal lattice.

Keywords: chalcones, cyclic voltammetry, X-ray, Hirshfeld surface

Procedia PDF Downloads 58
28166 Simulation of Red Blood Cells in Complex Micro-Tubes

Authors: Ting Ye, Nhan Phan-Thien, Chwee Teck Lim, Lina Peng, Huixin Shi

Abstract:

In biofluid flow systems, often the flow problems of fluids of complex structures, such as the flow of red blood cells (RBCs) through complex capillary vessels, need to be considered. In this paper, we aim to apply a particle-based method, Smoothed Dissipative Particle Dynamics (SDPD), to simulate the motion and deformation of RBCs in complex micro-tubes. We first present the theoretical models, including SDPD model, RBC-fluid interaction model, RBC deformation model, RBC aggregation model, and boundary treatment model. After that, we show the verification and validation of these models, by comparing our numerical results with the theoretical, experimental and previously-published numerical results. Finally, we provide some simulation cases, such as the motion and deformation of RBCs in rectangular, cylinder, curved, bifurcated, and constricted micro-tubes, respectively.

Keywords: aggregation, deformation, red blood cell, smoothed dissipative particle dynamics

Procedia PDF Downloads 167
28165 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability concerns of every region. The features of such systems have great influence on all of the components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a certain IWMS is not in the scope of the present research. The complexity of the systems and the large number of variables require a complex approach to model the outcomes and future risks. This complex method should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, with all the interconnections. The other input data set is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop such time series by content analysis.
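
The IWMS factors, connection matrix and reconstructed time series are not given in the abstract; the sketch below only shows how an FCM propagates factor activations through a hypothetical weight matrix using the common sigmoid update rule:

```python
# Generic FCM iteration; the 3-factor weight matrix and initial state are
# hypothetical, not the paper's waste-management factors.
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# W[i, j] = influence of factor j on factor i
W = np.array([[0.0, 0.4, -0.3],
              [0.6, 0.0,  0.2],
              [0.1, 0.5,  0.0]])

state = np.array([0.5, 0.2, 0.8])          # initial activation of the factors
for _ in range(20):                        # iterate until the map settles
    state = sigmoid(state + W @ state)     # common FCM update rule
print(state.round(3))                      # converged factor activations
```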

Keywords: content analysis, factors, integrated waste management system, time series

Procedia PDF Downloads 321
28164 AI for Efficient Geothermal Exploration and Utilization

Authors: Velimir "monty" Vesselinov, Trais Kliplhuis, Hope Jasperson

Abstract:

Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, geology, etc. Machine learning algorithms can identify subtle patterns and relationships within these data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a SIML (Science-Informed Machine Learning) technology has been developed. SIML methods are different from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity, etc.). Traditional ML relies on deep and wide neural networks (NNs) based on simple algebraic mappings to represent complex processes. In contrast, the SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have a physical meaning and satisfy physics laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. Our software prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphic form.

Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal

Procedia PDF Downloads 43
28163 Harnessing Deep-Level Metagenomics to Explore the Three Dynamic One Health Areas: Healthcare, Domiciliary and Veterinary

Authors: Christina Killian, Katie Wall, Séamus Fanning, Guerrino Macori

Abstract:

Deep-level metagenomics offers a useful technical approach to explore the three dynamic One Health axes: healthcare, domiciliary and veterinary. There is currently limited understanding of the composition of complex biofilms, natural abundance of AMR genes and gene transfer occurrence in these ecological niches. By using a newly established small-scale complex biofilm model, COMBAT has the potential to provide new information on microbial diversity, antimicrobial resistance (AMR)-encoding gene abundance, and their transfer in complex biofilms of importance to these three One Health axes. Shotgun metagenomics has been used to sample the genomes of all microbes comprising the complex communities found in each biofilm source. A comparative analysis between untreated and biocide-treated biofilms is described. The basic steps include the purification of genomic DNA, followed by library preparation, sequencing, and finally, data analysis. The use of long-read sequencing facilitates the completion of metagenome-assembled genomes (MAG). Samples were sequenced using a PromethION platform, and following quality checks, binning methods, and bespoke bioinformatics pipelines, we describe the recovery of individual MAGs to identify mobile gene elements (MGE) and the corresponding AMR genotypes that map to these structures. High-throughput sequencing strategies have been deployed to characterize these communities. Accurately defining the profiles of these niches is an essential step towards elucidating the impact of the microbiota on each niche biofilm environment and their evolution.

Keywords: COMBAT, biofilm, metagenomics, high-throughput sequencing

Procedia PDF Downloads 51
28162 Complex Event Processing System Based on the Extended ECA Rule

Authors: Kwan Hee Han, Jun Woo Lee, Sung Moon Bae, Twae Kyung Park

Abstract:

ECA (Event-Condition-Action) languages are widely adopted for event processing since they are an intuitive and powerful paradigm for programming reactive systems. However, ECA rules have some limitations for the processing of complex events, such as the coupling of event producer and consumer. The objective of this paper is to propose an ECA rule pattern that addresses the current limitations of the ECA rule, and to develop a prototype system. In this paper, the conventional ECA rule is separated into three parts, and each part is extended to meet the requirements of CEP. Finally, event processing logic is established by combining the relevant elements of the three parts. The usability of the proposed extended ECA rule is validated by a test scenario in this study.
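
The three-part extension itself is not detailed in the abstract; the sketch below only illustrates the conventional ECA structure the paper starts from, with an event pattern, a condition and an action bound together in one rule (which is exactly the producer-consumer coupling the authors aim to relax):

```python
# Conventional ECA rule as a single coupled unit; field names and the example
# temperature rule are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class EcaRule:
    event_type: str                         # which event the rule reacts to
    condition: Callable[[dict], bool]       # predicate over the event payload
    action: Callable[[dict], None]          # side effect to execute

rules = [
    EcaRule(
        event_type="temperature",
        condition=lambda e: e["value"] > 80,
        action=lambda e: print(f"ALERT: sensor {e['sensor']} overheating"),
    ),
]

def dispatch(event: dict) -> None:
    for rule in rules:
        if rule.event_type == event["type"] and rule.condition(event):
            rule.action(event)

dispatch({"type": "temperature", "sensor": "S1", "value": 92})
```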

Keywords: complex event processing, ECA rule, event processing system, event-driven architecture, internet of things

Procedia PDF Downloads 527
28161 Impulsive Synchronization of Periodically Forced Complex Duffing's Oscillators

Authors: Shaban Aly, Ali Al-Qahtani, Houari B. Khenous

Abstract:

Synchronization is an important phenomenon commonly observed in nature. A system of periodically forced complex Duffing's oscillators was introduced and shown to display chaotic behavior and possess strange attractors. Such complex oscillators appear in many problems of physics and engineering, for example, nonlinear optics, deep-water wave theory, plasma physics and bimolecular dynamics. In this paper, we study the remarkable phenomenon of chaotic synchronization on these oscillator systems, using impulsive synchronization techniques. We derive analytical expressions for impulsive control functions and show that the dynamics of error evolution is globally stable, by constructing appropriate Lyapunov functions. This means that, for a relatively large set of initial conditions, the differences between the drive and response systems vanish exponentially and synchronization is achieved. Numerical results are obtained to test the validity of the analytical expressions and illustrate the efficiency of these techniques for inducing chaos synchronization in our nonlinear oscillators.
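
The paper's complex Duffing model and control gains are not given in the abstract, so the sketch below uses a standard real forced Duffing oscillator and an arbitrary gain B purely to illustrate the impulsive scheme: the response state is pulled toward the drive state at discrete impulse instants, and the error contracts when the impulses are frequent and strong enough:

```python
# Impulsive synchronization sketch on a standard forced Duffing oscillator;
# the system, gain B and impulse interval are illustrative, not the paper's.
import numpy as np
from scipy.integrate import solve_ivp

def duffing(t, y, delta=0.3, alpha=-1.0, beta=1.0, gamma=0.5, omega=1.2):
    x, v = y
    return [v, -delta * v - alpha * x - beta * x**3 + gamma * np.cos(omega * t)]

dt_impulse, B = 0.5, 0.6                           # impulse interval and gain
drive, resp = np.array([0.1, 0.0]), np.array([1.5, -0.8])
t = 0.0
for _ in range(100):                               # integrate freely between impulses
    sol_d = solve_ivp(duffing, (t, t + dt_impulse), drive, rtol=1e-8)
    sol_r = solve_ivp(duffing, (t, t + dt_impulse), resp, rtol=1e-8)
    drive, resp = sol_d.y[:, -1], sol_r.y[:, -1]
    resp += B * (drive - resp)                     # impulsive correction of the response
    t += dt_impulse

print("final synchronization error:", np.abs(drive - resp).max())
```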

Keywords: complex nonlinear oscillators, impulsive synchronization, chaotic systems, global exponential synchronization

Procedia PDF Downloads 442
28160 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks

Authors: K. Indra Gandhi

Abstract:

Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying and collection of information for further analysis. However, software development was not considered a separate entity in this process of data collection, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process since the components involved are data-driven, network-driven and application-driven in nature. This implies that there is a tremendous need for separation of concerns from the software development perspective. A layered approach to developing the data acquisition design based on Model Driven Development (MDD) has been proposed, as the sensed data collection process itself varies depending upon the application taken into consideration. This work focuses on the layered view of the data acquisition process so as to ease software development. A metamodel has been proposed that enables reusability and realization of the software development as an adaptable component for WSN systems. Further, observing users' perception indicates that the proposed model helps improve programmers' productivity by realizing the collaborative system involved.

Keywords: data acquisition, model-driven development, separation of concerns, wireless sensor networks

Procedia PDF Downloads 428
28159 Evaluation of Antioxidant Activity as a Function of the Genetic Diversity of Canna indica Complex

Authors: A. Rattanapittayapron, O. Vanijajiva

Abstract:

Canna indica is a prominent species complex in tropical and subtropical areas. The plants have become naturalized in Southeast Asia, where they have been introduced. At present, the C. indica complex comprises over a hundred hybrids, which are cultivated as commercial horticulture. The species complex contains starchy rhizomes having economic value in terms of food and herbal medicine. In addition, the bright color of the flowers makes it a valuable ornamental plant and a potential source of natural colorants. This study aims to assess the genetic diversity of four varieties of the C. indica complex based on SRAP (sequence-related amplified polymorphism) and iPBS (inter primer binding site) markers. We also examined the phytochemical characteristics and antioxidant properties of the flower extracts from four different color varieties. Results showed that despite the genetic variation, there were no significant differences in the phytochemical characteristics and antioxidant properties of the flowers. The SRAP and iPBS results agree with the more primitive traits shown by the morphological information and the phytochemical and antioxidant characteristics of the flowers. Since Canna flowers have long been used as natural colorants, and given the antioxidant activities of the ethanol extracts in this study, they are likely to be a good source of cosmetic additives.

Keywords: Canna indica, antioxidant activity, genetic diversity, SRAP, iPBS

Procedia PDF Downloads 308
28158 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers

Authors: Pietro D'Ambrosio, Roberta D'Ambrosio

Abstract:

The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and refunctionalization of the structures. Such interventions have complexities linked to the need to take into account the urban and social context in which the structure is inserted, as well as its intrinsic characteristics, such as historical and artistic value. To these, of course, we have to add the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole re-socialization project. Particular difficulties are encountered during the pre-assessment of costs, since it is often impossible to perform analytical surveys and structural tests, owing to both structural conditions and obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonably reliable economic evaluations of the interventions to be carried out. In addition, the specific features of the approach used, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to indirectly yield a shared database that can be used on a generalized basis to estimate other such projects. This makes the methodology particularly suitable in those cases where massive interventions across entire areas of historic city centers are expected. The methodology has been partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.

Keywords: evaluation, methodology, restoration, reuse

Procedia PDF Downloads 179
28157 A Tuning Method for Microwave Filter via Complex Neural Network and Improved Space Mapping

Authors: Shengbiao Wu, Weihua Cao, Min Wu, Can Liu

Abstract:

This paper presents an intelligent tuning method for microwave filters based on a complex neural network and improved space mapping. The tuning process consists of two stages: the initial tuning and the fine tuning. At the beginning of the tuning, the return loss of the filter is transferred to the passband via the error of phase. During the fine tuning, the phase shift caused by the transmission line and the higher-order modes is removed by curve fitting. Then, a Cauchy method based on the admittance parameter (Y-parameter) is used to extract the coupling matrix. The influence of the resonant cavity loss is eliminated during the parameter extraction process. Using processed data pairs (the amount of screw variation and the variation of the coupling matrix), a tuning model is established by the complex neural network. With the improved space mapping algorithm, the mapping relationship between the actual model and the ideal model is established, and the amplitude and direction of the tuning are constantly updated. Finally, the tuning experiment on an eighth-order coaxial cavity filter shows that the proposed method performs well in terms of tuning time and tuning precision.

Keywords: microwave filter, scattering parameter, coupling matrix, intelligent tuning

Procedia PDF Downloads 297
28156 Synthesis, Spectroscopic and XRD Study of Transition Metal Complex Derived from Low-Schiff Acyl-Hydrazone Ligand

Authors: Mohamedou El Boukhary, Farba Bouyagui Tamboura, A. Hamady Barry, T. Moussa Seck, Mohamed L. Gaye

Abstract:

Nowadays, low-Schiff acyl-hydrazone ligands are highly sought after due to their wide applications in various fields of biology, coordination chemistry, and catalysis. They are studied for their antioxidant, antibacterial and antiviral properties. The transition metal and lanthanide complexes derived from them are well known for their magnetic, optical, and catalytic properties. In this work, we present the synthesis of an acyl-hydrazone Schiff base (H2L) and its 3d transition metal complexes. The ligand (H2L) is characterized by IR and NMR (1H, 13C) spectroscopy. The complexes are characterized by different physico-chemical techniques such as IR, UV-visible spectroscopy, conductivity, and magnetic susceptibility measurements. The XRD study allowed us to elucidate the crystalline structure of the manganese (Mn) complex. The asymmetric unit of the complex is composed of two molecules of the ligand, one manganese(II) ion, and two coordinated chloride ions; the environment around Mn is described as a pentagonal bipyramid. In the crystal lattice, the asymmetric units are bound by hydrogen bonds.

Keywords: synthesis, acyl-hydrazone, 3d transition metal complex, application

Procedia PDF Downloads 46
28155 Comparison of Monte Carlo Simulations and Experimental Results for the Measurement of Complex DNA Damage Induced by Ionizing Radiations of Different Quality

Authors: Ifigeneia V. Mavragani, Zacharenia Nikitaki, George Kalantzis, George Iliakis, Alexandros G. Georgakilas

Abstract:

Complex DNA damage, consisting of a combination of DNA lesions such as Double Strand Breaks (DSBs) and non-DSB base lesions occurring in a small volume, is considered one of the most important biological endpoints regarding ionizing radiation (IR) exposure. Strong theoretical (Monte Carlo simulations) and experimental evidence suggests that the complexity of DNA damage, and therefore its repair resistance, increases with increasing linear energy transfer (LET). Experimental detection of complex (clustered) DNA damage is often associated with technical deficiencies limiting its measurement, especially in cellular or tissue systems. Our groups have recently made significant improvements towards the identification of key parameters relating to the efficient detection of complex DSBs and non-DSBs in human cellular systems exposed to IR of varying quality (γ-, X-rays 0.3-1 keV/μm, α-particles 116 keV/μm and 36Ar ions 270 keV/μm). The induction and processing of DSB and non-DSB oxidative clusters were measured using adaptations of immunofluorescence (γH2AX or 53BP1 foci staining as DSB probes, and the human repair enzymes OGG1 or APE1 as probes for oxidized purines and abasic sites, respectively). In the current study, Relative Biological Effectiveness (RBE) values for DSB and non-DSB induction have been measured in different human normal (FEP18-11-T1) and cancerous cell lines (MCF7, HepG2, A549, MO59K/J). The experimental results are compared to simulation data obtained using a validated microdosimetric fast Monte Carlo DNA Damage Simulation code (MCDS). Moreover, this simulation approach is implemented in two realistic clinical cases, i.e., prostate cancer treatment using X-rays generated by a linear accelerator and a pediatric osteosarcoma case using a 200.6 MeV proton pencil beam. RBE values for complex DNA damage induction are calculated for the tumor areas. These results reveal a disparity between theory and experiment and underline the necessity for implementing highly precise and more efficient experimental and simulation approaches.
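
For reference, RBE is conventionally defined at iso-effect as a dose ratio; the endpoint (here, yields of complex DSB and non-DSB clusters) is chosen per study, so the following is the generic definition rather than anything specific to these measurements:

```latex
% Standard iso-effect definition of relative biological effectiveness (RBE).
\[
  \mathrm{RBE} \;=\; \left.\frac{D_{\text{reference}}}{D_{\text{test}}}\right|_{\text{same biological effect}}
\]
% D_reference: dose of the reference radiation (typically X- or gamma-rays);
% D_test: dose of the radiation under study producing the same level of the
% chosen endpoint.
```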

Keywords: complex DNA damage, DNA damage simulation, protons, radiotherapy

Procedia PDF Downloads 316