Search results for: weighted aggregation model
Paper Count: 17194

17194 A Fuzzy Nonlinear Regression Model for Interval Type-2 Fuzzy Sets

Authors: O. Poleshchuk, E. Komarov

Abstract:

This paper presents a regression model for interval type-2 fuzzy sets based on the least squares estimation technique. Unknown coefficients are assumed to be triangular fuzzy numbers. The basic idea is to determine aggregation intervals for the type-1 fuzzy sets whose membership functions are the lower and upper membership functions of an interval type-2 fuzzy set. These aggregation intervals are called weighted intervals. The lower and upper membership functions of the input and output interval type-2 fuzzy sets in the developed regression models are considered as piecewise linear functions.
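The abstract does not spell out the construction of the weighted intervals; the sketch below is only an illustrative reading, assuming trapezoidal (piecewise-linear) lower and upper membership functions and computing α-cut intervals of the two bounding type-1 sets. All names and parameter values are hypothetical.

```python
def alpha_cut_trapezoid(a, b, c, d, alpha):
    """alpha-cut [left, right] of a trapezoidal membership function
    with support [a, d] and core [b, c]."""
    return (a + alpha * (b - a), d - alpha * (d - c))

def bounding_intervals(lower_mf, upper_mf, alphas):
    """Intervals of the two type-1 sets (lower and upper membership
    functions) that bound an interval type-2 fuzzy set -- an illustrative
    reading of the abstract, not the authors' exact 'weighted interval'
    construction."""
    return [(alpha_cut_trapezoid(*lower_mf, alpha),
             alpha_cut_trapezoid(*upper_mf, alpha)) for alpha in alphas]

# Hypothetical IT2 fuzzy set: the upper membership function is wider.
lower = (2.0, 3.0, 4.0, 5.0)   # (a, b, c, d) of the lower membership function
upper = (1.0, 2.5, 4.5, 6.0)   # (a, b, c, d) of the upper membership function
intervals = bounding_intervals(lower, upper, alphas=[0.25, 0.5, 0.75, 1.0])
```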

Keywords: interval type-2 fuzzy sets, fuzzy regression, weighted interval

Procedia PDF Downloads 363
17193 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network

Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan

Abstract:

Data aggregation is a helpful technique for reducing the data communication overhead in a wireless sensor network. One of its important tasks is the positioning of the aggregator points. Although a great deal of work has been done on data aggregation, the efficient positioning of the aggregation points has received little attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network and propose an algorithm to select the aggregator positions for a scenario in which aggregator nodes are more powerful than sensor nodes.

Keywords: aggregation point, data communication, data aggregation, wireless sensor network

Procedia PDF Downloads 150
17192 Modeling Aggregation of Insoluble Phase in Reactors

Authors: A. Brener, B. Ismailov, G. Berdalieva

Abstract:

In this paper we present a modification of the kinetic Smoluchowski equation for binary aggregation, applied to systems with chemical reactions of first and second order in which the main product is insoluble. The goal of this work is to create the theoretical foundation and engineering procedures for calculating chemical apparatuses under the joint course of chemical reactions and the aggregation of the insoluble dispersed phases formed in the working zones of the reactor.
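The abstract does not reproduce the modified equation itself; for reference, the standard discrete Smoluchowski equation for binary aggregation, with a generic source term S_k(t) representing production of insoluble k-mers by the chemical reaction, reads

\[
\frac{dn_k}{dt} \;=\; \frac{1}{2}\sum_{i+j=k} K_{i,j}\, n_i n_j \;-\; n_k \sum_{j \ge 1} K_{k,j}\, n_j \;+\; S_k(t),
\]

where \(n_k\) is the number density of clusters of size \(k\) and \(K_{i,j}\) is the aggregation kernel.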

Keywords: binary aggregation, clusters, chemical reactions, insoluble phases

Procedia PDF Downloads 301
17191 Notes on Frames in Weighted Hardy Spaces and Generalized Weighted Composition Operators

Authors: Shams Alyusof

Abstract:

This work enriches the study of frames, owing to their prominent role in pure and applied mathematics and their many applications in computer science and engineering. Recently, there have been remarkable studies of operators that preserve frames on some spaces, and this research can be considered an extension of such studies. Indeed, in this paper we characterize the weighted composition operators that preserve frames in weighted Hardy spaces on the open unit disk. Moreover, we show that this characterization does not apply to generalized weighted composition operators on such spaces. Nevertheless, this study could be extended to provide more specific characterizations.
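For reference, these are the standard definitions underlying the abstract: a sequence \(\{f_k\}\) in a Hilbert space \(H\) is a frame if there exist constants \(0 < A \le B\) such that

\[
A\,\|f\|^2 \;\le\; \sum_{k} |\langle f, f_k\rangle|^2 \;\le\; B\,\|f\|^2 \qquad \text{for all } f \in H,
\]

and on a space of analytic functions on the unit disk the weighted composition operator is \((W_{\psi,\varphi} f)(z) = \psi(z)\, f(\varphi(z))\), with the generalized weighted composition operator commonly defined as \((D^{n}_{\psi,\varphi} f)(z) = \psi(z)\, f^{(n)}(\varphi(z))\).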

Keywords: frames, generalized weighted composition operators, weighted Hardy spaces, analytic functions

Procedia PDF Downloads 117
17190 Some Results for F-Minimal Hypersurfaces in Manifolds with Density

Authors: M. Abdelmalek

Abstract:

In this work, we study hypersurfaces of constant weighted mean curvature embedded in weighted manifolds and give a condition for these hypersurfaces to be minimal. The condition is expressed through the ellipticity of the weighted Newton transformations. In particular, we prove that two compact hypersurfaces of constant weighted mean curvature embedded in space forms, intersecting in at least one point of the boundary, must be transverse. The method is based on computing the matrix of the second fundamental form at a boundary point and then the matrix associated with the Newton transformations; from this equality we obtain the weighted elementary symmetric functions on the boundary of the hypersurface. We conclude with some examples and applications; in particular, in Euclidean space we use the above result to prove the Alexandrov spherical caps conjecture in the weighted case.
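For context, a weighted manifold is a Riemannian manifold equipped with a density \(e^{-f}\), and the weighted (or f-)mean curvature of a hypersurface with unit normal \(N\) is commonly defined (up to sign convention) as

\[
H_f \;=\; H \;+\; \langle \nabla f, N \rangle,
\]

so that the hypersurface is f-minimal precisely when \(H_f = 0\).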

Keywords: weighted mean curvature, weighted manifolds, ellipticity, Newton transformations

Procedia PDF Downloads 86
17189 DNAJB6 Chaperone Prevents the Aggregation of Intracellular but not Extracellular Aβ Peptides Associated with Alzheimer’s Disease

Authors: Rasha M. Hussein, Reem M. Hashem, Laila A. Rashed

Abstract:

Alzheimer’s disease is the most common form of dementia in the elderly. It is characterized by the accumulation of extracellular amyloid β (Aβ) peptides and intracellular hyper-phosphorylated tau protein. In addition, recent evidence indicates that accumulation of intracellular amyloid β peptides may play a role in Alzheimer’s disease pathogenesis. This suggests that intracellular heat shock proteins (HSPs) that maintain protein quality control in the cell might be potential candidates for disease amelioration. DNAJB6, a member of the DNAJ family of HSPs, effectively prevented the aggregation of polyglutamine stretches associated with Huntington’s disease both in vitro and in cells. In addition, DNAJB6 was recently found to delay the aggregation of Aβ42 peptides in vitro. In the present study, we investigated the ability of DNAJB6 to prevent the aggregation of both intracellular and extracellular Aβ peptides, using transfection of HEK293 cells with Aβ-GFP and recombinant Aβ42 peptides, respectively. We performed western blotting and immunofluorescence techniques. We found that DNAJB6 can prevent Aβ-GFP aggregation, but not the seeded aggregation initiated by extracellular Aβ peptides. Moreover, DNAJB6 required interaction with HSP70 to prevent the aggregation of the Aβ-GFP protein, and its J-domain was essential for this anti-aggregation activity. Interestingly, overexpression of other DNAJ proteins as well as HSPB1 also suppressed Aβ-GFP aggregation efficiently. Our findings suggest that DNAJB6 is a promising candidate for the inhibition of Aβ-GFP-mediated aggregation through a canonical HSP70-dependent mechanism.

Keywords: Alzheimer’s disease, chaperone, DNAJB6, aggregation

Procedia PDF Downloads 506
17188 Wind Speed Prediction Using Passive Aggregation Artificial Intelligence Model

Authors: Tarek Aboueldahab, Amin Mohamed Nassar

Abstract:

Unlike conventional power plants, wind energy is a fluctuating energy source; it is therefore necessary to accurately predict short-term wind speed in order to integrate wind energy into the electricity supply structure. To do so, we present a hybrid artificial intelligence model for short-term wind speed prediction based on passive aggregation of particle swarm optimization and neural networks. As a result, a clear improvement in prediction accuracy is obtained compared with the standard artificial intelligence method.
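The abstract does not detail the passive-aggregation step; the sketch below shows only the standard particle swarm optimization kernel that such a hybrid could use to tune neural-network weights. Function and variable names, and the toy data, are hypothetical.

```python
import numpy as np

def pso_minimize(loss, dim, n_particles=30, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard PSO kernel: minimize `loss` over a `dim`-dimensional
    weight vector (e.g., flattened neural-network weights)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, dim))       # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()
    pbest_val = np.array([loss(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy example: fit a tiny linear "network" y = w0*x to made-up wind-speed data.
xs = np.array([1.0, 2.0, 3.0]); ys = np.array([2.1, 3.9, 6.2])
best_w, best_err = pso_minimize(lambda p: ((p[0] * xs - ys) ** 2).mean(), dim=1)
```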

Keywords: artificial intelligence, neural networks, particle swarm optimization, passive aggregation, wind speed prediction

Procedia PDF Downloads 443
17187 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit to failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation of the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small samples. The maximum likelihood estimation method is also applied in this study.
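The paper's fitting of the three-parameter model is not reproduced in the abstract; the sketch below only shows the standard unbiased sample estimator of the probability weighted moments β_r = E[X F(X)^r] from ordered data, on which such an estimation builds. The data values are hypothetical.

```python
import numpy as np
from math import comb

def sample_pwm(x, r):
    """Unbiased sample estimate of beta_r = E[X * F(X)^r]
    from order statistics (a standard PWM estimator)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # weight of the i-th order statistic (1-based): C(i-1, r) / C(n-1, r)
    w = np.array([comb(i - 1, r) / comb(n - 1, r) for i in range(1, n + 1)])
    return (w * x).mean()

data = [12.0, 15.5, 9.8, 20.1, 14.2, 17.3, 11.0, 13.6]   # hypothetical lifetimes
b0, b1, b2 = (sample_pwm(data, r) for r in range(3))     # moments used for fitting
```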

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 268
17186 Aggregation Scheduling Algorithms in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental readings and aggregates the data toward a designated destination called the sink node. Important issues concerning data aggregation are time efficiency and energy consumption, given the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute a minimum latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all other nodes without any collision or interference. For this problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR), have been adopted, with different power models (uniform power and non-uniform power, with or without power control) and different antenna models (omni-directional and directional). In this survey article, as the problem has been proven NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models, using latency as the performance measure.
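For reference, under the physical interference model a transmission from node \(u\) to its receiver \(v\) in a given time slot succeeds if and only if

\[
\mathrm{SINR}(u, v) \;=\; \frac{P_u \, d(u,v)^{-\alpha}}{N_0 + \sum_{w \in A \setminus \{u\}} P_w \, d(w,v)^{-\alpha}} \;\ge\; \beta,
\]

where \(P_u\) is the transmit power, \(d(\cdot,\cdot)\) the distance, \(\alpha\) the path-loss exponent, \(N_0\) the ambient noise, \(\beta\) the SINR threshold, and \(A\) the set of nodes transmitting concurrently in that slot.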

Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional

Procedia PDF Downloads 224
17185 FLEX: A Backdoor Detection and Elimination Method in Federated Scenario

Authors: Shuqi Zhang

Abstract:

Federated learning allows users to participate in collaborative model training without sending data to third-party servers, reducing the risk of user data privacy leakage, and is widely used in smart finance and smart healthcare. However, the distributed architecture of federated learning itself and the existence of secure aggregation protocols make it inherently vulnerable to backdoor attacks. To solve this problem, FLEX, a federated learning backdoor defense framework based on group aggregation, cluster analysis, and neuron pruning, is proposed, and compatibility with secure aggregation protocols is achieved. The good performance of FLEX is verified experimentally by building a horizontal federated learning framework on the CIFAR-10 dataset; it achieves a 98% backdoor detection success rate and reduces the success rate of backdoor tasks to 0%-10%.
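FLEX's exact grouping, clustering, and pruning steps are not given in the abstract; the sketch below only illustrates the general pattern of cluster-based robust aggregation that such a defense builds on: cluster the client updates, discard the minority (suspect) cluster, and average the rest. All names and the toy data are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def robust_aggregate(updates):
    """Generic illustration (not the FLEX algorithm itself):
    cluster flattened client updates into two groups, discard the
    smaller (suspected backdoored) cluster, and average the rest."""
    X = np.stack(updates)                                # one row per client update
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    keep = labels == np.bincount(labels).argmax()        # majority cluster
    return X[keep].mean(axis=0)                          # FedAvg over kept clients

# Toy example: 9 benign updates near 0, 1 anomalous update far away.
clients = [np.random.default_rng(i).normal(0, 0.1, 4) for i in range(9)]
clients.append(np.array([5.0, 5.0, 5.0, 5.0]))
global_update = robust_aggregate(clients)
```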

Keywords: federated learning, secure aggregation, backdoor attack, cluster analysis, neuron pruning

Procedia PDF Downloads 90
17184 Analytical Study of Applying the Account Aggregation Approach in E-Banking Services

Authors: A. Al Drees, A. Alahmari, R. Almuwayshir

Abstract:

Advanced information technology is becoming an important factor in the development of the financial services industry, especially the banking industry. It has introduced new ways of delivering banking to the customer, such as Internet banking. Banks began to look at electronic banking (e-banking) as a means to replace some of their traditional branch functions, using the Internet as a new distribution channel. Many consumers have more than one account, often across banks, and access these accounts using e-banking services. To see their current net-worth position, customers have to log in to each of their accounts, retrieve the details and consolidate them; this not only takes ample time but is also a repetitive activity performed at regular intervals. To address this, the account aggregation concept is offered as a solution. E-banking account aggregation, as one of the e-banking types, emerged as a way to build a stronger relationship with customers. An account aggregation service generally refers to a service that allows customers to manage their bank accounts maintained at different institutions through a common Internet banking platform, with a high concern for security and privacy. This paper presents an overview of the e-banking account aggregation approach as a new service in the e-banking field.

Keywords: e-banking, account aggregation, security, enterprise development

Procedia PDF Downloads 319
17183 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics

Authors: Tapas Acharya, Monalisa Mitra

Abstract:

Overlay modeling is the most widely used conventional analysis for spatial decision support systems. Overlay modeling requires a set of themes, with weights computed in various manners, which together give the input for further integrated analysis. In spite of its popularity, it gives inconsistent and erroneous results for the same inputs when processed with different GIS overlay techniques. This study is an attempt to compare and analyse the differences in the outputs of different overlay methods on a GIS platform, using the same set of themes for the Precambrian metamorphics, in order to obtain groundwater prospects in Precambrian metamorphic rocks. The objective of the study is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven input thematic layers, namely slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density and lithology, were used in the fuzzy overlay, weighted overlay and weighted sum overlay models to yield suitable groundwater prospective zones. Spatial concurrence analysis with the high-yielding wells of the study area and statistical comparison of the outputs of the various overlay models using RStudio reveal that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospecting zones in Precambrian metamorphic rocks.
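As a rough illustration of the weighted overlay step (not the authors' exact workflow), the sketch below rescales each thematic raster to a common 0-1 score and combines them as a weighted sum; layer names, weights and cell values are hypothetical, and each layer is assumed to be oriented so that higher values are more favorable for groundwater.

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Minimal weighted-overlay sketch: each thematic layer is rescaled
    to 0-1 and combined as a weighted sum of layers."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    score = np.zeros_like(next(iter(layers.values())), dtype=float)
    for (name, layer), w in zip(layers.items(), weights):
        layer = layer.astype(float)
        norm = (layer - layer.min()) / (layer.max() - layer.min() + 1e-12)
        score += w * norm
    return score  # higher values = better groundwater prospect

# Toy 3x3 rasters for three of the seven themes (hypothetical values,
# already oriented so that higher = more favorable).
themes = {
    "slope_suitability": np.array([[5, 3, 1], [4, 2, 1], [6, 5, 3]]),
    "lineament_density": np.array([[2, 4, 6], [1, 3, 5], [0, 2, 4]]),
    "stream_density":    np.array([[1, 1, 2], [2, 3, 3], [1, 2, 4]]),
}
prospect = weighted_overlay(themes, weights=[0.3, 0.4, 0.3])
```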

Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay

Procedia PDF Downloads 122
17182 The Use of Geographically Weighted Regression for Deforestation Analysis: Case Study in Brazilian Cerrado

Authors: Ana Paula Camelo, Keila Sanches

Abstract:

Geographically Weighted Regression (GWR) was proposed in the geography literature to allow the relationships in a regression model to vary over space. In Brazil, agricultural exploitation of the Cerrado biome is the main cause of deforestation. In this study, we propose a methodology using geostatistical methods to characterize the spatial dependence of deforestation in the Cerrado based on agricultural production indicators. The set of exploratory spatial data analysis (ESDA) tools was used, followed by confirmatory analysis with GWR. A non-spatial model was calibrated, the nature of the regression curve evaluated, variables selected by a stepwise process, and multicollinearity analysed. After evaluating the non-spatial model, the spatial regression model was processed, with statistical evaluation of the intercept and verification of its effect on calibration. In a Spearman correlation analysis, the result between deforestation and livestock was +0.783 and with soybeans +0.405. The model presented R² = 0.936 and showed a strong spatial dependence of the agricultural activity of soybeans associated with maize and cotton crops. GWR is a very effective tool, presenting results closer to the reality of deforestation in the Cerrado when compared with other analyses.
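For reference, GWR fits a separate weighted least-squares regression at each location \((u_i, v_i)\),

\[
\hat{\beta}(u_i, v_i) \;=\; \bigl( X^{\top} W(u_i, v_i)\, X \bigr)^{-1} X^{\top} W(u_i, v_i)\, y,
\]

where \(W(u_i, v_i)\) is a diagonal matrix of spatial weights; a common choice is the Gaussian kernel \(w_{ij} = \exp\!\bigl(-\tfrac{1}{2}\,(d_{ij}/b)^2\bigr)\), with \(d_{ij}\) the distance between locations \(i\) and \(j\) and \(b\) the bandwidth. The abstract does not state which kernel or bandwidth-selection rule the authors used.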

Keywords: deforestation, geographically weighted regression, land use, spatial analysis

Procedia PDF Downloads 355
17181 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework

Authors: Lutful Karim, Mohammed S. Al-kahtani

Abstract:

Sensors are used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data, so aggregating and filtering sensor data is significantly important for designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. This paper therefore introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation (PDDA) scheme for the lowest (sensor) layer. Simulation results show that PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.

Keywords: big data, clustering, tree topology, data aggregation, sensor networks

Procedia PDF Downloads 337
17180 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients

Authors: Soha A. Bahanshal, Byung G. Kim

Abstract:

Identification of patients at high risk of hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a model to predict hospital readmission for diabetic patients within 30 days of discharge. The core of the prediction model is a modified k-Nearest Neighbor algorithm called Hybrid Fuzzy Weighted k-Nearest Neighbor. The prediction is performed on a patient dataset consisting of more than 70,000 patients with 50 attributes. We applied data preprocessing using different techniques in order to handle data imbalance and to fuzzify the data to suit the prediction algorithm. The model so far achieves a classification accuracy of 80%, compared with other models that only use plain k-Nearest Neighbor.
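The authors' fuzzification and hybrid weighting are not described in the abstract; the sketch below shows only the distance-weighted kNN voting core that such a classifier extends. Feature names, the toy data, and the inverse-distance weighting are hypothetical choices.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=5):
    """Distance-weighted kNN vote (the fuzzy-membership weighting used by
    the authors is not reproduced here; this is the generic core)."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-9)              # closer neighbours vote more
    votes = {}
    for label, weight in zip(y_train[idx], w):
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)

# Toy example: 1 = readmitted within 30 days, 0 = not readmitted.
X = np.array([[50, 5], [60, 7], [45, 2], [70, 9], [55, 4]], dtype=float)
y = np.array([0, 1, 0, 1, 0])
pred = weighted_knn_predict(X, y, x=np.array([65.0, 8.0]), k=3)
```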

Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission

Procedia PDF Downloads 180
17179 A Stochastic Analytic Hierarchy Process Based Weighting Model for Sustainability Measurement in an Organization

Authors: Faramarz Khosravi, Gokhan Izbirak

Abstract:

A weighted, statistically based stochastic Analytic Hierarchy Process (AHP) model is proposed for modeling the potential barriers and enablers of sustainability and for measuring and assessing the sustainability level. For context-dependent potential barriers and enablers, the proposed model builds on the properties of the variables describing the sustainability functions and is developed into a realistic analytical model of the sustainable behavior of an organization, thus serving as a means for measuring the organization's sustainability. The main focus of this paper is the application of the AHP tool in a statistically based model for measuring sustainability; hence a strongly weighted stochastic AHP-based procedure is achieved. A case study of a widely reported major Canadian electric utility was adopted to demonstrate the applicability of the developed model, and its results were compared with those of an equal-weighted model. Variations in the sustainability of the company over time were identified as fluctuations. In the results obtained, the sustainability index for successive years changed from 73.12%, 79.02%, 74.31%, 76.65%, 80.49%, 79.81%, 79.83% to the more exact values 73.32%, 77.72%, 76.76%, 79.41%, 81.93%, 79.72%, and 80.45%, according to the priorities of factors found by expert views. By obtaining the necessary informative measurement indicators, the model can practically and effectively evaluate the sustainability of any organization and determine fluctuations in the organization over time.
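The paper's stochastic weighting is not reproduced in the abstract; the sketch below shows only the standard deterministic AHP step on which such a model builds: deriving priority weights from a pairwise comparison matrix by the row geometric-mean method and checking consistency. The comparison values are hypothetical.

```python
import numpy as np

def ahp_weights(pairwise):
    """Standard (deterministic) AHP step: priority weights from a pairwise
    comparison matrix via the row geometric-mean method, plus the
    consistency ratio. The stochastic/weighted extension proposed in the
    paper is not reproduced here."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    w = np.prod(A, axis=1) ** (1.0 / n)
    w /= w.sum()                                   # priority vector
    lam_max = (A @ w / w).mean()                   # principal eigenvalue estimate
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    cr = ((lam_max - n) / (n - 1)) / ri if ri else 0.0
    return w, cr

# Toy 3x3 comparison of three sustainability indicators.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, consistency_ratio = ahp_weights(A)
```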

Keywords: AHP, sustainability fluctuation, environmental indicators, performance measurement

Procedia PDF Downloads 114
17178 Developing a Decision-Making Tool for Prioritizing Green Building Initiatives

Authors: Tayyab Ahmad, Gerard Healey

Abstract:

Sustainability in the built environment sector is subject to many development constraints. Building projects are developed under different deliverable requirements, which makes each project unique. For an owner organization with a significant building stock, e.g., a higher-education institution, it is important to prioritize some sustainability initiatives over others in order to align sustainable building development with organizational goals. Point-based green building rating tools, i.e., Green Star, LEED and BREEAM, are becoming increasingly popular and are well acknowledged worldwide for verifying sustainable development. It is imperative to synthesize a multi-criteria decision-making tool that can capitalize on the point-based methodology of rating systems while customizing the sustainable development of building projects to the individual requirements and constraints of the client organization. A multi-criteria decision-making tool is developed for the University of Melbourne that builds on the action learning and experience of implementing green buildings there. The tool evaluates the different sustainable building initiatives within the framework of the Green Star rating tool of the Green Building Council of Australia. For each sustainability initiative, the tool makes an assessment based on at least five performance criteria, including the ease with which the initiative can be achieved and its potential to enhance project objectives, reduce life-cycle costs, enhance the University's reputation, and increase confidence in construction quality. The use of a weighted aggregation mathematical model in the proposed tool can play a considerable role in the decision-making process of a green building project by indexing the green building initiatives in terms of organizational priorities; the index value of each initiative is based on its alignment with the key performance criteria. The usefulness of the decision-making tool is validated through structured interviews with key stakeholders involved in the development of sustainable building projects at the University of Melbourne. The proposed tool helps a client organization decide which sustainability initiatives and practices are most worth pursuing within limited resources.
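As a minimal sketch of such a weighted aggregation model (criteria, weights, initiatives and scores below are hypothetical, not the authors' data), each initiative is scored against the performance criteria, the scores are multiplied by organizational priority weights, and the weighted sum gives the initiative's index:

```python
criteria_weights = {           # organizational priorities, summing to 1
    "ease_of_achievement": 0.25,
    "project_objectives": 0.20,
    "life_cycle_cost_reduction": 0.25,
    "reputation": 0.15,
    "construction_quality_confidence": 0.15,
}

initiatives = {                # 1-5 scores per criterion for each initiative
    "rooftop_pv": {"ease_of_achievement": 3, "project_objectives": 4,
                   "life_cycle_cost_reduction": 5, "reputation": 4,
                   "construction_quality_confidence": 3},
    "rainwater_harvesting": {"ease_of_achievement": 4, "project_objectives": 3,
                             "life_cycle_cost_reduction": 3, "reputation": 3,
                             "construction_quality_confidence": 4},
}

def initiative_index(scores, weights):
    """Weighted aggregation: index = sum over criteria of weight * score."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(initiatives,
                 key=lambda name: initiative_index(initiatives[name], criteria_weights),
                 reverse=True)   # initiatives ordered by organizational priority
```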

Keywords: higher education institution, multi-criteria decision-making tool, organizational values, prioritizing sustainability initiatives, weighted aggregation model

Procedia PDF Downloads 225
17177 Post-Contrast Susceptibility Weighted Imaging vs. Post-Contrast T1 Weighted Imaging for Evaluation of Brain Lesions

Authors: Sujith Rajashekar Swamy, Meghana Rajashekara Swamy

Abstract:

Although T1-weighted gadolinium-enhanced imaging (T1-Gd) has an established clinical role in diagnosing brain lesions of infectious and metastatic origin, the use of post-contrast susceptibility-weighted imaging (SWI) has been understudied. This observational study explores and compares the prominence of brain parenchymal lesions between T1-Gd and SWI-Gd images. A cross-sectional study design was used to analyze 58 patients with brain parenchymal lesions using T1-Gd and SWI-Gd scanning techniques. Our results indicated that SWI-Gd enhanced the conspicuity of metastatic as well as infectious brain lesions when compared to T1-Gd. Consequently, it can be used as an adjunct to T1-Gd for post-contrast imaging, thereby avoiding additional contrast administration. Improved conspicuity of brain lesions translates directly to better patient outcomes, and hence SWI-Gd imaging proves useful to meet that endpoint.

Keywords: susceptibility weighted, T1 weighted, brain lesions, gadolinium contrast

Procedia PDF Downloads 122
17176 Simulation of Red Blood Cells in Complex Micro-Tubes

Authors: Ting Ye, Nhan Phan-Thien, Chwee Teck Lim, Lina Peng, Huixin Shi

Abstract:

In biofluid flow systems, one often needs to consider the flow of fluids with complex structures, such as the flow of red blood cells (RBCs) through complex capillary vessels. In this paper, we aim to apply a particle-based method, Smoothed Dissipative Particle Dynamics (SDPD), to simulate the motion and deformation of RBCs in complex micro-tubes. We first present the theoretical models, including the SDPD model, the RBC-fluid interaction model, the RBC deformation model, the RBC aggregation model, and the boundary treatment model. We then show the verification and validation of these models by comparing our numerical results with theoretical, experimental and previously published numerical results. Finally, we provide some simulation cases, such as the motion and deformation of RBCs in rectangular, cylindrical, curved, bifurcated, and constricted micro-tubes.

Keywords: aggregation, deformation, red blood cell, smoothed dissipative particle dynamics

Procedia PDF Downloads 164
17175 Solvent Extraction in Ionic Liquids: Structuration and Aggregation Effects on Extraction Mechanisms

Authors: Sandrine Dourdain, Cesar Lopez, Tamir Sukhbaatar, Guilhem Arrachart, Stephane Pellet-Rostaing

Abstract:

A promising challenge in solvent extraction is to replace conventional organic solvents with ionic liquids (ILs). Depending on the extraction system, these new solvents show better efficiency than the conventional ones. Although some assumptions based on ion exchange have been proposed in the literature, these properties are not predictable because the mechanisms involved are still poorly understood. It is well established that the mechanisms underlying solvent extraction processes are based not only on the molecular chelation of the extractant molecules but also on their ability to form supra-molecular aggregates due to their amphiphilic nature. It is therefore essential to evaluate how ILs affect the aggregation properties of the extractant molecules. Our aim is to evaluate the influence of IL structure and polarity on solvent extraction mechanisms by looking at the aggregation of the extractant molecules in ILs. We compare extractant systems that are well characterized in common solvents and show, using SAXS and SANS measurements, that in the absence of IL ion-exchange mechanisms, extraction properties are related to aggregation.

Keywords: solvent extraction in Ionic liquid, aggregation, Ionic liquids structure, SAXS, SANS

Procedia PDF Downloads 151
17174 Distributed Framework for Pothole Detection and Monitoring Using Federated Learning

Authors: Ezil Sam Leni, Shalen S.

Abstract:

Transport service monitoring and upkeep are essential components of smart city initiatives. The main risks to the relevant departments and authorities are ever-increasing vehicular traffic and the condition of the roads. In India, the economy is greatly impacted by the road transport sector. In 2021, the Ministry of Road Transport and Highways, Government of India, produced a report with statistical data on traffic accidents, including the number of fatalities, injuries, and other pertinent criteria. This study proposes a distributed infrastructure for the monitoring, detection, and reporting of potholes to the appropriate authorities. In this distributed environment, the nodes are the edge devices, local edge servers, and a global server. The edge devices receive the initial model from the global server and run the YOLOv8 pothole detection model, gathering pothole images along their path and sending the updates to the nearby edge server. The local edge server selects the clients for its aggregation process, aggregates the model updates, and sends them to the global server. The global server collects the updates from the local edge servers, performs aggregation, and derives the updated model. The updated model contains the pothole information received from the local edge servers and is used to notify the local edge servers and the concerned authorities for monitoring and maintenance of road conditions. The entire process is implemented in the FedCV distributed environment using the client-server model and aggregation entities. Performance indicators and the experimentation environment are assessed, discussed, and presented. In future development of this study, accelerometer data may be taken into consideration for improved performance, in addition to the images captured from the transportation routes.

Keywords: federated Learning, pothole detection, distributed framework, federated averaging

Procedia PDF Downloads 91
17173 Econophysics: The Use of Entropy Measures in Finance

Authors: Muhammad Sheraz, Vasile Preda, Silvia Dedu

Abstract:

Concepts of econophysics are usually used to solve problems related to uncertainty and nonlinear dynamics. In the theory of option pricing, the risk-neutral probabilities play a very important role. The application of entropy in finance can be regarded as an extension of both information entropy and probability entropy; it can be an important tool in various financial methods such as risk measurement, portfolio selection, option pricing and asset pricing. Gulko applied Entropy Pricing Theory (EPT) to the pricing of stock options and introduced an alternative to the Black-Scholes framework for pricing European stock options. In this article, we present solutions to maximum entropy problems based on the Tsallis, Weighted-Tsallis, Kaniadakis, and Weighted-Kaniadakis entropies to obtain risk-neutral densities. We have also obtained the values of European call and put options in this framework.
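For reference, the unweighted entropies named in the abstract have standard forms; for a discrete distribution \(p = (p_1, \dots, p_n)\),

\[
S_q^{\mathrm{Tsallis}}(p) \;=\; \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
S_\kappa^{\mathrm{Kaniadakis}}(p) \;=\; -\sum_i \frac{p_i^{\,1+\kappa} - p_i^{\,1-\kappa}}{2\kappa},
\]

both of which recover the Shannon entropy in the limits \(q \to 1\) and \(\kappa \to 0\). The weighted variants used by the authors attach weights to the states and are not reproduced here.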

Keywords: option pricing, Black-Scholes model, Tsallis entropy, Kaniadakis entropy, weighted entropy, risk-neutral density

Procedia PDF Downloads 294
17172 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks

Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh

Abstract:

In this paper, a target parameter is estimated with the desired precision in a hierarchical wireless sensor network (WSN), while the proposed algorithm also tries to prolong the network lifetime as much as possible by using an efficient data collection algorithm. The distribution function of the target parameter is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the infrastructure needed to find the best value of the aggregation level, so as to prolong the network lifetime as much as possible while guaranteeing the desired accuracy (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length based on an M/M[x]/1/K queue model is determined and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
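The abstract does not give the sample-size rule; a standard bound of the kind such calculations use, assuming a normal approximation for the estimator of the mean, is

\[
n \;\ge\; \left( \frac{z_{\alpha/2}\,\sigma}{\varepsilon} \right)^{2},
\]

where \(\sigma\) is the (estimated) standard deviation of the sensed quantity, \(\varepsilon\) the tolerated estimation error, and \(z_{\alpha/2}\) the standard normal quantile for confidence level \(1-\alpha\); since the target distribution is assumed unknown, the authors' actual rule may differ (e.g., a Chebyshev-type bound).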

Keywords: aggregation, estimation, queuing, wireless sensor network

Procedia PDF Downloads 182
17171 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network

Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi

Abstract:

Energy, delay and bandwidth are the prime issues in a wireless sensor network (WSN). Energy usage optimization and efficient bandwidth utilization are important issues in WSNs, and event-triggered data aggregation facilitates such optimization for the event-affected area of a WSN. Reliable delivery of critical information to the sink node is also a major challenge. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) whenever an event is triggered, the event-triggering node selects the cluster head; (2) the cluster head gathers data from sensor nodes within the cluster; (3) the cluster head identifies and classifies the events from the collected data using a Bayesian classifier; (4) data are aggregated using a statistical method; (5) the cluster head discovers paths to the sink node using residual energy, path distance and bandwidth; (6) if the aggregated data is critical, the cluster head sends it over multiple paths for reliable data communication; (7) otherwise, the aggregated data is transmitted toward the sink node over the single path with the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time and energy consumed for aggregation.

Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication

Procedia PDF Downloads 445
17170 Experimental Correlation for Erythrocyte Aggregation Rate in Population Balance Modeling

Authors: Erfan Niazi, Marianne Fenech

Abstract:

Red blood cells (RBCs), or erythrocytes, tend to form chain-like aggregates called rouleaux under low shear rates. This is a reversible process, and rouleaux disaggregate at high shear rates. Therefore, RBC aggregation occurs in the microcirculation, where low shear rates are present, but does not occur under normal physiological conditions in large arteries. Numerical modeling of RBC interactions is fundamental to analytical models of blood flow in the microcirculation. Population Balance Modeling (PBM) is particularly useful for studying problems where particles agglomerate and break up in two-phase flow systems in order to find flow characteristics. In this method, the elementary particles lose their individual identity due to continuous destruction and recreation by break-up and agglomeration. The aim of this study is to find the RBC aggregation rate in a dynamic situation. A simplified PBM was previously used to find the aggregation rate from a static observation of RBC aggregation in a drop of blood under the microscope. To find the aggregation rate in a dynamic situation, we propose an experimental setup testing RBC sedimentation. In this test, RBCs interact and aggregate to form rouleaux; in this configuration, disaggregation can be neglected due to the low shear stress. A high-speed camera is used to acquire video-microscopic pictures of the process, and the sizes of the aggregates and the sedimentation velocity are extracted using image processing techniques. Based on data collected from five healthy human blood samples, the aggregation rate was estimated as 2.7×10³ (±0.3×10³) 1/s.

Keywords: red blood cell, rouleaux, microfluidics, image processing, population balance modeling

Procedia PDF Downloads 351
17169 A New Aggregation Operator for Trapezoidal Fuzzy Numbers Based On the Geometric Means of the Left and Right Line Slopes

Authors: Manju Pandey, Nilay Khare, S. C. Shrivastava

Abstract:

This paper is the final in a series, which has defined two new classes of aggregation operators for triangular and trapezoidal fuzzy numbers based on the geometrical characteristics of their fuzzy membership functions. In the present paper, a new aggregation operator for trapezoidal fuzzy numbers has been defined. The new operator is based on the geometric mean of the membership lines to the left and right of the maximum possibility interval. The operator is defined and the analytical relationships have been derived. Computation of the aggregate is demonstrated with a numerical example. Corresponding arithmetic and geometric aggregates as well as results from the recent work of the authors on TrFN aggregates have also been computed.

Keywords: LR fuzzy number, interval fuzzy number, triangular fuzzy number, trapezoidal fuzzy number, apex angle, left apex angle, right apex angle, aggregation operator, arithmetic and geometric mean

Procedia PDF Downloads 465
17168 Banking and Accounting Analysis Researches Effect on Environment

Authors: Michael Saad Thabet Azrek

Abstract:

Advanced information technology is becoming a vital element in the development of the financial services industry, in particular the banking industry. It has introduced new ways of delivering banking to the customer, such as Internet banking. Banks began to look at electronic banking (e-banking) as a means to replace a number of their conventional branch functions, using the Internet as a new distribution channel. Some customers have at least a couple of accounts across banks and access these accounts using e-banking services. To see their current net-worth position, customers have to log in to each of their accounts, retrieve the details and work on consolidation. This not only takes considerable time but is also a repetitive activity performed at a specific frequency. To address this point, an account aggregation concept is introduced as a solution. E-banking account aggregation, as one of the e-banking types, appeared as a way to build a stronger relationship with customers. An account aggregation service usually refers to a service that permits customers to manage their bank accounts maintained in different institutions through a common Internet banking platform, with high attention to security and privacy. This paper gives an overview of an e-banking account aggregation approach as a new service in the e-banking field.

Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, internet banks, modernization of banks, banks, account aggregation, security, enterprise development

Procedia PDF Downloads 24
17167 Aggregating Buyers and Sellers for E-Commerce: How Demand and Supply Meet in Fairs

Authors: Pierluigi Gallo, Francesco Randazzo, Ignazio Gallo

Abstract:

In recent years, many new and interesting models of successful online business have been developed. Many of these are based on competition between users, such as online auctions, where the product price is not fixed and tends to rise. Other models, including group buying, are based on cooperation between users and are characterized by a dynamic product price that tends to go down. There is not yet a business model in which both sellers and buyers are grouped in order to negotiate on a specific product or service. The present study investigates a new extension of the group-buying model, called a fair, which allows aggregation of demand and supply for price optimization in a cooperative manner. Additionally, our system also aggregates products and destinations for shipping optimization. We introduced the following new input parameters in order to implement a double-sided aggregation: (a) price-quantity curves provided by the seller; (b) waiting time, that is, the longer buyers wait, the greater the discount they get; (c) payment time, which determines whether the buyer pays before, during or after receiving the product; (d) the distance between the place where products are available and the place of shipment, provided in advance by the buyer or dynamically suggested by the system. To analyze the proposed model we implemented a system prototype and a simulator that allow studying the effects of changing some input parameters. We analyzed the dynamic price model in fairs with a single seller and with a combination of selected sellers. The results are very encouraging and motivate further investigation on this topic.

Keywords: auction, aggregation, fair, group buying, social buying

Procedia PDF Downloads 288
17166 Measuring Housing Quality Using Geographic Information System (GIS)

Authors: Silvija Šiljeg, Ante Šiljeg, Ivan Marić

Abstract:

Measuring housing quality is done at both objective and subjective levels using different indicators. In this research, five urban and housing indicators formed from 58 variables across different housing domains were used. The aims of the research were to measure housing quality using a GIS approach and to detect critical points of housing in the example of the Croatian coastal town of Zadar. The purposes of GIS in the research are to generate models of housing quality indexes by standardisation and aggregation of variables and to examine the accuracy of the housing quality index model. The accuracy analysis was done on the example of a variable referring to the availability of educational facilities. By defining weighting coefficients and using different GIS methods, high, medium and low housing quality zones were determined. The results obtained can be of use to town planners, spatial planners and town authorities in generating decisions, guidelines and spatial interventions.

Keywords: housing quality, GIS, housing quality index, indicators, models of housing quality

Procedia PDF Downloads 292
17165 Multi-Criteria Goal Programming Model for Sustainable Development of India

Authors: Irfan Ali, Srikant Gupta, Aquil Ahmed

Abstract:

Every country needs sustainable development (SD) for its economic growth, achieved by forming suitable policies and initiative programs for the development of the country's different sectors. This paper comprises the modeling and optimization of different sectors of India, which together form a multi-criteria model. We developed a fractional goal programming (FGP) model that helps provide an efficient allocation of resources while simultaneously achieving the sustainability goals for gross domestic product (GDP), electricity consumption (EC) and greenhouse gas (GHG) emissions by the year 2030. A weighted FGP model is also presented to obtain varying solutions according to the priorities set by the policy maker for achieving the future goals of GDP growth, EC and GHG emissions. The presented models provide useful insight for decision makers implementing strategies in the different sectors.
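The abstract does not state the model explicitly; as a generic reference, a weighted goal programming formulation of the kind the paper builds on is

\[
\min \;\; \sum_{k} w_k \left( d_k^{-} + d_k^{+} \right)
\quad \text{subject to} \quad
f_k(x) + d_k^{-} - d_k^{+} = g_k, \qquad d_k^{-}, d_k^{+} \ge 0, \qquad x \in X,
\]

where \(g_k\) is the target for goal \(k\) (GDP, EC or GHG emissions), \(d_k^{-}\) and \(d_k^{+}\) are the under- and over-achievement deviations, and \(w_k\) are the policy maker's priority weights; in the fractional variant each objective \(f_k\) is a ratio of linear functions, and in the fuzzy variant the goals are expressed through membership functions rather than crisp targets.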

Keywords: sustainable and economic development, multi-objective fractional programming, fuzzy goal programming, weighted fuzzy goal programming

Procedia PDF Downloads 220