Search results for: real estate prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7271

3431 UWB Channel Estimation Using an Efficient Sub-Nyquist Sampling Scheme

Authors: Yaacoub Tina, Youssef Roua, Radoi Emanuel, Burel Gilles

Abstract:

Recently, low-complexity sub-Nyquist sampling schemes based on the Finite Rate of Innovation (FRI) theory have been introduced to sample parametric signals at minimum rates. The multichannel modulating waveforms (MCMW) scheme is one such efficient approach, in which the received signal is mixed with an appropriate set of arbitrary waveforms, integrated, and sampled at rates far below the Nyquist rate. In this paper, the MCMW scheme is adapted to the special case of ultra wideband (UWB) channel estimation, characterized by dense multipath. First, an appropriate structure, which accounts for the bandpass spectrum feature of UWB signals, is defined. Then, a novel approach to decrease the number of processing channels and reduce the complexity of this sampling scheme is presented. Finally, the proposed concepts are validated by simulation results, obtained with real filters, in the framework of a coherent Rake receiver.

Keywords: coherent rake receiver, finite rate of innovation, sub-nyquist sampling, ultra wideband

Procedia PDF Downloads 256
3430 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency

Authors: Fanqiang Kong, Chending Bian

Abstract:

In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint-sparsity is the first property of the abundances, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank-deficiency, whereby the number of endmembers participating in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms the state-of-the-art algorithms with better spectral unmixing accuracy.
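
A minimal sketch (not the authors' solver) of the combined objective described above: data fidelity plus an l2,p mixed norm for joint row sparsity plus a nuclear norm for rank deficiency. The variable names (Y, A, X) and the regularization weights are illustrative assumptions.

```python
import numpy as np

def unmixing_objective(Y, A, X, lam_sparse=0.1, lam_rank=0.1, p=0.5):
    """Y: bands x pixels data, A: bands x library spectral library,
    X: library x pixels abundance matrix to be estimated."""
    fidelity = 0.5 * np.linalg.norm(Y - A @ X, "fro") ** 2
    # l2,p mixed norm: l2 norm of each abundance row (one row per library member),
    # raised to the p-th power and summed -- promotes joint (row-wise) sparsity
    row_l2 = np.linalg.norm(X, axis=1)
    l2p = np.sum(row_l2 ** p)
    # nuclear norm (sum of singular values) -- promotes a low-rank abundance matrix
    nuclear = np.sum(np.linalg.svd(X, compute_uv=False))
    return fidelity + lam_sparse * l2p + lam_rank * nuclear
```

A variable-splitting/augmented-Lagrangian scheme, as in the paper, would then minimize this objective by alternating proximal updates on the split variables.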

Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation

Procedia PDF Downloads 261
3429 Project Management at University: Towards an Evaluation Process around Cooperative Learning

Authors: J. L. Andrade-Pineda, J.M. León-Blanco, M. Calle, P. L. González-R

Abstract:

Enrollment in current Master's degree programs usually pursues the expertise required in real-life workplaces. The experience we present here concerns the learning process of "Project Management Methodology (PMM)", built around a cooperative/collaborative mechanism aimed at affording students measurable learning goals and providing the teacher with the ability to focus on the weaknesses detected. We have designed a mixed summative/formative evaluation, which ensures engagement with the curriculum while enriching the comprehension of PMM key concepts. In this experience, we turned the students into active actors in the evaluation process itself, and we endowed ourselves as teachers with a flexible process in which, along with qualifications (scores), other attitudinal feedback arises. Despite the high level of self-affirmation in their discussions within the interactive assessment sessions, the students ultimately exhibited a great ability to review and correct wrong reasoning when that was the case.

Keywords: cooperative-collaborative learning, educational management, formative-summative assessment, leadership training

Procedia PDF Downloads 169
3428 A Simulative Approach for JIT Parts-Feeding Policies

Authors: Zhou BingHai, Fradet Victor

Abstract:

Lean philosophy follows the simple principle of “creating more value with fewer resources”. In accordance with this policy, material handling can be managed by means of Kanban, which, by triggering each feeding tour only when needed, regulates the flow of material in one of the most efficient ways. This paper focuses on the Kanban supermarket’s parameters and their optimization from a purely cost-based point of view. The number and size of forklifts, as well as the size of the containers they carry, are the variables of a cost function which includes handling costs, inventory costs, and also shortage costs. With an innovative computational approach encoded in the industrial engineering software Tecnomatix and reproducing real-life conditions, a fictive assembly line is established and produces a random list of orders. Multiple scenarios are then run to study the impact of each change of parameter and the variation of costs it implies. Lastly, the financially best-performing scenarios are selected.

Keywords: Kanban, supermarket, parts-feeding policies, multi-scenario simulation, assembly line

Procedia PDF Downloads 195
3427 Spatial Correlation of Channel State Information in Real Long Range Measurement

Authors: Ahmed Abdelghany, Bernard Uguen, Christophe Moy, Dominique Lemur

Abstract:

The Internet of Things (IoT) is developed to ensure monitoring and connectivity within different applications. Thus, it is critical to study the channel propagation characteristics in Low Power Wide Area Networks (LPWAN), especially Long Range Wide Area Network (LoRaWAN). In this paper, an in-depth investigation of the reciprocity between the uplink and downlink Channel State Information (CSI) is performed through an outdoor measurement campaign in the area of Campus Beaulieu in Rennes. At each location, the CSI reciprocity is quantified using the Pearson Correlation Coefficient (PCC), which shows a very high linear correlation between the uplink and downlink CSI. This reciprocity feature could be utilized for physical layer security between the node and the gateway. On the other hand, most of the CSI shapes from different locations are highly uncorrelated from each other. Hence, a significant localization gain can be anticipated by exploiting frequency hopping in LoRa systems to gain access to a wider band.
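
A minimal sketch of the reciprocity check described above: the Pearson Correlation Coefficient between uplink and downlink CSI magnitudes at one location. The CSI vectors here are synthetic placeholders, not the campaign measurements.

```python
import numpy as np

def pearson_correlation(uplink_csi, downlink_csi):
    u = np.asarray(uplink_csi, dtype=float)
    d = np.asarray(downlink_csi, dtype=float)
    return np.corrcoef(u, d)[0, 1]  # PCC in [-1, 1]

# Example with synthetic, highly reciprocal CSI (downlink = uplink + small noise)
rng = np.random.default_rng(0)
uplink = rng.normal(size=64)
downlink = uplink + 0.05 * rng.normal(size=64)
print(f"PCC = {pearson_correlation(uplink, downlink):.3f}")  # close to 1 -> reciprocal channel
```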

Keywords: IoT, LPWAN, LoRa, effective signal power, onsite measurement

Procedia PDF Downloads 162
3426 A Packet Loss Probability Estimation Filter Using Most Recent Finite Traffic Measurements

Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang

Abstract:

A packet loss probability (PLP) estimation filter with a finite memory structure is proposed to estimate the packet rate mean and variance of the input traffic process in real time while removing undesired system and measurement noises. The proposed PLP estimation filter is developed under a weighted least squares criterion using only the finite traffic measurements on the most recent window. The proposed PLP estimation filter is shown to have several inherent properties such as unbiasedness, deadbeat response, and robustness. A guideline for choosing an appropriate window length is described, since it can significantly affect the estimation performance. Using computer simulations, the proposed PLP estimation filter is shown to be superior to the Kalman filter for the temporarily uncertain system. One possible explanation for this is that the proposed PLP estimation filter can have a greater convergence time of the filtered estimate as the window length M decreases.
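
A minimal sketch of the finite-memory (sliding-window) weighted least squares idea: only the most recent M traffic measurements are used, so old data and temporary modeling errors are forgotten. The weights, window length, and sample values are illustrative assumptions, not the paper's exact filter gains.

```python
import numpy as np

def finite_memory_wls(measurements, M, weights=None):
    window = np.asarray(measurements[-M:], dtype=float)  # most recent M samples only
    if weights is None:
        weights = np.ones_like(window)                   # plain LS if unweighted
    w = np.asarray(weights, dtype=float)
    mean_hat = np.sum(w * window) / np.sum(w)            # WLS estimate of the packet-rate mean
    var_hat = np.sum(w * (window - mean_hat) ** 2) / np.sum(w)
    return mean_hat, var_hat

rates = [100, 103, 98, 250, 101, 99, 102]   # packet-rate samples with one temporary burst
print(finite_memory_wls(rates, M=3))        # the earlier burst has already left the window
```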

Keywords: packet loss probability estimation, finite memory filter, infinite memory filter, Kalman filter

Procedia PDF Downloads 674
3425 Natural Emergence of a Core Structure in Networks via Clique Percolation

Authors: A. Melka, N. Slater, A. Mualem, Y. Louzoun

Abstract:

Networks are often presented as containing a “core” and a “periphery.” The existence of a core suggests that some vertices are central and form the skeleton of the network, to which all other vertices are connected. An alternative view of graphs is through communities. Multiple measures have been proposed for dense communities in graphs, the most classical being k-cliques, k-cores, and k-plexes, all presenting groups of tightly connected vertices. We here show that the edge number thresholds for such communities to emerge and for their percolation into a single dense connectivity component are very close, in all networks studied. These percolating cliques produce a natural core and periphery structure. This result is generic and is tested in configuration models and in real-world networks. This is also true for k-cores and k-plexes. Thus, the emergence of this connectedness among communities leading to a core is not dependent on some specific mechanism but a direct result of the natural percolation of dense communities.
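
A minimal sketch of the k-clique percolation step described above, using networkx on a random graph with an edge count above the 3-clique percolation threshold; the overlapping clique communities merge into one large candidate "core". Graph size and edge count are illustrative.

```python
import networkx as nx
from networkx.algorithms.community import k_clique_communities

G = nx.gnm_random_graph(n=200, m=1200, seed=1)   # dense enough for 3-cliques to percolate
communities = list(k_clique_communities(G, k=3)) # overlapping 3-clique communities
largest = max(communities, key=len) if communities else set()
print(f"{len(communities)} clique communities; "
      f"largest (candidate core) covers {len(largest)}/{G.number_of_nodes()} vertices")
```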

Keywords: cliques, core structure, percolation, phase transition

Procedia PDF Downloads 171
3424 Integrative Transcriptomic Profiling of NK Cells and Monocytes: Advancing Diagnostic and Therapeutic Strategies for COVID-19

Authors: Salma Loukman, Reda Benmrid, Najat Bouchmaa, Hicham Hboub, Rachid El Fatimy, Rachid Benhida

Abstract:

In this study, we used integrated transcriptomic datasets from the GEO repository to investigate immune dysregulation in COVID-19. In this context, we focused on NK cell and CD14+ monocyte gene expression, considering datasets GSE165461 and GSE198256, respectively. Other datasets with PBMCs, lung, olfactory and sensory epithelium, and lymph were used to provide robust validation for our results. This approach gave an integrated view of the immune responses in COVID-19, pointing out a set of potential biomarkers and therapeutic targets with special regard to standard physiological conditions. IFI27, MKI67, CENPF, MBP, HBA2, TMEM158, THBD, HBA1, LHFPL2, SLA, and AC104564.3 were identified as key genes from our analysis, involved in critical biological processes related to inflammation, immune regulation, oxidative stress, and metabolism. Such processes are important in understanding the heterogeneous clinical manifestations of COVID-19, from acute to long-term effects now known as 'long COVID'. Subsequent validation with additional datasets consolidated these genes as robust biomarkers with an important role in the diagnosis of COVID-19 and the prediction of its severity. Moreover, their enrichment in key pathophysiological pathways presented them as potential targets for therapeutic intervention. The results provide insight into the molecular dynamics of COVID-19 driven by cells such as NK cells and monocytes. Thus, this study constitutes a solid basis for targeted diagnostic and therapeutic development and makes relevant contributions to ongoing research efforts toward better management and mitigation of the pandemic.

Keywords: SARS-CoV-2, RNA-seq, biomarkers, severity, long COVID-19, bioanalysis

Procedia PDF Downloads 12
3423 Implementation of a Low-Cost Instrumentation for an Open Cycle Wind Tunnel to Evaluate Pressure Coefficient

Authors: Cristian P. Topa, Esteban A. Valencia, Victor H. Hidalgo, Marco A. Martinez

Abstract:

Wind tunnel experiments on aerodynamic profiles offer numerous advantages, such as clean steady laminar flow, controlled environmental conditions, streamline visualization, and real data acquisition. However, the experimental instrumentation is usually expensive, and hence each test implies an increase in design cost. The aim of this work is to select and implement a low-cost static pressure data acquisition system for a NACA 2412 airfoil in an open cycle wind tunnel. This work compares the wind tunnel experiment with a Computational Fluid Dynamics (CFD) simulation and a parametric analysis. The experiment was evaluated at a Reynolds number of 1.65e5, with the angle of attack increasing from -5° to 15°. The comparison shows good agreement between the experiment and CFD, while the parametric analysis results differ widely from the other methods, which is consistent with the lack of accuracy of the latter approach due to its simplicity.
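
A minimal sketch of the pressure-coefficient computation behind such static-tap instrumentation, Cp = (p - p_inf) / (0.5 * rho * V_inf^2). The numerical values below are illustrative, not the paper's measurements.

```python
def pressure_coefficient(p_tap, p_inf, rho, v_inf):
    q_inf = 0.5 * rho * v_inf ** 2        # freestream dynamic pressure [Pa]
    return (p_tap - p_inf) / q_inf

# Example: a static tap reading 40 Pa below freestream at 10 m/s in air
print(pressure_coefficient(p_tap=101285.0, p_inf=101325.0, rho=1.2, v_inf=10.0))  # about -0.67
```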

Keywords: wind tunnel, low cost instrumentation, experimental testing, CFD simulation

Procedia PDF Downloads 180
3422 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission

Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong

Abstract:

Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients’ information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients, primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address the issue, this paper applies an error correction code (ECC), namely the (8, 4) Hamming code, to an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and support real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation, while maintaining 100% ECC efficiency.
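
A minimal sketch of the (8, 4) extended Hamming code mentioned above: 4 data bits become an 8-bit codeword (3 Hamming parity bits plus one overall parity bit), allowing correction of any single bit error. This is a plain-Python reference for the coding scheme, not the paper's GPU kernel.

```python
def hamming84_encode(d):                       # d = [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                          # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4                          # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4                          # covers positions 4, 5, 6, 7
    cw = [p1, p2, d1, p4, d2, d3, d4]
    cw.append(sum(cw) % 2)                     # overall parity -> 8th bit
    return cw

def hamming84_decode(cw):
    c = list(cw[:7])
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]             # recompute the three parity checks
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4            # 1-based position of a single error
    if syndrome:
        c[syndrome - 1] ^= 1                   # correct the flipped bit
    return [c[2], c[4], c[5], c[6]]            # extract d1..d4

word = [1, 0, 1, 1]
cw = hamming84_encode(word)
cw[5] ^= 1                                     # inject a single-bit channel error
assert hamming84_decode(cw) == word            # the error is corrected
```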

Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU

Procedia PDF Downloads 290
3421 Quantitative Analysis of Multiprocessor Architectures for Radar Signal Processing

Authors: Deepak Kumar, Debasish Deb, Reena Mamgain

Abstract:

Radar signal processing requires high number-crunching capability. Most often this is achieved using a multiprocessor platform. Though a multiprocessor platform provides the capability of meeting real-time computational challenges, its architecture, along with the mapping of the algorithm onto the architecture, plays a vital role in using the platform efficiently. Towards this, along with standard performance metrics, a few additional metrics are defined which help in evaluating the multiprocessor platform together with the algorithm mapping. A generic multiprocessor architecture cannot suit all processing requirements. Depending on the system requirements and the type of algorithms used, the most suitable architecture for the given problem is decided. In this paper, we study different architectures and quantify the different performance metrics, which enables comparison of the architectures on their merits. We also carry out case studies of different architectures and their efficiency depending on whether parallelism is exploited in the algorithm, in the data, or in both.
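
A minimal sketch of the kind of metrics used to compare architectures: speedup, parallel efficiency, and load imbalance across processors. The per-processor timings below are illustrative placeholders, not measurements from the paper.

```python
def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    return speedup(t_serial, t_parallel) / n_procs

def load_imbalance(per_proc_busy_times):
    # ratio of the slowest processor's load to the average load (1.0 = perfectly balanced)
    avg = sum(per_proc_busy_times) / len(per_proc_busy_times)
    return max(per_proc_busy_times) / avg

busy = [9.8, 10.1, 12.4, 9.9]                   # seconds of work per processor (example)
print(speedup(40.0, max(busy)))                 # wall time is set by the slowest processor
print(efficiency(40.0, max(busy), len(busy)))
print(load_imbalance(busy))
```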

Keywords: radar signal processing, multiprocessor architecture, efficiency, load imbalance, buffer requirement, pipeline, parallel, hybrid, cluster of processors (COPs)

Procedia PDF Downloads 412
3420 Design, Synthesis and Pharmacological Investigation of Novel 2-Phenazinamine Derivatives as a Mutant BCR-ABL (T315I) Inhibitor

Authors: Gajanan M. Sonwane

Abstract:

Nowadays, the entire pharmaceutical industry is facing the challenge of increasing efficiency and innovation. The major hurdles are the growing cost of research and development and a concurrently stagnating number of new chemical entities (NCEs). Hence, the challenge is to select the most druggable targets and to search for the corresponding drug-like compounds, which must also possess specific pharmacokinetic and toxicological properties that allow them to be developed as drugs. The present research work involves the development of new anticancer heterocycles using molecular modeling techniques. Heterocycles synthesized through such a methodology are more effective, as various physicochemical parameters have already been studied and the structure has been optimized for its best fit in the receptor. Hence, on the basis of the literature survey and considering the need to develop newer anticancer agents, new phenazinamine derivatives were designed by subjecting the nucleus to molecular modeling, viz., GQSAR analysis and docking studies. Simultaneously, these designed derivatives were subjected to in silico prediction of biological activity through PASS studies and then to in silico toxicity risk assessment studies. In the PASS studies, it was found that all the derivatives exhibited a good spectrum of biological activities, confirming their anticancer potential. The toxicity risk assessment studies revealed that all the derivatives obey Lipinski’s rule. Among these series, compounds 4c, 5b and 6c were found to possess logP and drug-likeness values comparable with the standard Imatinib (used for the anticancer activity studies) and also with the standard drug methotrexate (used for the antimitotic activity studies). One of the most notable mutations is the threonine-to-isoleucine mutation at codon 315 (T315I), which is known to be resistant to all currently available TKIs. An enzyme assay is planned to confirm target-selective activity.
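
A minimal sketch of a Lipinski rule-of-five screen like the in silico drug-likeness assessment mentioned above, using RDKit. The SMILES string is a placeholder (aspirin), not one of the phenazinamine derivatives.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def lipinski_screen(smiles):
    mol = Chem.MolFromSmiles(smiles)
    violations = sum([
        Descriptors.MolWt(mol) > 500,          # molecular weight <= 500 Da
        Descriptors.MolLogP(mol) > 5,          # logP <= 5
        Lipinski.NumHDonors(mol) > 5,          # H-bond donors <= 5
        Lipinski.NumHAcceptors(mol) > 10,      # H-bond acceptors <= 10
    ])
    return violations, violations <= 1          # one violation is commonly tolerated

print(lipinski_screen("CC(=O)OC1=CC=CC=C1C(=O)O"))  # (0, True) for the aspirin placeholder
```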

Keywords: drug design, tyrosine kinases, anticancer, Phenazinamine

Procedia PDF Downloads 116
3419 An Axiomatic Approach to Constructing an Applied Theory of Possibility

Authors: Oleksii Bychkov

Abstract:

The fundamental difference between randomness and vagueness is that the former requires statistical research. These issues were studied by Zadeh, Dubois, and Prade. The theory of possibility works with expert assessments, hypotheses, etc., and gives an idea of the characteristics of the problem situation, the nature of the goals, and the real limitations. Possibility theory examines experiments that are not repeated. The article discusses issues related to the formalization of a fuzzy, uncertain idea of reality. The author proposes to expand the classical model of possibility theory by introducing a measure of necessity. The proposed model allows us to extend the measures of possibility and necessity onto a Boolean algebra while preserving the properties of the measure. Thus, upper and lower estimates are obtained to describe whether the event will occur. Knowledge of the patterns that govern mass random, uncertain, fuzzy events allows us to predict how these events will proceed. The article quite fully reveals the essence of the construction and use of probability theory and possibility theory.
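
A minimal sketch of the possibility/necessity pair that provides the upper and lower estimates described above, for a finite universe with a possibility distribution pi taking values in [0, 1]. The distribution and the event are illustrative.

```python
def possibility(event, pi):
    return max(pi[x] for x in event)                       # Pi(A) = max over A of pi(x)

def necessity(event, pi):
    complement = set(pi) - set(event)
    # N(A) = 1 - Pi(not A); the whole universe has necessity 1
    return 1.0 - (max(pi[x] for x in complement) if complement else 0.0)

pi = {"low": 0.2, "medium": 1.0, "high": 0.6}              # normalized: the maximum is 1
event = {"medium", "high"}
print(possibility(event, pi), necessity(event, pi))        # upper and lower estimates: 1.0, 0.8
```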

Keywords: possibility, artificial, modeling, axiomatics, intellectual approach

Procedia PDF Downloads 33
3418 A Deep Learning Model with Greedy Layer-Wise Pretraining Approach for Optimal Syngas Production by Dry Reforming of Methane

Authors: Maryam Zarabian, Hector Guzman, Pedro Pereira-Almao, Abraham Fapojuwo

Abstract:

Dry reforming of methane (DRM) has sparked significant industrial and scientific interest, not only as a viable alternative for addressing the environmental concerns of two main contributors to the greenhouse effect, i.e., carbon dioxide (CO₂) and methane (CH₄), but also because it produces syngas, i.e., a mixture of hydrogen (H₂) and carbon monoxide (CO), utilized by a wide range of downstream processes as a feedstock for other chemical productions. In this study, we develop an AI-enabled syngas production model to tackle the problem of achieving an equivalent H₂/CO ratio [1:1] with respect to the most efficient conversion. First, the unsupervised density-based spatial clustering of applications with noise (DBSCAN) algorithm removes outlier data points from the original experimental dataset. Then, random forest (RF) and deep neural network (DNN) models employ the error-free dataset to predict the DRM results. DNN models inherently would not be able to obtain accurate predictions without a huge dataset. To cope with this limitation, we employ approaches that reuse pre-trained layers, such as transfer learning and greedy layer-wise pretraining. Compared to the other deep models (i.e., the pure deep model and the transferred deep model), the greedy layer-wise pre-trained deep model provides the most accurate prediction, as well as accuracy similar to the RF model, with R² values of 1.00, 0.999, 0.999, 0.999, 0.999, and 0.999 for the total outlet flow, H₂/CO ratio, H₂ yield, CO yield, CH₄ conversion, and CO₂ conversion outputs, respectively.
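
A minimal sketch of greedy layer-wise pretraining with Keras: each hidden layer is first trained as an autoencoder on the previous layer's outputs, then the pretrained layers are stacked and fine-tuned on the regression target. Layer sizes, epochs, and the data are illustrative placeholders, not the paper's architecture or dataset.

```python
import numpy as np
import tensorflow as tf

X = np.random.rand(500, 6).astype("float32")     # placeholder process inputs
y = np.random.rand(500, 1).astype("float32")     # placeholder target (e.g., H2/CO ratio)

layer_sizes, pretrained, current = [32, 16], [], X
for units in layer_sizes:
    inp = tf.keras.Input(shape=(current.shape[1],))
    hidden = tf.keras.layers.Dense(units, activation="relu")(inp)
    recon = tf.keras.layers.Dense(current.shape[1])(hidden)   # reconstruct the layer's input
    autoencoder = tf.keras.Model(inp, recon)
    encoder = tf.keras.Model(inp, hidden)
    autoencoder.compile(optimizer="adam", loss="mse")
    autoencoder.fit(current, current, epochs=10, verbose=0)   # unsupervised pretraining
    pretrained.append(encoder.layers[-1])                     # keep the trained Dense layer
    current = encoder.predict(current, verbose=0)             # input for the next layer

# Stack the pretrained layers and fine-tune end-to-end on the targets
model = tf.keras.Sequential([tf.keras.Input(shape=(X.shape[1],)), *pretrained,
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)
```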

Keywords: artificial intelligence, dry reforming of methane, artificial neural network, deep learning, machine learning, transfer learning, greedy layer-wise pretraining

Procedia PDF Downloads 86
3417 The Effect of User Comments on Traffic Application Usage

Authors: I. Gokasar, G. Bakioglu

Abstract:

With unprecedented rates of technological improvement, people are starting to solve their problems with the help of technological tools. According to the application stores and websites in which people evaluate and comment on traffic apps, there are more than 100 traffic applications with different features with respect to their purpose of usage, ranging from features for public transit modes to features for private cars. This study focuses on the top 30 traffic applications, chosen with respect to their download counts. All data about the traffic applications were obtained from the related websites. The purpose of this study is to analyze traffic applications in terms of their categorical attributes by developing a regression model. The analysis results suggest that negative interpretations (e.g., being deficient) do not lead to lower star ratings of the applications. However, those negative interpretations result in a smaller increase in the star rating. In addition, women give higher star ratings than men in the evaluation of traffic applications.
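
A minimal sketch of the kind of regression described above, with categorical attributes entered as dummy variables (see the study's keywords). The data frame, column names, and values are synthetic placeholders, not the study's dataset.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "stars":              [4.5, 3.0, 4.0, 2.5, 5.0, 3.5],
    "downloads_millions": [50, 5, 20, 1, 80, 10],
    "negative_comment":   ["no", "yes", "no", "yes", "no", "yes"],
    "reviewer_gender":    ["f", "m", "m", "f", "f", "m"],
})
# dummy-code the categorical attributes (drop_first avoids the dummy trap)
X = pd.get_dummies(df[["downloads_millions", "negative_comment", "reviewer_gender"]],
                   drop_first=True, dtype=float)
X = sm.add_constant(X)
model = sm.OLS(df["stars"], X).fit()
print(model.params)   # dummy coefficients show the effect of each category on the star rating
```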

Keywords: traffic app, real–time information, traffic congestion, regression analysis, dummy variables

Procedia PDF Downloads 429
3416 In-Game Business and the Problem of Gambling: Legal Analysis of Loot Boxes from the Perspective of Iranian Law

Authors: Vesali Naseh Morteza, Najafi Mohammad Hosein

Abstract:

The possibility of trading in-game items for real money provides a high economic capacity for online games and turns them into a business model. Nowadays, the market for in-game item purchases and microtransactions or micropayments has been growing steadily. Since this market should be legal, lawyers and lawmakers around the world have expressed concerns over the legality of online gaming and in-game transactions. The issue is highlighted by the recent emergence of an in-game business model known as loot boxes. Similarities between loot box mechanics and gambling activities have started a legal debate as to whether loot boxes constitute a form of gambling and whether a game’s use of loot boxes should be regulated as such. Hence, based on the relationship between loot box purchasing and problem gambling, the paper investigates the legal effect of the newly emergent phenomenon of loot boxes on online games from the perspective of Iranian law.

Keywords: serious games, loot boxes, online gambling, in-game purchase, virtual items

Procedia PDF Downloads 107
3415 Optimizing the Window Geometry Using Fractals

Authors: K. Geetha Ramesh, A. Ramachandraiah

Abstract:

In an internal building space, daylight becomes a powerful source of illumination. The challenge, therefore, is to develop means of utilizing both direct and diffuse natural light in buildings while maintaining and improving occupants' visual comfort, particularly at greater distances from the windows admitting daylight. The geometrical features of windows in a building have a significant effect on the provision of daylight. The main goal of this research is to develop an innovative window geometry which will effectively provide the daylight component adequately, together with the internal reflected component (IRC) and also the external reflected component (ERC), if any. This involves the exploration of a light-redirecting system using fractal geometry for windows, in order to penetrate and distribute daylight more uniformly to greater depths, minimizing heat gain and glare, and also reducing building energy use substantially. Of late, the creation of fractal geometrical windows and the daylight illuminance produced by such windows has become an interesting study. The amount of daylight can change significantly based on the window geometry and sky conditions. This leads to (i) the exploration of various fractal patterns suitable for window designs, and (ii) the quantification of the effect of a chosen fractal window based on the relationship between the fractal pattern, size, orientation, and glazing properties for optimizing daylighting. There are many natural lighting applications able to predict the behaviour of light in a room through a traditional opening - a regular window. The conventional prediction methodology involves the evaluation of the daylight factor, the internal reflected component, and the external reflected component. Having evaluated the daylight illuminance level for a conventional window, the technical performance of a fractal window for optimal daylighting is to be studied and compared with that of a regular window. The methodologies involved are highlighted in this paper.

Keywords: daylighting, fractal geometry, fractal window, optimization

Procedia PDF Downloads 301
3414 Optimal-Based Structural Vibration Attenuation Using Nonlinear Tuned Vibration Absorbers

Authors: Pawel Martynowicz

Abstract:

Vibrations are a crucial problem for slender structures such as towers, masts, chimneys, wind turbines, bridges, high buildings, etc., which is why most of them are equipped with vibration attenuation or fatigue reduction solutions. In this work, a slender structure (i.e., a wind turbine tower-nacelle model) equipped with nonlinear, semiactive tuned vibration absorber(s) is analyzed. For the purposes of this study, magnetorheological (MR) dampers are used as semiactive actuators. Several optimal-based approaches to structural vibration attenuation are investigated against the standard ‘ground-hook’ law and passive tuned vibration absorber implementations. The common approach to optimal control of nonlinear systems is offline computation of the optimal solution; however, the open-loop control determined in this way suffers from a lack of robustness to uncertainties (e.g., unmodelled dynamics, perturbations of external forces or initial conditions), and thus perturbation control techniques are often used. However, proper linearization may be an issue for highly nonlinear systems with implicit relations between state, co-state, and control. The main contribution of the author is the development, as well as numerical and experimental verification, of Pontryagin maximum-principle-based vibration control concepts that produce the actuator control input directly (not the demanded force), so that the force-tracking algorithm that introduces control inaccuracy is entirely omitted. These concepts, including one-step optimal control, quasi-optimal control, and an optimal-based modified ‘ground-hook’ law, can be directly implemented in online, real-time feedback control for periodic (or semi-periodic) disturbances with invariant or time-varying parameters, as well as for non-periodic, transient or random disturbances, which is a limitation of some other known solutions. No offline calculation, assumption on excitations/disturbances, or vibration frequency determination is necessary; moreover, all of the nonlinear actuator (MR damper) force constraints, i.e., no active forces, lower and upper saturation limits, hysteresis-type dynamics, etc., are embedded in the control technique, so the solution is optimal or suboptimal for the assumed actuator, respecting its limitations. Depending on the selected method variant, a moderate or decisive reduction in the computational load is possible compared to other methods of nonlinear optimal control, while assuring the quality and robustness of the vibration reduction system, as well as considering multi-pronged operational aspects, such as possible minimization of the amplitude of the deflection and acceleration of the vibrating structure, its potential and/or kinetic energy, the required actuator force, the control input (e.g., electric current in the MR damper coil) and/or the stroke amplitude. The developed solutions are characterized by high vibration reduction efficiency – the obtained maximum values of the dynamic amplification factor are close to 2.0, while for the best of the passive systems, these values exceed 3.5.

Keywords: magnetorheological damper, nonlinear tuned vibration absorber, optimal control, real-time structural vibration attenuation, wind turbines

Procedia PDF Downloads 124
3413 Design and Implementation of an AI-Enabled Task Assistance and Management System

Authors: Arun Prasad Jaganathan

Abstract:

In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper introduces an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.

Keywords: artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization

Procedia PDF Downloads 59
3412 Simulating the Unseen: David Cronenberg’s Body Horror through Baudrillard’s Lens

Authors: Mario G. Rodriguez

Abstract:

This paper undertakes an in-depth exploration of David Cronenberg's filmography through Jean Baudrillard's theory of simulacra and simulation. Little has been written to show how Cronenberg’s cinema exemplifies Baudrillard’s conceptualization of postmodernity. The study employs Baudrillard’s historical orders of simulacra, as well as his definitions of hyperreality and simulation, to recontextualize Cronenberg’s films in an era characterized by the increasing influence of media and technology. Cronenberg's oeuvre presents a compelling canvas for examining the interplay between the real and the simulated. Through films like "Videodrome" (1983), "The Fly" (1986), and "eXistenZ" (1999), Cronenberg navigates the complex terrain of the human body, technology, and societal perceptions, echoing Baudrillard's concerns about the hyperreal and the dissolution of reality. The study concludes with a consideration of the role of "body horror" as it pertains to Baudrillard's theory. It sheds light on how fear of loss of bodily autonomy, the relationship between technology and the human body, and the intersection of science, medicine, and horror reflect the nature of hyperreality and simulation.

Keywords: Cronenberg, hyperreality, simulation, Baudrillard

Procedia PDF Downloads 69
3411 Nuclear Fuel Safety Threshold Determined by Logistic Regression Plus Uncertainty

Authors: D. S. Gomes, A. T. Silva

Abstract:

Analysis of the uncertainty quantification related to nuclear safety margins applied to a nuclear reactor is an important concept for preventing future radioactive accidents. The nuclear fuel performance code may involve a tolerance level determined by traditional deterministic models producing acceptable results at burnup cycles under 62 GWd/MTU. The behavior of nuclear fuel can be simulated by applying a series of material properties under irradiation and physics models to calculate the safety limits. In this study, theoretical predictions of nuclear fuel failure under transient conditions investigate extended radiation cycles at 75 GWd/MTU, considering the behavior of fuel rods in light-water reactors under reactivity accident conditions. The fuel pellet can melt due to the rapid increase of reactivity during a transient. Large power excursions in the reactor are the subject of interest, leading to a treatment known as the Fuchs-Hansen model. The point-kinetics neutron equations show the characteristics of non-linear differential equations. In this investigation, multivariate logistic regression is employed for a probabilistic forecast of fuel failure. A comparison of the computational simulation with experimental results showed acceptable agreement. The experiments carried out use pre-irradiated fuel rods subjected to a rapid energy pulse, which exhibits the same behavior as a nuclear accident. The propagation of uncertainty utilizes Wilks' formulation. The variables chosen as essential to failure prediction were the fuel burnup, the applied peak power, the pulse width, the oxidation layer thickness, and the cladding type.
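
A minimal sketch of the multivariate logistic regression pattern used above to produce a probabilistic forecast of fuel failure from burnup, peak power, pulse width, oxide layer thickness, and cladding type. The tiny dataset is synthetic, purely to show the fitting/prediction pattern, not the experimental data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# columns: burnup [GWd/MTU], peak power, pulse width [ms],
#          oxide layer thickness [um], cladding type (0/1 dummy)
X = np.array([[30, 60, 30, 10, 0], [45, 90, 20, 40, 0], [62, 120, 15, 70, 1],
              [70, 150, 10, 90, 1], [50, 80, 25, 30, 0], [75, 160, 9, 100, 1]])
y = np.array([0, 0, 1, 1, 0, 1])                 # 1 = rod failed in the pulse test

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
print(model.predict_proba([[65, 130, 12, 80, 1]])[0, 1])  # predicted probability of failure
```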

Keywords: logistic regression, reactivity-initiated accident, safety margins, uncertainty propagation

Procedia PDF Downloads 292
3410 Reducing Power Consumption in Network on Chip Using Scramble Techniques

Authors: Vinayaga Jagadessh Raja, R. Ganesan, S. Ramesh Kumar

Abstract:

An ever more significant fraction of the overall power dissipation of a network-on-chip (NoC) based system-on-chip (SoC) is due to the interconnection scheme. In fact, as technology shrinks, the power contribution of NoC links starts to compete with that of NoC routers. In this paper, we propose the use of clock gating in the data encoding techniques as a viable way to reduce both the power dissipation and the time consumption of NoC links. The proposed scramble scheme exploits wormhole switching techniques. That is, flits are scrambled by the network interface (NI) before they are injected into the network and are decoded by the target NI. This makes the scheme transparent to the underlying network, since the encoder and decoder logic is integrated in the NI and no modification of the routers' structural design is required. We evaluate the proposed scramble scheme on a set of representative data streams (both synthetic and extracted from real applications), showing that it is possible to reduce the power contribution of both the self-switching activity and the coupling switching activity in inter-router links.

Keywords: Xilinx 12.1, power consumption, encoder, NoC

Procedia PDF Downloads 400
3409 Study on the Dynamic Characteristics Change of Welded Beam Due to Vibration Aging

Authors: S. H. Bae, D. W. Cho, W. B. Jeong, J. R. Cho

Abstract:

Fatigue fracture of an aluminum welded structure is a phenomenon that frequently originates from pores in a weld. In order to grasp the state of the welded structure in operation in real time, the acceleration signal of the structure is measured. Here, the vibration characteristics of the signal under fatigue load are an important parameter for state diagnosis. This paper presents an experimental study on the variation of the vibration characteristics of welded beams with vibration aging (especially bending vibration). First, simple beams were produced according to different welding conditions. Each beam was vibrated, and its PSD (power spectral density) was measured according to the degree of aging. Modal testing was also conducted to compare the transfer functions of the welded beams. The test results show that the natural frequencies of the beams changed with vibration aging due to the change of stiffness in the welded part, and this stiffness was estimated by the finite element method.
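
A minimal sketch of the PSD estimate used to track such vibration characteristics: Welch's method applied to a measured (here, synthetic) acceleration signal. A shift of the spectral peak with aging would indicate a change of natural frequency; the sampling rate and mode frequency below are illustrative.

```python
import numpy as np
from scipy.signal import welch

fs = 2048.0                                       # sampling rate [Hz]
t = np.arange(0, 5, 1 / fs)
accel = np.sin(2 * np.pi * 180 * t) + 0.3 * np.random.randn(t.size)  # 180 Hz mode + noise

freqs, psd = welch(accel, fs=fs, nperseg=4096)    # averaged periodogram (Welch PSD)
print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.1f} Hz")
```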

Keywords: modal testing, natural frequency, vibration aging, welded structure

Procedia PDF Downloads 483
3408 Designing a Method to Control and Determine the Financial Performance of the Real Cost Sub-System in the Information Management System of Construction Projects

Authors: Alireza Ghaffari, Hassan Saghi

Abstract:

Project management is more complex than managing the day-to-day affairs of an organization. When the project dimensions are broad and multiple projects have to be monitored in different locations, integrated management becomes even more complicated. One of the main concerns of project managers is integrated project management, a concern mainly rooted in the lack of accurate and accessible information from different projects in various locations. The collection of dispersed information from various parts of the network, its integration, and finally the selective reporting of this information are among the goals of integrated information systems. This can help resolve the main problem, which is bridging the information gap between executives and senior managers in the organization. Therefore, the main objective of this study is to design and implement an important subset of a project management information system in order to successfully control the cost of construction projects, so that its results can be used to design raw software forms and the proposed relationships between different project units for the collection of the necessary information.

Keywords: financial performance, cost subsystem, PMIS, project management

Procedia PDF Downloads 109
3407 A Stable Method for Determination of the Number of Independent Components

Authors: Yuyan Yi, Jingyi Zheng, Nedret Billor

Abstract:

Independent component analysis (ICA) is one of the most commonly used blind source separation (BSS) techniques for signal pre-processing, such as noise reduction and feature extraction. The main parameter in the ICA method is the number of independent components (ICs). Although there have been several methods for the determination of the number of ICs, sufficient attention has not been given to this important parameter. In this study, we review the most used methods for determining the number of ICs and provide their advantages and disadvantages. Further, we propose an improved version of the column-wise ICAByBlock method for the determination of the number of ICs. To assess the performance of the proposed method, we compare the column-wise ICAByBlock with several existing methods across different ICA methods by using simulated and real signal data. Results show that the proposed column-wise ICAByBlock is an effective and stable method for determining the optimal number of components in ICA. This method is simple, and the results can be demonstrated intuitively with good visualizations.
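
A simplified, hedged illustration of the block-replication idea behind such column-wise methods (not the authors' exact algorithm): run FastICA separately on two column (channel) blocks of the data and count how many recovered sources from one block have a strongly correlated counterpart in the other; components that replicate across blocks are treated as genuine. Thresholds and the synthetic data are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

def replicated_components(X, n_candidates, threshold=0.8, seed=0):
    """X: samples x channels. Split the channels into two blocks, run ICA on each,
    and count sources recovered (|corr| > threshold) from both blocks."""
    mid = X.shape[1] // 2
    s1 = FastICA(n_components=n_candidates, random_state=seed).fit_transform(X[:, :mid])
    s2 = FastICA(n_components=n_candidates, random_state=seed).fit_transform(X[:, mid:])
    corr = np.abs(np.corrcoef(s1.T, s2.T))[:n_candidates, n_candidates:]
    return int(np.sum(corr.max(axis=1) > threshold))

# Synthetic example: 3 true sources mixed into 10 noisy channels
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(7 * t), np.sign(np.sin(3 * t)), rng.laplace(size=t.size)]
X = S @ rng.normal(size=(3, 10)) + 0.05 * rng.normal(size=(t.size, 10))
print(replicated_components(X, n_candidates=5))   # expected to be close to 3
```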

Keywords: independent component analysis, optimal number, column-wise, correlation coefficient, cross-validation, ICAByblock

Procedia PDF Downloads 99
3406 Evaluation of Robust Feature Descriptors for Texture Classification

Authors: Jia-Hong Lee, Mei-Yi Wu, Hsien-Tsung Kuo

Abstract:

Texture is an important characteristic in real and synthetic scenes. Texture analysis plays a critical role in inspecting surfaces and provides important techniques in a variety of applications. Although several descriptors have been presented to extract texture features, the development of object recognition is still a difficult task due to the complex aspects of texture. Recently, many robust and scaling-invariant image features such as SIFT, SURF, and ORB have been successfully used in image retrieval and object recognition. In this paper, we compare the performance of these feature descriptors for texture classification using k-means clustering. Different classifiers, including k-NN, Naive Bayes, Back Propagation Neural Network, Decision Tree, and KStar, were applied to three texture image sets - UIUCTex, KTH-TIPS, and Brodatz, respectively. Experimental results reveal SIFT as having the best average accuracy rate on the UIUCTex and KTH-TIPS sets, while SURF has the advantage on the Brodatz texture set. The BP neural network works best in test-set classification among all the classifiers used.
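
A minimal sketch of the evaluation pipeline described above: SIFT descriptors quantized with k-means into a bag-of-visual-words histogram per image, then a simple classifier (k-NN here). The image lists, vocabulary size, and classifier choice are placeholders; OpenCV with SIFT support and scikit-learn are assumed.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def sift_descriptors(gray_images):
    sift = cv2.SIFT_create()
    return [sift.detectAndCompute(img, None)[1] for img in gray_images]

def bovw_histograms(desc_list, kmeans):
    hists = []
    for desc in desc_list:
        words = kmeans.predict(desc)                  # assign each descriptor to a visual word
        hists.append(np.bincount(words, minlength=kmeans.n_clusters).astype(float))
    return np.array(hists)

# train_imgs / test_imgs: lists of grayscale texture images (e.g., from Brodatz)
def train_and_score(train_imgs, train_labels, test_imgs, test_labels, vocab_size=200):
    train_desc = sift_descriptors(train_imgs)
    kmeans = KMeans(n_clusters=vocab_size, n_init=10).fit(np.vstack(train_desc))
    clf = KNeighborsClassifier(n_neighbors=3).fit(bovw_histograms(train_desc, kmeans),
                                                  train_labels)
    return clf.score(bovw_histograms(sift_descriptors(test_imgs), kmeans), test_labels)
```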

Keywords: texture classification, texture descriptor, SIFT, SURF, ORB

Procedia PDF Downloads 369
3405 Decision Support Tool for Green Roofs Selection: A Multicriteria Analysis

Authors: I. Teotónio, C.O. Cruz, C.M. Silva, M. Manso

Abstract:

Diverse stakeholders show different concerns when choosing green roof systems. Green roof solutions also vary in their cost and performance. Therefore, decision-makers continually face the difficult task of balancing benefits against green roof costs. Decision analysis methods, such as multicriteria analysis, can be used when the decision-making process includes different perspectives, multiple objectives, and uncertainty. The present study adopts a multicriteria decision model to evaluate the installation of green roofs in buildings, determining the solution with the best trade-off between costs and benefits in agreement with the preferences of the users/investors. This methodology was applied to a real decision problem, assessing the preferences between different green roof systems in an existing building in Lisbon. This approach supports the decision-making process on green roofs and enables robust and informed decisions on urban planning while optimizing building retrofitting.
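
A minimal sketch of a weighted-sum multicriteria score for comparing green roof systems: criteria are normalized to [0, 1] and weighted by stakeholder preferences. The alternatives, criterion values, and weights below are illustrative, not the Lisbon case study data.

```python
import numpy as np

alternatives = ["extensive", "semi-intensive", "intensive"]
#                  cost  thermal  runoff  biodiversity   (all normalized so higher = better)
scores = np.array([[0.9,  0.5,    0.4,    0.3],
                   [0.6,  0.7,    0.7,    0.6],
                   [0.2,  0.9,    0.9,    0.9]])
weights = np.array([0.4, 0.25, 0.2, 0.15])     # stakeholder preference weights, summing to 1

totals = scores @ weights                       # weighted-sum score per alternative
best = alternatives[int(np.argmax(totals))]
print(dict(zip(alternatives, totals.round(3))), "->", best)
```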

Keywords: decision making, green roofs, investors preferences, multicriteria analysis, sustainable development

Procedia PDF Downloads 184
3404 The Design of Acoustic Horns for Ultrasonic Aided Tube Double Side Flange Making

Authors: Kuen-Ming Shu, Jyun-Wei Chen

Abstract:

Encapsulated O-rings are specifically designed to address the problem of sealing the most hostile chemicals and extreme-temperature applications. Ultrasonic vibration hot embossing and ultrasonic welding techniques provide a fast and reliable method to fabricate encapsulated O-rings. This paper presents the design and analysis of acoustic horns with double extrusion to process both sides of a tube flange simultaneously. The paper studies, through the Finite Element Method (FEM), the ultrasonic stepped horn used to process an encapsulated O-ring; the theoretical dimensions of the horns and their natural frequencies and amplitudes are obtained through simulations in COMSOL software. Furthermore, real horns were fabricated, tested, and verified to prove the practical utility of these horns.

Keywords: encapsulated O-rings, ultrasonic vibration hot embossing, flange making, acoustic horn, finite element analysis

Procedia PDF Downloads 318
3403 Sensitivity Analysis of the Thermal Properties in Early Age Modeling of Mass Concrete

Authors: Farzad Danaei, Yilmaz Akkaya

Abstract:

In many civil engineering applications, especially in the construction of large concrete structures, the early-age behavior of concrete has been shown to be a crucial problem. The uneven rise in temperature within the concrete in these constructions is the fundamental issue for quality control. Therefore, developing accurate and fast temperature prediction models is essential. The thermal properties of concrete fluctuate over time as it hardens, but taking all of these fluctuations into account makes numerical models more complex. Experimental measurement of the thermal properties under laboratory conditions also cannot accurately predict the variance of these properties under site conditions. Therefore, the specific heat capacity and the heat conductivity coefficient are two variables that are considered constant values in many of the models previously recommended. The proposed equations demonstrate that these two quantities decrease linearly as the cement hydrates, and their values are related to the degree of hydration. The effects of changing the thermal conductivity and specific heat capacity values on the maximum temperature, and on the time it takes for the concrete to reach that temperature, are examined in this study using numerical sensitivity analysis, and the results are compared to models that take fixed values for these two thermal properties. The current study is conducted on 7 different mix designs of concrete with varying amounts of supplementary cementitious materials (fly ash and ground granulated blast furnace slag). It is concluded that the maximum temperature will not change as a result of a constant conductivity coefficient, but a variable specific heat capacity must be taken into account; regarding the time at which the concrete's central node reaches its maximum value, a variable specific heat capacity can also have a considerable effect on the final result. Also, the use of GGBFS has more influence than fly ash.
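
A hedged sketch of the linear dependence on degree of hydration described above: conductivity and specific heat decrease linearly from their fresh values toward their hardened values as hydration proceeds. The coefficient values below are illustrative placeholders, not the paper's fitted equations.

```python
def thermal_conductivity(alpha, k_fresh=2.8, k_hardened=2.2):
    """alpha: degree of hydration in [0, 1]; returns k in W/(m.K) (placeholder bounds)."""
    return k_fresh - (k_fresh - k_hardened) * alpha

def specific_heat(alpha, c_fresh=1100.0, c_hardened=950.0):
    """Returns specific heat capacity in J/(kg.K) (placeholder bounds)."""
    return c_fresh - (c_fresh - c_hardened) * alpha

for a in (0.0, 0.5, 1.0):
    print(f"alpha={a:.1f}: k={thermal_conductivity(a):.2f} W/mK, "
          f"c={specific_heat(a):.0f} J/kgK")
```

In a finite-element temperature model, these functions would be evaluated at each time step from the current degree of hydration instead of using a single constant value.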

Keywords: early-age concrete, mass concrete, specific heat capacity, thermal conductivity coefficient

Procedia PDF Downloads 77
3402 Implementation of Computer-Based Technologies into Foreign Language Teaching Process

Authors: Golovchun Aleftina, Dabyltayeva Raikhan

Abstract:

Nowadays, in a world of widely developing cross-cultural interactions and rapidly changing demands of the global labor market, foreign language teaching and learning has taken on a special role, not only in school education but also in everyday life. The Cognitive Lingua-Cultural Methodology of Foreign Language Teaching, which originated in Kazakhstan, brings a communicative approach to the forefront of foreign language teaching and gives rise to a variety of techniques to make language learning real communication. One of these techniques is Computer Assisted Language Learning. In our article, we aim to: demonstrate what learning benefits students are likely to gain when teachers implement computer-based technologies into the foreign language teaching process; prove that the technology-based classroom serves as the best tool for interactive and efficient language learning; and give examples of effective classroom organization with computer-based activities.

Keywords: computer assisted language learning, learning benefits, foreign language teaching process, implementation, communicative approach

Procedia PDF Downloads 473