Search results for: data interpolating empirical orthogonal function

24456 Agricultural Water Consumption Estimation in the Helmand Basin

Authors: Mahdi Akbari, Ali Torabi Haghighi

Abstract:

Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the largest (>8500 km2) freshwater bodies on the Iranian plateau but have almost entirely desiccated over the last 20 years. The desiccation of the lakes caused dust storms in the region, which have huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased from 4 to 1.9 km3 downstream due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in-situ data in the basin, this research utilizes remote-sensing data to show how croplands, and consequently the water consumed in the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. Croplands in this basin have doubled since 1990, especially downstream of the Kajaki Dam (the largest dam in the basin). Using PML V2 Actual Evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimated the consumed water (CW) for farming. We found that CW increased from 2.5 to over 7.5 km3 between 2002 and 2017 in this basin. Also, the annual average Potential Evapotranspiration (PET) of the basin has shown a negative trend in recent years, although the AET over croplands has an increasing trend. In this research, using remote-sensing data, we addressed the lack of data in the studied area and highlighted the anthropogenic activities upstream that led to the desiccation of the lakes downstream.
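As a sketch of the NDVI-threshold step described above (assuming Landsat red and near-infrared reflectance are already loaded as NumPy arrays; the threshold midpoint, 30 m pixel size, and all array values are illustrative assumptions, not the study's data), cropland pixels could be masked and their area estimated along these lines:

```python
import numpy as np

def cropland_mask(red, nir, threshold=0.375, pixel_area_m2=30 * 30):
    """Detect cropland pixels from Landsat red/NIR reflectance bands.

    The 0.35-0.4 NDVI threshold range suggested in the abstract is represented
    here by its midpoint; pixel_area_m2 assumes 30 m Landsat pixels.
    """
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)   # avoid division by zero
    mask = ndvi >= threshold
    cropland_km2 = mask.sum() * pixel_area_m2 / 1e6
    return mask, cropland_km2

# Toy scene with random reflectance values standing in for a real Landsat tile
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.3, size=(100, 100))
nir = rng.uniform(0.1, 0.6, size=(100, 100))
mask, area = cropland_mask(red, nir)
print(f"Cropland area in toy scene: {area:.2f} km^2")
```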

Keywords: Afghanistan-Iran transboundary Basin, Iran-Afghanistan water treaty, water use, lake desiccation

Procedia PDF Downloads 119
24455 Financial Instruments Disclosure: A Review of the Literature

Authors: Y. Tahat, T. Dunne, S. Fifield, D. Power

Abstract:

Information about a firm’s usage of Financial Instruments (FIs) plays a very important role in determining its financial position and performance. Yet accounting standard-setters have encountered problems when deciding on the FI-related disclosures which firms must make. The primary objective of this paper is to review the extant literature on FI disclosure. This objective is achieved by surveying the literature on: the corporate usage of FIs; the different accounting standards adopted concerning FIs; and empirical studies on FI disclosure. This review concludes that the current research on FI disclosure has generated a number of useful insights. In particular, the paper reports that: FIs are a very important risk management mechanism in ensuring that companies have the cash available to make value-enhancing investments, although without a clear set of risk management objectives, using such instruments can be dangerous; accounting standards concerning FIs have resulted in enhanced transparency about the usage of these instruments; and FI-related information is a key input into investors’ decision-making processes. Finally, the paper provides a number of suggestions for future research in the area.

Keywords: financial instruments, financial reporting, accounting standards, value relevance, corporate disclosure

Procedia PDF Downloads 400
24454 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks

Authors: Sulemana Ibrahim

Abstract:

Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at ameliorating food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimize supply chains, and enhance food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to foresee crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies aim to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security; the proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, bestowing actionable insights to mitigate the ordeal of food insecurity. The holistic approach, converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks, aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
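As an illustration of the linear-programming element of the supply chain step (a minimal transportation problem with made-up farms, hubs, costs, supplies, and demands; not the study's actual model or data), SciPy's linprog could be used as follows:

```python
import numpy as np
from scipy.optimize import linprog

# Toy transportation problem: ship produce from 2 farms to 3 distribution hubs
# at minimum cost, meeting hub demand without exceeding farm supply.
cost = np.array([[4.0, 6.0, 9.0],      # cost per tonne, farm 0 -> hubs 0..2
                 [5.0, 3.0, 7.0]])     # cost per tonne, farm 1 -> hubs 0..2
supply = np.array([80.0, 70.0])        # tonnes available at each farm
demand = np.array([50.0, 60.0, 40.0])  # tonnes required at each hub

c = cost.ravel()                       # decision variables x[farm, hub], flattened
A_ub = np.zeros((2, 6))                # each farm ships no more than its supply
A_ub[0, :3] = 1.0
A_ub[1, 3:] = 1.0
A_eq = np.zeros((3, 6))                # each hub receives exactly its demand
for j in range(3):
    A_eq[j, [j, j + 3]] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print(res.x.reshape(2, 3))             # optimal shipment plan (tonnes)
print("total cost:", res.fun)
```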

Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks

Procedia PDF Downloads 46
24453 Cold Flow Investigation of Silicon Carbide Cylindrical Filter Element

Authors: Mohammad Alhajeri

Abstract:

This paper reports a computational fluid dynamics (CFD) investigation of a cylindrical filter element. Silicon carbide cylindrical filter elements have proven to be an effective means of removing particulates to levels exceeding the new source performance standard. The CFD code is used here to understand the deposition process and the factors that affect the particle distribution over the filter element surface. Different ratios of approach cross-flow velocity to filter face velocity and different face velocities (ranging from 2 to 5 cm/s) are used in this study. Particles in the diameter range of 1 to 100 microns are tracked through the domain. The radius of convergence (or the critical trajectory) is compared and plotted as a function of several parameters.
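A much simplified, code-level sketch of the particle-tracking idea (one particle integrated under Stokes drag in a prescribed uniform flow; the particle density, air viscosity, velocities, and time step are assumptions for illustration, and the flow field is not a CFD solution):

```python
import numpy as np

def track_particle(x0, v0, fluid_velocity, d_p=10e-6, rho_p=2000.0,
                   mu=1.8e-5, dt=1e-4, steps=5000):
    """Integrate one particle trajectory under Stokes drag (explicit Euler).

    d_p: particle diameter [m], rho_p: assumed particle density [kg/m^3],
    mu: air dynamic viscosity [Pa.s]. The flow field is supplied by the caller;
    in the actual study it would come from the CFD solution. dt should stay
    small relative to the relaxation time tau for the explicit scheme.
    """
    tau = rho_p * d_p**2 / (18.0 * mu)      # particle relaxation time
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(steps):
        u = fluid_velocity(x)               # local fluid velocity
        v += (u - v) / tau * dt             # Stokes drag acceleration
        x += v * dt
        path.append(x.copy())
    return np.array(path)

# Toy flow: 0.2 m/s cross flow plus 0.03 m/s suction toward the filter face
flow = lambda x: np.array([0.2, -0.03])
trajectory = track_particle(x0=[0.0, 0.05], v0=[0.2, 0.0], fluid_velocity=flow)
print(trajectory[-1])                       # final particle position
```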

Keywords: filtration, CFD, CCF, hot gas filtration

Procedia PDF Downloads 451
24452 A Model Towards Creating Positive Accounting Classroom Conditions That Support Successful Learning at School

Authors: Vine Petzer, Mirna Nel

Abstract:

An explanatory mixed-method design was used to investigate accounting classroom conditions in the Further Education and Training (FET) Phase in South Africa. A descriptive survey research study with a heterogeneous group of learners and teachers was conducted in the first phase. In the qualitative phase, semi-structured individual interviews with learners and teachers, as well as observations in the accounting classroom, were employed to gain a more in-depth understanding of the learning conditions in the accounting classroom. The findings of the empirical research informed the development of a model for teachers in accounting, supporting them in using more effective teaching methods and creating positive learning conditions for all learners to experience successful learning. A model towards creating positive accounting classroom conditions that support successful learning was developed and recommended to education policy- and decision-makers for use as a classroom intervention and capacity-building tool. The model identifies and delineates classroom practices that exert a significant effect on learner attainment of quality education.

Keywords: accounting classroom conditions, positive education, successful learning, teaching accounting

Procedia PDF Downloads 133
24451 Refining Employees' Customer Service Performance through an Inter-Organizational Climate Study: A Way Forward

Authors: Zainal Abu Zatim, Hafizah Omar Zaki

Abstract:

Substantial research has been done on refining employees’ customer service performance. However, very few empirical studies have engaged in an inter-organizational climate study to assess employees’ customer service performance. With the current economic situation as well as emerging needs and requirements, all businesses, whether from the public or private sector, that serve customers put greater attention on fulfilling those needs and requirements. In this state of affairs, the act of polishing employees’ skills, knowledge, teamwork and passion is very important in ensuring better performance delivery. A study was conducted at one of the telecommunication service provider companies in Malaysia to test its inter-organizational climate. The internal climate study was done to benchmark the opinions and perceptions of its employees. The study provided baseline information about perceptions that exist in the internal environment and ways forward to improve customer service performance. The approach used was focus groups and qualitative interviews.

Keywords: employees, customer service performance, inter-organizational climate study, public and private sector

Procedia PDF Downloads 387
24450 Fracture and Dynamic Behavior of Leaf Spring Suspension

Authors: S. Lecheb, A. Chellil, H. Mechakra, S. Attou, H. Kebir

Abstract:

Although leaf springs are one of the oldest suspension components, they are still frequently used, especially in commercial vehicles. Being able to capture the leaf spring characteristics is of significant importance for vehicle handling dynamics studies. The main function of a leaf spring is not only to support the vertical load but also to isolate road-induced vibrations. It is subjected to millions of load cycles leading to fatigue failure, so it needs to have an excellent fatigue life. The objective of this work is to use Abaqus software to locate the most stressed areas, predict the regions of the leaf spring in which fatigue cracks are likely to occur, and calculate the stresses and natural frequencies of this model.

Keywords: leaf spring, crack, stress, natural frequencies

Procedia PDF Downloads 446
24449 A Statistical Approach to Classification of Agricultural Regions

Authors: Hasan Vural

Abstract:

Turkey is a favorable country for producing a great variety of agricultural products because of its different geographic and climatic conditions, which have been used to divide the country into four main and seven sub-regions. This classification into seven regions has traditionally been used for data collection and publication, especially related to agricultural production. Afterwards, nine agricultural regions were considered. Recently, the governmental body responsible for data collection and dissemination (Turkish Institute of Statistics, TIS) has used 12 classes, which include 11 sub-regions and Istanbul province. This study aims to evaluate these classification efforts based on the acreage of ten main crops over a ten-year period (1996-2005). The panel data grouped into 11 sub-regions were evaluated by cluster and multivariate statistical methods. It was concluded that, from the agricultural production point of view, it would be more meaningful to consider three main and eight sub-agricultural regions throughout the country.
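A minimal sketch of the clustering step (hierarchical Ward clustering on standardized crop-acreage shares; the 11-by-10 data matrix here is randomly generated, not the actual 1996-2005 panel, and the choice of three clusters simply mirrors the conclusion above):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

# Hypothetical panel: rows = 11 sub-regions, columns = average acreage shares
# of 10 main crops over 1996-2005 (values are made up for illustration).
rng = np.random.default_rng(1)
acreage = rng.dirichlet(np.ones(10), size=11)

X = StandardScaler().fit_transform(acreage)          # standardise crop variables
model = AgglomerativeClustering(n_clusters=3, linkage="ward")
labels = model.fit_predict(X)
for region, label in enumerate(labels):
    print(f"sub-region {region}: cluster {label}")
```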

Keywords: agricultural region, factorial analysis, cluster analysis

Procedia PDF Downloads 402
24448 Empirical Heat Transfer Correlations of Finned-Tube Heat Exchangers in Pulsatile Flow

Authors: Jason P. Michaud, Connor P. Speer, David A. Miller, David S. Nobes

Abstract:

An experimental study on finned-tube radiators has been conducted. Three radiators found in desktop computers sized for 120 mm fans were tested in steady and pulsatile flows of ambient air over a Reynolds number range of 50 < Re < 900. Water at 60 °C was circulated through the radiators to maintain a constant fin temperature during the tests. For steady flow, it was found that the heat transfer rate increased linearly with the mass flow rate of air. The pulsatile flow experiments showed that the frequency of pulsation had a negligible effect on the heat transfer rate for the range of frequencies tested (0.5 Hz – 2.5 Hz). For all three radiators, the heat transfer rate was lower in the case of pulsatile flow. Linear heat transfer correlations for steady and pulsatile flow were calculated in terms of Reynolds number and Nusselt number.
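A minimal sketch of fitting such a linear Nu-Re correlation (the Reynolds and Nusselt values below are invented placeholders, not the measured radiator data):

```python
import numpy as np

# Hypothetical steady-flow data points: Reynolds number and measured Nusselt
# number for one radiator (the actual values would come from the experiments).
Re = np.array([60, 150, 300, 450, 600, 750, 900], dtype=float)
Nu = np.array([2.1, 3.4, 5.9, 8.2, 10.6, 12.9, 15.3])

slope, intercept = np.polyfit(Re, Nu, deg=1)   # linear correlation Nu = a*Re + b
Nu_fit = slope * Re + intercept
rmse = np.sqrt(np.mean((Nu - Nu_fit) ** 2))
print(f"Nu = {slope:.4f}*Re + {intercept:.3f}  (RMSE = {rmse:.3f})")
```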

Keywords: finned-tube heat exchangers, heat transfer correlations, pulsatile flow, computer radiators

Procedia PDF Downloads 494
24447 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Different literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers. Occurrences of data gaps have also not been given an adequate span of attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are assumed to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? The use of time series is required for the detection of abnormalities in the delays. The efficiency of the method is evaluated on measurements obtained from a real plant: an office at the Grenoble Institute of Technology equipped with 30 sensors.
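A minimal sketch of the automatic-thresholding idea (deriving a gap threshold from each sensor's own inter-sample delays; the quantile, multiplication factor, and toy timestamps are assumptions for illustration, not the method's actual tuning):

```python
import pandas as pd

def detect_gaps(timestamps, quantile=0.99, factor=3.0):
    """Flag data gaps in an irregularly sampled sensor time series.

    The threshold is derived automatically from the sensor's own sampling
    behaviour: delays larger than `factor` times the `quantile`-th quantile
    of the observed inter-sample delays are reported as gaps.
    """
    ts = pd.to_datetime(pd.Series(timestamps)).sort_values()
    delays = ts.diff().dropna()
    threshold = factor * delays.quantile(quantile)
    gaps = delays[delays > threshold]
    return threshold, gaps

# Toy example: a sensor reporting every 10 minutes with one 6-hour outage
stamps = pd.date_range("2017-03-01", periods=200, freq="10min").tolist()
stamps = stamps[:100] + [t + pd.Timedelta(hours=6) for t in stamps[100:]]
threshold, gaps = detect_gaps(stamps)
print("gap threshold:", threshold)
print(gaps)
```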

Keywords: building system, time series, diagnosis, outliers, delay, data gap

Procedia PDF Downloads 234
24446 Source-Detector Trajectory Optimization for Target-Based C-Arm Cone Beam Computed Tomography

Authors: S. Hatamikia, A. Biguri, H. Furtado, G. Kronreif, J. Kettenbach, W. Birkfellner

Abstract:

Nowadays, three-dimensional Cone Beam CT (CBCT) has turned into a widespread clinical routine imaging modality for interventional radiology. In conventional CBCT, a circular source-detector trajectory is used to acquire a high number of 2D projections in order to reconstruct a 3D volume. However, the accumulated radiation dose due to the repetitive use of CBCT needed for intraoperative procedures, as well as daily pretreatment patient alignment for radiotherapy, has become a concern. It is of great importance for both health care providers and patients to decrease the amount of radiation dose required for these interventional images. Thus, it is desirable to find optimized source-detector trajectories with a reduced number of projections which could therefore lead to dose reduction. In this study, we investigate source-detector trajectories with optimal arbitrary orientations so as to maximize the performance of the reconstructed image at particular regions of interest. To achieve this, we developed a box phantom consisting of several small polytetrafluoroethylene target spheres at regular distances throughout the phantom. Each of these spheres serves as a target inside a particular region of interest. We use the 3D Point Spread Function (PSF) as a measure to evaluate the performance of the reconstructed image. We measured the spatial variance in terms of the Full-Width-Half-Maximum (FWHM) of the local PSFs, each related to a particular target. A lower value of FWHM indicates better spatial resolution of the reconstruction results at the target area. One important feature of interventional radiology is that the imaging targets are very well known, since prior knowledge of patient anatomy (e.g., a preoperative CT) is usually available for interventional imaging. Therefore, we use a CT scan of the box phantom as the prior knowledge and consider it as the digital phantom in our simulations to find the optimal trajectory for a specific target. Based on the simulation phase, we obtain the optimal trajectory, which can then be applied on the device in a real situation. We consider a Philips Allura FD20 Xper C-arm geometry to perform the simulations and real data acquisition. Our experimental results, based on both simulated and real data, show that the proposed optimization scheme has the capacity to find optimized trajectories with a minimal number of projections in order to localize the targets. The proposed optimized trajectories are able to localize the targets as well as a standard circular trajectory while using just one third of the number of projections. In conclusion, we demonstrate that applying a minimal dedicated set of projections with optimized orientations is sufficient to localize targets and may minimize the radiation dose.
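A minimal sketch of the FWHM evaluation step (measuring the width of a 1D profile through a reconstructed target sphere; the Gaussian test profile and voxel spacing are illustrative assumptions):

```python
import numpy as np

def fwhm_1d(profile, spacing=1.0):
    """Full-Width-Half-Maximum of a 1D point-spread-function profile.

    `profile` is the reconstructed intensity along one axis through a target;
    `spacing` is the voxel size along that axis. Linear interpolation is used
    to locate the half-maximum crossings on both sides of the peak.
    """
    profile = np.asarray(profile, dtype=float)
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    left, right = above[0], above[-1]

    def cross(i_lo, i_hi):                       # interpolate a crossing position
        y0, y1 = profile[i_lo], profile[i_hi]
        return i_lo + (half - y0) / (y1 - y0)

    x_left = cross(left - 1, left) if left > 0 else float(left)
    x_right = cross(right, right + 1) if right + 1 < len(profile) else float(right)
    return (x_right - x_left) * spacing

# Toy Gaussian PSF with sigma = 2 voxels -> expected FWHM = 2.355 * 2 = 4.71 voxels
x = np.arange(-20, 21)
psf = np.exp(-x**2 / (2 * 2.0**2))
print(fwhm_1d(psf, spacing=1.0))
```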

Keywords: CBCT, C-arm, reconstruction, trajectory optimization

Procedia PDF Downloads 125
24445 The Contemporary Dynamics of Board Composition and Executive Compensation for R&D Spending

Authors: Farheen Akram

Abstract:

Research and Development (R&D) is the most crucial element of a firm’s survival in a competitive business environment. R&D is a long-term investment; therefore, executives who have the power to make investment decisions may be pessimistic when their compensation is closely linked with short-term firm performance. Thus, the current study investigates the impact of board composition and executives’ compensation (cash or short-term benefits and LTIs) on R&D spending using a sample of 85 S&P/100 firms listed on the Australian Stock Exchange (ASX) in 2017. SmartPLS (v.3.2.7) was used to evaluate the proposed model. The empirical findings of this study indicate that board composition has a significant and positive effect on R&D spending, while, as expected, executive cash compensation has a negative and Long-Term Incentives (LTIs) have a positive impact on R&D spending. Based on these findings, the study suggests that the myopic behavior of CEOs and top management towards long-term value-creating investments like R&D can be controlled by using long-term compensation rewards.

Keywords: cash compensation, LTIs, board composition, R&D spending

Procedia PDF Downloads 176
24444 Forecasting Electricity Spot Price with Generalized Long Memory Modeling: Wavelet and Neural Network

Authors: Souhir Ben Amor, Heni Boubaker, Lotfi Belkacem

Abstract:

The aim of this paper is to forecast electricity spot prices. First, we focus on modeling the conditional mean of the series, so we adopt a generalized fractional k-factor Gegenbauer process (k-factor GARMA). Secondly, the residuals from the k-factor GARMA model are used as a proxy for the conditional variance; these residuals are predicted using two different approaches. In the first approach, a local linear wavelet neural network model (LLWNN) is developed to predict the conditional variance using the backpropagation learning algorithm. In the second approach, the Gegenbauer generalized autoregressive conditional heteroscedasticity process (G-GARCH) is adopted, and the parameters of the k-factor GARMA-G-GARCH model are estimated using the wavelet methodology based on the discrete wavelet packet transform (DWPT) approach. The empirical results show that the k-factor GARMA-G-GARCH model outperforms the hybrid k-factor GARMA-LLWNN model and is more appropriate for forecasting.

Keywords: electricity price, k-factor GARMA, LLWNN, G-GARCH, forecasting

Procedia PDF Downloads 219
24443 Drivers of Digital Product Innovation in Firms: An Empirical Study of Technological, Organizational, and Environmental Factors

Authors: Anne Theresa Eidhoff, Sarah E. Stief, Markus Voeth, Sarah Gundlach

Abstract:

With digitalization increasingly changing the rules of competition, firms face the need to adapt and assimilate digital technologies in order to remain competitive. Firms can choose from various possibilities to integrate digital technologies including the option to embed digital technologies aiming to innovate products or to develop digital products. However, the question of which specific factors influence a firm’s decision to pursue digital product innovation remains unanswered in research. By adopting the Technology-Organization-Environment (TOE)-framework we have designed a qualitative exploratory study including eleven German practitioners to investigate relevant contingency factors. Our results indicate that the most critical factors for a company’s decision to pursue digital product innovation can be found in the technological and environmental dimensions, namely customers, competitive pressure, technological change, as well as digitalization fit. 

Keywords: digital innovation, digitalization, product innovation, TOE-framework

Procedia PDF Downloads 465
24442 Application of Fuzzy Clustering to Agile Supply Chain Classification

Authors: Hamidreza Fallah Lajimi, Elham Karami, Fatemeh Ali Nasab, Mostafa Mahdavikia

Abstract:

Being responsive is an increasingly important skill for firms in today’s global economy; thus, firms must be agile. Naturally, it follows that an organization’s agility depends on its supply chain being agile. However, achieving supply chain agility is a function of other abilities within the organization. This paper analyses results from a survey of 71 Iranian manufacturing companies in order to identify some of the factors for agile organizations in managing their supply chains. We then classify these companies into four clusters using the fuzzy c-means technique, with four cluster validity functions used to determine the optimal number of clusters automatically.
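A minimal sketch of the fuzzy c-means step (a from-scratch implementation on randomly generated agility scores; the five criteria and all values are placeholders, and the validity-index selection of the cluster number is not shown):

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=4, m=2.0, max_iter=200, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-10
        new_U = 1.0 / dist ** (2.0 / (m - 1.0))  # standard FCM membership update
        new_U /= new_U.sum(axis=1, keepdims=True)
        if np.abs(new_U - U).max() < tol:
            return centres, new_U
        U = new_U
    return centres, U

# Placeholder agility scores for 71 surveyed companies on 5 hypothetical criteria
scores = np.random.default_rng(42).random((71, 5))
centres, U = fuzzy_c_means(scores, n_clusters=4)
print("hard cluster assignments:", U.argmax(axis=1))
```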

Keywords: agile supply chain, clustering, fuzzy clustering

Procedia PDF Downloads 455
24441 Adapting Strategies of Subaltern Counterpublics under Coronavirus-Related Restrictions

Authors: Alisa Sheppental

Abstract:

The focus of this paper is the impact of coronavirus-related restrictions on the legitimacy and efficacy of subaltern counterpublics and political resistance. Both the difficulties and the alterations of strategies that need to be considered by modern political movements within the counterpublic sphere are illustrated based on recent examples of protests in Hong Kong, Thailand, Belarus, Poland, and France. The dynamics of the modern globalized world have previously required a high level of adaptability, which has resulted in a number of new features of modern political resistance in contrast with previous decades, including the digitalization of protests and the higher involvement of previously less active citizens (women, the elderly, people with disabilities, etc.). However, the global pandemic situation, along with massive restrictions on daily life, provides new input for both theoretical and empirical analysis. The following paper represents an attempt to summarize the coping and adapting strategies of subaltern counterpublics and activist groups under coronavirus-related restrictions.

Keywords: citizenship, political activism, subaltern counterpublics, discourse ethics

Procedia PDF Downloads 118
24440 Critical Evaluation and Analysis of Effects of Different Queuing Disciplines on Packet Delivery and Delay for Different Applications

Authors: Omojokun Gabriel Aju

Abstract:

A communication network is a process of exchanging data between two or more devices via some form of transmission medium using communication protocols. The data could be in the form of text, images, audio, video or numbers, which can be grouped into FTP, Email, HTTP, VoIP or video applications. The effectiveness of such data exchange is proved if the data are accurately delivered within a specified time. While some senders will not really mind when the data is actually received by the receiving device, as long as it is acknowledged to have been received, the time data takes to get to a receiver can be very important to another sender, as any delay could cause serious problems or even, in some cases, render the data useless. The validity or invalidity of data after a delay will therefore depend on the type of data (information). It is thus imperative for a network device (such as a router) to be able to differentiate between packets which are time-sensitive and those that are not when they are passing through the same network. This is where queuing disciplines come into play: they handle network resources when a network is designed to service widely varying types of traffic and manage the available resources according to the configured policies. Therefore, as part of the resource allocation mechanisms, a router within the network must implement some queuing discipline that governs how packets (data) are buffered while waiting to be transmitted. In achieving this, various queuing disciplines are used to control the transmission of these packets by determining which packets get the highest priority, which get less priority, and which packets are dropped. The queuing discipline thereby controls packet latency by determining how long a packet can wait to be transmitted or dropped. The common queuing disciplines are first-in-first-out queuing, priority queuing and weighted-fair queuing (FIFO, PQ and WFQ). This paper critically evaluates and analyses, through the use of the Optimized Network Evaluation Tool (OPNET) Modeller, Version 14.5, the effects of three queuing disciplines (FIFO, PQ and WFQ) on the performance of 5 different applications (FTP, HTTP, E-Mail, Voice and Video) within specified parameters, using packets sent, packets received and transmission delay as performance metrics. The paper finally suggests some ways in which networks can be designed to provide better transmission performance while using these queuing disciplines.
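A minimal sketch contrasting FIFO with strict priority queuing on a toy packet stream (the packet sizes, classes, arrival times, and link rate are invented for illustration; this is not an OPNET model):

```python
import heapq

# Toy packet stream: (arrival_time_s, size_bits, traffic_class); a lower class
# number means higher priority (0 = voice, 1 = video, 2 = FTP).
packets = [(0.000, 8000, 2), (0.001, 1600, 0), (0.002, 12000, 1),
           (0.003, 1600, 0), (0.004, 8000, 2), (0.005, 1600, 0)]
LINK_BPS = 1_000_000                         # 1 Mbit/s output link

def serve(packets, priority=False):
    """Per-packet queuing delay under FIFO or strict priority queuing."""
    queue, delays, clock, i = [], [], 0.0, 0
    while i < len(packets) or queue:
        while i < len(packets) and packets[i][0] <= clock:   # admit arrived packets
            arrival, size, cls = packets[i]
            key = (cls, arrival) if priority else (arrival,)
            heapq.heappush(queue, (key, arrival, size))
            i += 1
        if not queue:                        # link idle: jump to the next arrival
            clock = packets[i][0]
            continue
        _, arrival, size = heapq.heappop(queue)
        delays.append(clock - arrival)       # time spent waiting in the buffer
        clock += size / LINK_BPS             # transmission time on the link
    return delays

print("FIFO delays (ms):", [round(d * 1e3, 3) for d in serve(packets)])
print("PQ delays (ms):  ", [round(d * 1e3, 3) for d in serve(packets, priority=True)])
```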

Keywords: applications, first-in-first-out queuing (FIFO), optimised network evaluation tool (OPNET), packets, priority queuing (PQ), queuing discipline, weighted-fair queuing (WFQ)

Procedia PDF Downloads 345
24439 Condensation of Moist Air in Heat Exchanger Using CFD

Authors: Jan Barak, Karel Frana, Joerg Stiller

Abstract:

This work presents results of moist air condensation in a heat exchanger. It describes the theoretical background and the definition of moist air. A model with the geometry of a square channel was created for better understanding and post-processing of the condensation phenomena. Different approaches were examined on this model to find suitable software and a suitable model. The knowledge obtained was applied to the geometry of a real heat exchanger, and the results from the experiment were compared with the numerical results. One of the goals is to solve this issue without creating any user-defined function in the applied code. The work also contains a summary of the findings and an outlook for future work.
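As a psychrometric side note to the condensation problem (not the CFD model itself), the dew point at which moist air begins to condense on a cold wall can be sketched with the Magnus approximation; the coefficients are the common Magnus constants and the inlet state is an assumed example:

```python
import math

def saturation_pressure_pa(t_celsius):
    """Saturation vapour pressure of water over a flat surface (Magnus approximation)."""
    return 611.2 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def dew_point_celsius(t_celsius, rel_humidity):
    """Dew-point temperature for moist air at t_celsius and relative humidity (0-1)."""
    e = rel_humidity * saturation_pressure_pa(t_celsius)
    g = math.log(e / 611.2)
    return 243.12 * g / (17.62 - g)

# Moist air at 30 degC and 60 % RH entering a heat exchanger: condensation starts
# on any wall whose surface temperature is below the dew point.
t_air, rh = 30.0, 0.60
print(f"dew point: {dew_point_celsius(t_air, rh):.1f} degC")
```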

Keywords: condensation, exchanger, experiment, validation

Procedia PDF Downloads 389
24438 The Interrelationship between Formal and Informal Institutions and Its Impacts on the Autonomy of Public Service Delivery Units: The Case of Vietnam

Authors: Minh Thi Hai Vo

Abstract:

This article draws on in-depth interviews with state employees at public hospitals and universities in its institutional analysis of the autonomy practices of public service delivery units in Vietnam. Unlike many empirical and theoretical studies that view formal and informal institutions as complements or substitutes, this article finds no evidence of complementary or substitutive relationships. Instead, the article finds that formal institutions accommodate informal ones and that informal institutions tend to compete and interfere with the existing and ineffective formal institutions. The result of such a conflicting relationship is that the actual autonomy of public service delivery units is, in most cases, perceived to be greater than the formal autonomy they are given. Under conditions of poor regulation, this informal autonomy may result in unethical practices, including rent-seeking and corruption. The implication of the study's findings is that policy-makers need to redesign and reorganize the autonomization of public service delivery units so that informal institutions support and reinforce formal ones in a complementary manner.

Keywords: autonomy, formal institutions, informal institutions, public service delivery units, Vietnam

Procedia PDF Downloads 190
24437 Towards a Sustainable High Population Density Urban Intertextuality – Program Re-Configuration Integrated Urban Design Study in Hangzhou, China

Authors: Xuan Li, Lei Xu

Abstract:

By the end of 2014, China had an urban population of 749 million, reaching an urbanization rate of 54.77%. A dense and vertical urban structure has become a common choice for China and most of the densely populated Asian countries pursuing sustainable development. This paper focuses on the most conspicuous period of urban change in China, from 2000 to 2010, during which China's population shifted fastest from rural regions to cities. On the one hand, the 200 million nationwide "new citizens", along with the 456 million "old citizens", explored the new-century city in search of a new urban lifestyle and a livable built environment; on the other hand, large-scale rapid urban construction remained confined to the methods of traditional two-dimensional architectural thinking. Human-oriented design and systems thinking have been missing in this intricate postmodern urban condition. This phenomenon, especially the gap and spark between the solid, huge urban physical system and the rich, subtle everyday urban life, will be studied in depth: how a 20th-century high-rise residential building "spontaneously" turned into an old but expensive multi-functional high-rise complex in the 21st-century city center; and how new 21st-century and old late-20th-century public buildings with the same function integrated their different architectural forms into the new/old city center. Finally, the paper studies cases in Hangzhou: 1) function evolution – the downtown high-rise residential buildings "International Garden" and "Zhongshan Garden" (1999); 2) form comparison – Hangzhou Theater (1998) vs the Hangzhou Grand Theatre (2004), and Hangzhou City Railway Station (1999) vs Hangzhou East Railway Station (2013). The research aims at exploring the essence of the city through the building form dispel and urban program re-configuration approach, gaining a better consideration of human behavior through a compact urban design effort to improve urban intertextuality, and searching for a sustainable development path at the crucial time of urban population explosion in China.

Keywords: architecture form dispel, compact urban design, urban intertextuality, urban program re-configuration

Procedia PDF Downloads 483
24436 Approximation of Intersection Curves of Two Parametric Surfaces

Authors: Misbah Irshad, Faiza Sarfraz

Abstract:

The problem of approximating the surface-to-surface intersection is considered very important in computer-aided geometric design and computer-aided manufacturing. Although it is a complex problem to handle, its continuous need in the industry makes it an active topic of research. A technique for approximating the intersection curves of two parametric surfaces is proposed, which extracts boundary points and turning points from a sequence of intersection points and interpolates them with the help of rational cubic spline functions. The proposed approach is demonstrated with the help of examples and analyzed by calculating the error.
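A simplified stand-in for the interpolation step (ordinary parametric cubic splines via SciPy with chord-length parameterisation, rather than the rational cubic splines of the paper; the intersection points below are invented):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical ordered sequence of intersection points (x, y, z) extracted from
# a surface/surface intersection, including the boundary and turning points.
points = np.array([[0.0, 0.0, 1.0], [0.4, 0.6, 0.9], [1.0, 1.0, 0.7],
                   [1.6, 1.2, 0.6], [2.2, 1.1, 0.8], [2.8, 0.7, 1.1]])

# Chord-length parameterisation, then one cubic spline per coordinate.
t = np.zeros(len(points))
t[1:] = np.cumsum(np.linalg.norm(np.diff(points, axis=0), axis=1))
curve = CubicSpline(t, points, axis=0)

t_fine = np.linspace(t[0], t[-1], 200)
approx = curve(t_fine)                  # dense polyline approximating the curve
print(approx[0], approx[-1])            # matches the first and last data points
```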

Keywords: approximation, parametric surface, spline function, surface intersection

Procedia PDF Downloads 250
24435 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud

Authors: N. Nalini, Bhanu Prakash Gopularam

Abstract:

The term data security refers to the degree of resistance or protection given to information from unintended or unauthorized access. The core principles of information security are confidentiality, integrity and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS and PaaS services. With cloud adoption, confidential enterprise data are moved from organization premises to an untrusted public network, and because of this the attack surface has increased manifold. Several cloud computing platforms like OpenStack, Eucalyptus and Amazon EC2 allow users to build and configure public, hybrid and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-based Public Key Cryptography (also referred to as ID-PKC) overcomes this problem by using publicly identifiable information for generating the keys and works well with decentralized systems. Users can exchange information securely without having to manage any trust information. Another advantage is that access control (role-based access control policy) information can be embedded into the data, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and has support for public key infrastructure for auto services. In this paper, we explain the OpenStack security architecture and evaluate the PKI infrastructure piece for data confidentiality. We provide a method to integrate ID-PKC schemes for securing data while in transit and stored, and explain the key measures for safeguarding data against security attacks. The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1636.3 standard and secure communication to other cloud services.

Keywords: data confidentiality, identity-based cryptography, secure communication, OpenStack Keystone, token scoping

Procedia PDF Downloads 366
24434 Multi-Scale Damage Modelling for Microstructure Dependent Short Fiber Reinforced Composite Structure Design

Authors: Joseph Fitoussi, Mohammadali Shirinbayan, Abbas Tcharkhtchi

Abstract:

Due to material flow during processing, short fiber reinforced composite structures obtained by injection or compression molding generally present strong spatial microstructure variation. On the other hand, the quasi-static, dynamic, and fatigue behavior of these materials is highly dependent on microstructure parameters such as the fiber orientation distribution. Indeed, because of complex damage mechanisms, SFRC structure design is a key challenge for safety and reliability. In this paper, we propose a micromechanical model allowing prediction of the damage behavior of real structures as a function of the spatial microstructure distribution. To this aim, a statistical damage criterion including strain rate and fatigue effects at the local scale is introduced into a Mori-Tanaka model. A critical local damage state is identified, allowing fatigue life prediction. Moreover, the multi-scale model is coupled with an experimentally established intrinsic link between damage under monotonic loading and fatigue life in order to build an abacus giving Tsai-Wu failure criterion parameters as a function of microstructure and targeted fatigue life. On the other hand, the micromechanical damage model gives access to the evolution of the anisotropic stiffness tensor of SFRC submitted to complex thermomechanical loading, including quasi-static, dynamic, and cyclic loading with temperature and amplitude variations. The latter is then used to fill out microstructure-dependent material cards in finite element analysis for design optimization in the case of complex loading histories. The proposed methodology is illustrated in the case of a real automotive component made of sheet molding compound (the PSA 3008 tailgate). The obtained results emphasize how the proposed micromechanical methodology opens a new path for the automotive industry to lighten vehicle bodies and thereby save energy and reduce gas emissions.

Keywords: short fiber reinforced composite, structural design, damage, micromechanical modelling, fatigue, strain rate effect

Procedia PDF Downloads 94
24433 Licensing in a Hotelling Model with Quadratic Transportation Costs

Authors: Fehmi Bouguezzi

Abstract:

This paper studies optimal licensing regimes in a linear Hotelling model where firms are located at the end points of the city and where the transportation cost is not linear but quadratic. To this end, we study a more general cost function and compare the findings with the results for the linear cost. We find the same optimal licensing regimes: a per-unit royalty is optimal when the innovation is not drastic, and no licensing is better when the innovation is drastic. We also find that no licensing is always better than fixed-fee licensing.
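A minimal symbolic sketch of the quadratic-cost Hotelling setup (firms at 0 and 1 on a unit city, uniform consumers, common marginal cost c and transport parameter t; this reproduces the textbook pricing equilibrium, not the licensing analysis itself):

```python
import sympy as sp

p1, p2, t, c, x = sp.symbols('p1 p2 t c x', positive=True)

# Consumers are uniform on [0, 1]; firms sit at the end points and transport
# cost is quadratic in distance. The indifferent consumer location solves:
x_hat = sp.solve(sp.Eq(p1 + t * x**2, p2 + t * (1 - x)**2), x)[0]

profit1 = (p1 - c) * x_hat            # firm 1 serves the segment [0, x_hat]
profit2 = (p2 - c) * (1 - x_hat)      # firm 2 serves the rest

foc1 = sp.diff(profit1, p1)           # first-order conditions of both firms
foc2 = sp.diff(profit2, p2)
eq = sp.solve([foc1, foc2], [p1, p2], dict=True)[0]
print("equilibrium prices:", sp.simplify(eq[p1]), sp.simplify(eq[p2]))   # both c + t
print("equilibrium profit:", sp.simplify(profit1.subs(eq)))             # t/2
```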

Keywords: Hotelling model, technology transfer, patent licensing, quadratic transportation cost

Procedia PDF Downloads 335
24432 Improved Distance Estimation in Dynamic Environments through Multi-Sensor Fusion with Extended Kalman Filter

Authors: Iffat Ara Ebu, Fahmida Islam, Mohammad Abdus Shahid Rafi, Mahfuzur Rahman, Umar Iqbal, John Ball

Abstract:

The application of multi-sensor fusion for enhanced distance estimation accuracy in dynamic environments is crucial for advanced driver assistance systems (ADAS) and autonomous vehicles. Limitations of single sensors such as cameras or radar in adverse conditions motivate the use of combined camera and radar data to improve reliability, adaptability, and object recognition. A multi-sensor fusion approach using an extended Kalman filter (EKF) is proposed to combine sensor measurements with a dynamic system model, achieving robust and accurate distance estimation. The research utilizes the Mississippi State University Autonomous Vehicular Simulator (MAVS) to create a controlled environment for data collection. Data analysis is performed using MATLAB. Qualitative (visualization of fused data vs ground truth) and quantitative metrics (RMSE, MAE) are employed for performance assessment. Initial results with simulated data demonstrate accurate distance estimation compared to individual sensors. The optimal sensor measurement noise variance and plant noise variance parameters within the EKF are identified, and the algorithm is validated with real-world data from a Chevrolet Blazer. In summary, this research demonstrates that multi-sensor fusion with an EKF significantly improves distance estimation accuracy in dynamic environments. This is supported by comprehensive evaluation metrics, with validation transitioning from simulated to real-world data, paving the way for safer and more reliable autonomous vehicle control.
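A minimal sketch of the fusion idea (a 1-D constant-velocity state with stacked camera and radar range measurements; with this linear measurement model the EKF update reduces to the standard Kalman update, and every noise value below is an illustrative assumption, not a MAVS or Blazer parameter):

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition (distance, rate)
Q = np.diag([0.05, 0.1])                     # plant (process) noise covariance
H = np.array([[1.0, 0.0], [1.0, 0.0]])       # both sensors observe the distance
R = np.diag([4.0, 1.0])                      # camera noisier than radar

x = np.array([20.0, 0.0])                    # initial distance [m] and rate [m/s]
P = np.eye(2) * 10.0

def ekf_step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the stacked measurement z = [z_camera, z_radar]
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(3)
true_d = 20.0
for _ in range(50):
    true_d -= 0.5 * dt                                  # target closes at 0.5 m/s
    z = true_d + rng.normal(0, [2.0, 1.0])              # noisy camera/radar ranges
    x, P = ekf_step(x, P, z)
print(f"fused distance estimate: {x[0]:.2f} m (truth {true_d:.2f} m)")
```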

Keywords: sensor fusion, EKF, MATLAB, MAVS, autonomous vehicle, ADAS

Procedia PDF Downloads 20
24431 A User Identification Technique to Access Big Data Using Cloud Services

Authors: A. R. Manu, V. K. Agrawal, K. N. Balasubramanya Murthy

Abstract:

Authentication is required in stored database systems so that only authorized users can access the data and related cloud infrastructures. This paper proposes an authentication technique using multi-factor and multi-dimensional authentication system with multi-level security. The proposed technique is likely to be more robust as the probability of breaking the password is extremely low. This framework uses a multi-modal biometric approach and SMS to enforce additional security measures with the conventional Login/password system. The robustness of the technique is demonstrated mathematically using a statistical analysis. This work presents the authentication system along with the user authentication architecture diagram, activity diagrams, data flow diagrams, sequence diagrams, and algorithms.
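A minimal two-factor sketch in the spirit of the proposed scheme (a salted password hash combined with an RFC 4226 one-time password standing in for the SMS factor; the biometric modality and the statistical analysis are not represented, and all secrets here are generated on the spot):

```python
import hashlib, hmac, os, struct

def hash_password(password: str, salt: bytes) -> bytes:
    """First factor: salted password hash (PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Second factor: RFC 4226 HMAC-based one-time password (e.g. sent via SMS)."""
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def authenticate(password, otp, stored_hash, salt, secret, counter) -> bool:
    """Grant access only when both independent factors verify."""
    first = hmac.compare_digest(hash_password(password, salt), stored_hash)
    second = hmac.compare_digest(otp, hotp(secret, counter))
    return first and second

# Enrolment (server side)
salt, secret, counter = os.urandom(16), os.urandom(20), 42
stored = hash_password("correct horse battery staple", salt)
# Login attempt: the user supplies the password plus the OTP delivered out of band
print(authenticate("correct horse battery staple", hotp(secret, counter),
                   stored, salt, secret, counter))
```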

Keywords: design, implementation algorithms, performance, biometric approach

Procedia PDF Downloads 459
24430 Between Legal Authority and Epistemic Competence: A Case Study of the Brazilian Supreme Court

Authors: Júlia Massadas

Abstract:

The objective of this paper is to analyze the role played by the institute of public hearings in the Brazilian Supreme Court. Public hearings have been regulated since 1999 by Brazilian Laws nº 9.868 and nº 9.882 and by the Internal Regulations of the Brazilian Supreme Court. According to this legislation, public hearings are supposed to be called when a matter of circumstance of fact must be clarified, which can be done through the hearing of testimonies of persons with expertise and authority in the theme related to the cause. This work aims to investigate the role played by the public hearings and by the experts in the Brazilian Supreme Court. The hypotheses of this research are that: (I) the public hearings in the Brazilian Supreme Court are used to uphold a rhetoric of democratic legitimacy of the Court's decisions; (II) the legislative intentions have been distorted. To test these hypotheses, the adopted methodology involves an empirical study of Brazilian jurisprudence. As a conclusion, it follows that the public hearings convened by the Brazilian Supreme Court do not correspond, in practice, to the role assigned to them by Congress, since they do not properly serve epistemic interests. The public hearings not only fail to democratically legitimize the decisions but also fail to properly clarify technical issues.

Keywords: Brazilian Supreme Court, constitutional law, public hearings, epistemic competence, legal authority

Procedia PDF Downloads 390
24429 Input Data Balancing in a Neural Network PM-10 Forecasting System

Authors: Suk-Hyun Yu, Heeyong Kwon

Abstract:

Recently, PM-10 has become a social and global issue. It is one of the major air pollutants affecting human health, and therefore it needs to be forecast rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration level is largely dependent on meteorological and geographical factors of the local and global region, so forecasting PM-10 concentration is very difficult. A neural network model can be used in this case, but there are few cases of high PM-10 concentration, which makes the learning of the neural network model difficult. In this paper, we suggest a simple input balancing method for when the data distribution is uneven. It is based on the probability of appearance of the data. Experimental results show that the input balancing makes the neural networks' learning easier and improves the forecasting rates.
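A minimal sketch of balancing by probability of appearance (resampling training pairs with weights inversely proportional to the empirical frequency of their PM-10 concentration bin; the skewed toy data, bin count, and feature dimension are assumptions, not the paper's data set):

```python
import numpy as np

def balance_by_probability(X, y, bins=10, seed=0):
    """Resample training data so rare PM-10 concentration ranges appear as often
    as common ones: each sample is drawn with probability inversely proportional
    to the empirical frequency of its concentration bin."""
    rng = np.random.default_rng(seed)
    edges = np.histogram_bin_edges(y, bins=bins)
    bin_idx = np.clip(np.digitize(y, edges[1:-1]), 0, bins - 1)
    counts = np.bincount(bin_idx, minlength=bins).astype(float)
    weights = 1.0 / counts[bin_idx]          # rare bins -> large weights
    weights /= weights.sum()
    pick = rng.choice(len(y), size=len(y), replace=True, p=weights)
    return X[pick], y[pick]

# Toy data: skewed PM-10 concentrations with few high-concentration episodes
rng = np.random.default_rng(1)
y = rng.gamma(shape=2.0, scale=20.0, size=2000)      # mostly low, few high values
X = rng.random((2000, 8))                            # 8 hypothetical input features
Xb, yb = balance_by_probability(X, y)
print("share above 100 ug/m3 before/after:",
      np.mean(y > 100).round(3), np.mean(yb > 100).round(3))
```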

Keywords: artificial intelligence, air quality prediction, neural networks, pattern recognition, PM-10

Procedia PDF Downloads 220
24428 Metabolic Predictive Model for PMV Control Based on Deep Learning

Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon

Abstract:

In this study, a predictive model for estimating the metabolism (MET) of the human body was developed for the optimal control of the indoor thermal environment. Human body images for indoor activities and human body joint coordinate values were collected as data sets, which are used in the predictive model. A deep learning algorithm was used in an initial model, and its numbers of hidden layers and hidden neurons were optimized. Lastly, the model's prediction performance was analyzed after the model was trained on the collected data. In conclusion, the possibility of MET prediction was confirmed, and the direction of future study was proposed as developing more varied data and refining the predictive model.
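A minimal sketch of the joint-coordinates-to-MET regression (a small scikit-learn multilayer perceptron on synthetic data standing in for the deep model; the 17-joint layout, layer sizes, and labels are illustrative assumptions):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

# Hypothetical data set: each sample is a flattened set of 17 body-joint
# (x, y, z) coordinates for one indoor-activity frame, labelled with a MET value.
rng = np.random.default_rng(0)
X = rng.random((1500, 17 * 3))
y = 1.0 + 3.0 * X[:, :10].mean(axis=1) + rng.normal(0, 0.1, 1500)   # synthetic MET

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 32),     # layers/neurons to be tuned
                     activation="relu", max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("MAE on held-out frames:", mean_absolute_error(y_te, model.predict(X_te)))
```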

Keywords: deep learning, indoor quality, metabolism, predictive model

Procedia PDF Downloads 246
24427 Analysis of Brownfield Soil Contamination Using Local Government Planning Data

Authors: Emma E. Hellawell, Susan J. Hughes

Abstract:

Brownfield sites are currently being redeveloped for residential use. Information on soil contamination on these former industrial sites is collected as part of the planning process by local government. This research project analyses this untapped resource of environmental data, using site investigation data submitted to a local Borough Council in Surrey, UK. Over 150 site investigation reports were collected and interrogated to extract relevant information. This study involved three phases. Phase 1 was the development of a database for soil contamination information from local government reports. This database contained information on the source, history, and quality of the data together with the chemical information on the soil that was sampled. Phase 2 involved obtaining site investigation reports for developments within the study area and extracting the required information for the database. Phase 3 was the data analysis and interpretation of key contaminants to evaluate typical levels of contaminants, their distribution within the study area, and how these results relate to current guideline levels of risk for future site users. Preliminary results for a pilot study using a sample of the dataset have been obtained. The pilot study showed that there is some inconsistency in the quality of the reports and measured data, and careful interpretation of the data is required. Analysis of the information found high levels of lead in shallow soil samples, with mean and median levels exceeding the current guidance for residential use. The data also showed elevated (but below guidance) levels of potentially carcinogenic polyaromatic hydrocarbons. Of particular concern was the high detection rate for asbestos fibers, which were found at low concentrations in 25% of the soil samples tested (although the sample set was small). Contamination levels of the remaining chemicals tested were all below the guidance levels for residential site use. These preliminary pilot study results will be expanded, and results for the whole local government area will be presented at the conference. The pilot study has demonstrated the potential for this extensive dataset to provide greater information on local contamination levels. This can help inform regulators and developers and lead to more targeted site investigations, improved risk assessments, and better brownfield development.
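A minimal sketch of the Phase 3 style of analysis (summary statistics and exceedance/detection rates over a hypothetical extract of the database; the concentrations and the screening value are invented placeholders, not the Surrey data or current UK guidance):

```python
import pandas as pd

# Hypothetical extract from the site-investigation database: one row per soil
# sample, with concentrations in mg/kg and a flag for asbestos fibre detection.
samples = pd.DataFrame({
    "site": ["A", "A", "B", "B", "C", "C", "D", "D"],
    "lead_mg_kg": [310, 95, 520, 180, 60, 410, 220, 130],
    "benzo_a_pyrene_mg_kg": [1.2, 0.4, 2.8, 0.9, 0.3, 1.9, 1.1, 0.6],
    "asbestos_detected": [True, False, True, False, False, True, False, False],
})

GUIDELINE_LEAD = 200        # illustrative residential screening value, mg/kg
summary = samples["lead_mg_kg"].agg(["mean", "median"])
exceed_rate = (samples["lead_mg_kg"] > GUIDELINE_LEAD).mean()
asbestos_rate = samples["asbestos_detected"].mean()

print(summary)
print(f"lead samples above guideline: {exceed_rate:.0%}")
print(f"asbestos detection rate: {asbestos_rate:.0%}")
```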

Keywords: Brownfield development, contaminated land, local government planning data, site investigation

Procedia PDF Downloads 128