Search results for: standard methodology.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3022

472 Feasibility Study of a Solar Farm Project with an Executive Approach

Authors: Amir Reza Talaghat

Abstract:

Since 2015, a new policy on protecting energy resources and using renewable energies has been pursued in Iran, under which new projects have been developed. Investigation of the feasibility of these new projects helped to identify five steps for preparing an executive feasibility study of the concerned projects: proper site selection, authorizations, design and simulation, economic study, and programming, respectively. The results are interesting and essential for decision makers and investors who wish to start implementing these projects under reliable conditions. The research is based on the collection and study of the projects' documents, as well as recalculation to check the conformity of the results with GIS data and the technical information of the bidders. This paper describes the results of the research by presenting the five steps as an executive methodology for preparing a feasibility study of installing a 10 MW solar farm project. The corresponding results, which also help decision makers to start similar projects, are explained as follows: selecting the best location for the concerned PV plant; reliable and safe conditions for investment and the required authorizations to start implementing the solar farm project in the concerned region; selecting suitable components to achieve the best possible performance of the plant; the economic profit of the investment; and proper programming to implement the project on time.
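
As a minimal illustration of the economic-study step, the following Python sketch computes a simple payback period for a 10 MW plant; all capital cost, yield and tariff figures are hypothetical assumptions, not values from the study.

# Illustrative economic screening for a 10 MW PV plant (all figures are
# hypothetical assumptions, not values from the paper).
def simple_payback(capacity_mw, capex_per_kw, specific_yield_kwh_per_kwp,
                   tariff_per_kwh, opex_share=0.01):
    capex = capacity_mw * 1000 * capex_per_kw                 # total investment
    annual_energy = capacity_mw * 1000 * specific_yield_kwh_per_kwp
    annual_revenue = annual_energy * tariff_per_kwh
    annual_opex = capex * opex_share                          # O&M as a share of capex
    return capex / (annual_revenue - annual_opex)             # years to recover capex

years = simple_payback(capacity_mw=10, capex_per_kw=800,
                       specific_yield_kwh_per_kwp=1800, tariff_per_kwh=0.09)
print(f"Simple payback: {years:.1f} years")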

Keywords: Solar farm, solar energy, execution of PV power plant, PV power plant, feasibility study.

471 Legal Doctrine on Rylands v. Fletcher: One more time on Feasibility of a General Clause of Strict Liability in the UK

Authors: Maria Lubomira Kubica

Abstract:

The paper traces the birth and evolution of the British precedent Rylands v. Fletcher which, once adopted on the other side of the ocean (in the United States), gave rise to a general clause of liability for abnormally dangerous activities recognized by §20 of the American Restatement of the Law Third, Liability for Physical and Emotional Harm. The main goal of the paper was to analyze the development of the legal doctrine and of the case law subsequent to the precedent, together with the attempts of the British judicature to leapfrog from the traditional rule contained in Rylands v. Fletcher to a general clause similar to that introduced in the United States and, recently, also at the European level. As is well known, within the scope of tort law two different initiatives compete with the aim of harmonizing European laws: the European Group on Tort Law with its Principles of European Tort Law (hereinafter PETL), in which article 5:101 sets forth a general clause of strict liability for abnormally dangerous activities, and the Study Group on a European Civil Code with its Common Frame of Reference (CFR), which promotes a rather ad hoc model of listing out determined cases of strict liability. The very narrow scope of application of art. 5:101 PETL, restricted only to abnormally dangerous activities, stands in opposition to the very broad spectrum of strict liability cases governed by the CFR. The former is a perfect example of a general clause that offers a minimum and basic standard, possibly acceptable also in those countries in which, as in the United Kingdom, this regime of liability is completely marginalized.

Keywords: Abnormally dangerous activities, general clause, Rylands v. Fletcher, strict liability.

470 Integrated Wastewater Reuse Project of the Faculty of Sciences Ain Chock, Morocco

Authors: Nihad Chakri, Btissam El Amrani, Faouzi Berrada, Fouad Amraoui

Abstract:

In Morocco, water scarcity requires the exploitation of non-conventional resources. Rural areas are under-equipped with sanitation infrastructure, unlike urban areas. Decentralized and low-cost solutions could improve the quality of life of the population and the environment. In this context, the Faculty of Sciences Ain Chock (FSAC) has undertaken an integrated project to treat part of its wastewater using a decentralized compact system. The project will propose alternative solutions that are inexpensive and adapted to the context of peri-urban and rural areas in order to treat the wastewater generated and to use it for irrigation, watering and cleaning. For this purpose, several tests were carried out in the laboratory in order to develop a liquid waste treatment system optimized for local conditions. Based on the results obtained at laboratory scale of the different proposed scenarios, we designed and implemented a prototype of a mini wastewater treatment plant for the faculty. In this article, we will outline the steps of dimensioning, construction and monitoring of the mini-station in our faculty.

Keywords: Wastewater, purification, response surface methodology optimization, vertical filter, Moving Bed Biofilm Reactors, MBBR process, sizing, prototype, Faculty of Sciences Ain Chock, decentralized approach, mini wastewater treatment plant, reuse of treated wastewater, irrigation, sustainable development.

469 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea

Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti

Abstract:

This paper introduces the applicability of underwater photogrammetric survey within challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry has been attempted on underwater archaeological sites since at least the 1970s, and today the production of traditional 3D models is becoming common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. Therefore, it tends to be applied in bright environments when underwater visibility is > 1 m, which restricts its use on most submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and improvements in optical technology ideal for darker environments. Such developments, in tandem with powerful processing computing systems, have allowed underwater photogrammetry to be used by this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro Hero5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It shows that underwater photogrammetry can and should be used as one of the main recording methods even in low light and poor underwater conditions as a way to better understand the complexity of the underwater archaeological record.

Keywords: 4D modelling, Black Sea, maritime archaeology, underwater photogrammetry, Bronze Age, low visibility.

468 Evaluation of Non-Staggered Body-Fitted Grid Based Solution Method in Application to Supercritical Fluid Flows

Authors: Suresh Sahu, Abhijeet M. Vaidya, Naresh K. Maheshwari

Abstract:

The efforts to understand the heat transfer behavior of supercritical water in the supercritical water cooled reactor (SCWR) are ongoing worldwide to fulfill future energy demand. The higher thermal efficiency of these reactors compared to a conventional nuclear reactor is one of the driving forces attracting the attention of nuclear scientists. In this work, a solution procedure is described for solving supercritical fluid flow problems in complex geometries. The solution procedure is based on a non-staggered grid. All governing equations are discretized by the finite volume method (FVM) in a curvilinear coordinate system. Convective terms are discretized by a first-order upwind scheme, and a central difference approximation is used to discretize the diffusive parts. The k-ε turbulence model with standard wall function is employed. The SIMPLE solution procedure is implemented for the curvilinear coordinate system. Based on this solution method, a 3D Computational Fluid Dynamics (CFD) code has been developed. In order to demonstrate the capability of this CFD code for supercritical fluid flows, heat transfer to supercritical water in circular tubes is considered as a test problem. Results obtained by the code are compared with experimental results reported in the literature.
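
As an illustration of the discretization choices named here (first-order upwind for convection, central differences for diffusion), the following Python sketch solves a 1D steady convection-diffusion problem with the finite volume method; it is not the authors' 3D curvilinear SIMPLE code, and the domain, properties and boundary values are assumptions.

# 1D steady convection-diffusion FVM sketch: first-order upwind convection,
# central-difference diffusion, Dirichlet boundaries with half-cell spacing.
import numpy as np

n, L = 50, 1.0                       # number of cells, domain length (assumed)
dx = L / n
rho, u, gamma = 1.0, 1.5, 0.05       # density, velocity, diffusivity (assumed)
F, D = rho * u, gamma / dx           # convective / diffusive face strengths
phi_w, phi_e = 1.0, 0.0              # boundary values (assumed)

A = np.zeros((n, n)); b = np.zeros(n)
for i in range(n):
    aW = D + max(F, 0.0)             # upwind takes the upstream cell value
    aE = D + max(-F, 0.0)
    if i == 0:                       # west boundary face (distance dx/2)
        aWb = 2.0 * D + max(F, 0.0)
        A[i, i] = aWb + aE; A[i, i + 1] = -aE; b[i] = aWb * phi_w
    elif i == n - 1:                 # east boundary face
        aEb = 2.0 * D + max(-F, 0.0)
        A[i, i] = aW + aEb; A[i, i - 1] = -aW; b[i] = aEb * phi_e
    else:                            # interior cell
        A[i, i] = aW + aE; A[i, i - 1] = -aW; A[i, i + 1] = -aE

phi = np.linalg.solve(A, b)          # cell-centre solution
print(phi[:5])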

Keywords: Curvilinear coordinate, body-fitted mesh, momentum interpolation, non-staggered grid, supercritical fluids.

467 Information Dissemination System (IDS) Based E-Learning in Agricultural of Iran (Perception of Iranian Extension Agents)

Authors: A. R. Ommani, M. Chizari

Abstract:

The purpose of the study reported here was to design an Information Dissemination System (IDS) based e-learning in the agriculture of Iran. A questionnaire was developed for designing the Information Dissemination System. The questionnaire was distributed to 96 extension agents who work for the Management of Extension and Farming System of Khuzestan province of Iran. Data collected were analyzed using the Statistical Package for the Social Sciences (SPSS). Appropriate statistical procedures for description (frequencies, percentages, means, and standard deviations) were used. In this study, there was a significant relationship between age, IT skill and knowledge, years of extension work, the extent of information-seeking motivation, level of job satisfaction and level of education, and the use of information technology by extension agents. According to the extension agents, five factors were ranked respectively as the top five essential items for designing an Information Dissemination System (IDS) based e-learning in the agriculture of Iran. These factors are: 1) establish communication between farmers, coordinators (extension agents), agricultural experts, research centers, and the community by means of information technology; 2) the communication between all should be mutual; 3) the information must be based on farmers' needs; 4) the Internet should be used as a facility to transfer advanced agricultural information to the farming community; 5) farmers may be illiterate and speak a local language, and they are not expected to use the system directly. Knowledge produced by agricultural scientists must be transformed into a computer-understandable presentation. To design the Information Dissemination System, electronic communication in the agricultural society and rural areas must be developed. This communication must be mutual among all parties.
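
A minimal Python sketch of the descriptive procedures mentioned above (frequencies, percentages, means and standard deviations), using hypothetical survey columns rather than the study's data:

# Illustrative descriptive summary of Likert-type survey items; the column
# names and values are hypothetical, not the Khuzestan survey data.
import pandas as pd

df = pd.DataFrame({
    "it_skill":         [3, 4, 2, 5, 4, 3, 3, 5],   # 1-5 scores (assumed)
    "job_satisfaction": [4, 4, 3, 5, 2, 3, 4, 4],
})

for col in df.columns:
    freq = df[col].value_counts().sort_index()
    pct = 100 * freq / len(df)
    print(f"{col}: mean={df[col].mean():.2f}, sd={df[col].std():.2f}")
    print(pd.DataFrame({"frequency": freq, "percent": pct.round(1)}))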

Keywords: E-learning, information dissemination system, information technology.

466 Interaction Effect of Feed Rate and Cutting Speed in CNC-Turning on Chip Micro-Hardness of 304- Austenitic Stainless Steel

Authors: G. H. Senussi

Abstract:

The present work is concerned with the effect of turning process parameters (cutting speed, feed rate, and depth of cut) and distance from the center of the workpiece, as input variables, on the chip micro-hardness as the response or output. Three experiments were conducted to investigate the chip micro-hardness behavior at workpiece diameters of 30 mm, 40 mm, and 50 mm. Response surface methodology (RSM) is used to determine and present the cause-and-effect relationship between the true mean response and the input control variables influencing the response, as a two- or three-dimensional hyper-surface. RSM has been used for designing a three-factor, five-level central composite rotatable design in order to construct statistical models capable of accurate prediction of responses. The results obtained showed that the application of RSM can predict the effect of machining parameters on chip micro-hardness. The five-level factorial design can be employed easily for developing statistical models to predict chip micro-hardness from controllable machining parameters. The results also showed that the combined effect of cutting speed at its lower level, feed rate and depth of cut at their higher values, and a larger workpiece diameter results in increased chip micro-hardness.
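
The following Python sketch illustrates the RSM workflow described above: a three-factor central composite rotatable design (axial distance 1.682) and a second-order least-squares fit; the response values are simulated, not the measured chip micro-hardness.

# Three-factor central composite rotatable design and quadratic RSM fit.
# The response values are synthetic placeholders, not experimental data.
import itertools
import numpy as np

alpha = 1.682                                    # rotatable CCD for k = 3 factors
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([[ alpha, 0, 0], [-alpha, 0, 0],
                  [0,  alpha, 0], [0, -alpha, 0],
                  [0, 0,  alpha], [0, 0, -alpha]])
center = np.zeros((6, 3))                        # replicated centre points
X = np.vstack([factorial, axial, center])        # coded speed, feed, depth of cut

rng = np.random.default_rng(0)
y = 400 - 8 * X[:, 0] + 12 * X[:, 1] + 9 * X[:, 2] + rng.normal(0, 2, len(X))  # fake hardness

def quad_terms(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)   # second-order model
print("fitted coefficients:", np.round(beta, 2))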

Keywords: Machining Parameters, Chip Micro-Hardness, CNC Machining, 304-Austenitic Stainless Steel.

465 A Review on Recycled Use of Solid Wastes in Building Materials

Authors: Oriyomi M. Okeyinka, David A. Oloke, Jamal M. Khatib

Abstract:

Large quantities of solid waste are generated worldwide from sources such as household, domestic, industrial, commercial and construction-demolition activities, leading to environmental concerns. Utilization of these wastes in making building construction materials can reduce the magnitude of the associated problems. When these waste products are used in place of other conventional materials, natural resources and energy are preserved, and expensive and/or potentially harmful waste disposal is avoided. Recycling, regarded as the third most preferred waste disposal option, with its numerous environmental benefits, stands as a viable option to offset the environmental impact associated with the construction industry. This paper reviews the results of laboratory tests and important research findings, and the potential of using these wastes in building construction materials, with a focus on sustainable development. Research gaps have also been identified, including the need to develop standard mix designs for solid-waste-based building materials; the need to develop energy-efficient methods of processing solid waste for use in concrete; the need to study the actual behavior or performance of such building materials in practical application; and the limited real-life application of such building materials. A study is proposed to develop an environmentally friendly, lightweight building block from recycled waste paper, without the use of cement, and with properties suitable for use as a walling unit. This proposed research intends to incorporate laboratory experimentation and modeling to address the identified research gaps.

Keywords: Recycling, solid waste, construction, building materials.

464 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization

Authors: Hebberly Ahatlan

Abstract:

The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability — particularly in the context of modern energy business models, such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, Information Technology (IT) and Operational Technology (OT) convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper also proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.

Keywords: Digitalization, IT/OT convergence, semantic interoperability, TEIA alliance, VPP.

463 Building Blocks for the Next eGovernment Era: Exploratory Study Based on Dubai and UAE’s Ministry of Happiness Communication in 2020

Authors: Diamantino Ribeiro, António Pedro Costa, Jorge Remondes

Abstract:

Dubai and the UAE governments have been investing in technology and digital communication for a long time. These governments are pioneers in introducing innovative strategies, policies and projects. They are also recognized worldwide for defining and implementing long-term public programs. In terms of eGovernment, Dubai and the UAE rank among the world's most advanced. Both governments surprised the world a few years ago by creating a Ministry of Happiness. This paper focuses on the UAE government's digital strategies and its approach to the next era. The main goal of this exploratory study is to understand the new era of eGovernment and the transfer of the happiness and wellness programs. Data were collected from the corpus latente, and the analysis was anchored in qualitative methodology, using content analysis and observation as analysis techniques. The study highlighted that the 2020 government reshuffle has a strong focus on digital reorganisation and digital sustainability, one of the newest trends in sustainability. Regarding the happiness and wellbeing portfolio, we observed that there has been a major change within the government organisation: the Ministry of Happiness was abolished, and the Ministry of Community Development will manage the so-called 'Happiness Portfolio'. Additionally, our observation allowed us to note the government's dual approach to governance: one through digital transformation, thus enhancing the digital sustainability process, and the second through government development.

Keywords: Ministry of Happiness, eGovernment, communication, digital sustainability.

462 Initiative Strategies on How to Increasing Value Add of the Recycling Business

Authors: Yananda Siraphatthada

Abstract:

The current study is the continuation of a previous study on value-added management of the recycling business. Its aims are to 1) explore conditions for increasing the value added of the Thai recycling business, and 2) examine the implementation of the three-staged plan (short, medium, and long term) suggested by the former study to increase the value added of the recycling business, as an immediate mechanism to accelerate government operation. Quantitative and qualitative methods were utilized in this research. The qualitative research consisted of in-depth interviews and focus group discussions. Responses were obtained from owners of waste separation plants and recycle shops, as well as officers in relevant governmental agencies, randomly selected via quota sampling. These data were analyzed via content analysis. The sample used for the quantitative method consisted of 1,274 licensed recycling operators in eight provinces, selected via stratified random sampling. Data were analyzed via descriptive statistics: frequency, percentage, mean and standard deviation. The study recommends a three-staged plan (short, medium, and long term) that includes the development of logistics, the provision of quality markets/plants, the amendment of recycling rules/regulations, the restructuring of the recycling business, the establishment of a green-purchasing recycling center, support for the campaigns run by the International Green Purchasing Network (IGPN), and conferences/workshops as a public forum to share insights among experts and concerned people.

Keywords: Strategies, Value Added, Recycle Business.

461 An Activity Based Trajectory Search Approach

Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar

Abstract:

With the enormous increase in mobile application use and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based services are required. Location recommendation is one of the most demanded location-aware applications, especially with the wide availability of location-aware social network applications including Facebook check-ins, Foursquare, and others. In this paper, we aim to present a new methodology for location recommendation. The proposed approach combines customary spatial attributes with other essential components including shortest distance and user interests. We also present a new concept, the "activity trajectory", which represents a trajectory that fulfills the set of activities that the user is interested in doing. The proposed approach uses the associated distance value to select the trajectory(ies) with minimum cost (distance) and uses the spatial area to prune unneeded trajectories. The proposed algorithm utilizes the idea of the activity trajectory to recommend the N most similar trajectories that match the user's required activity pattern with the least traveling distance. To improve the performance of the proposed approach, parallel processing is applied through the employment of a MapReduce-based approach. Experiments based on real data sets were set up and run to assess the proposed approach. The presented tests indicate how the proposed approach outperforms other strategies, giving better precision and run time.
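
A minimal Python sketch of the core idea, scoring candidate activity trajectories by activity coverage and travel distance and returning the top N; it is a simplified stand-in for the MapReduce implementation, with hypothetical data and weights.

# Rank candidate trajectories: a trajectory must cover the requested activities,
# and shorter total travel distance scores higher. Data and weights are assumed.
import math

def length(traj):
    return sum(math.dist(a, b) for a, b in zip(traj["points"], traj["points"][1:]))

def score(traj, wanted, w_dist=1.0):
    covered = len(wanted & traj["activities"])       # requested activities fulfilled
    if covered < len(wanted):                        # prune incomplete matches
        return float("-inf")
    return covered - w_dist * length(traj)           # prefer short, complete trips

candidates = [
    {"points": [(0, 0), (1, 1), (2, 1)], "activities": {"coffee", "museum"}},
    {"points": [(0, 0), (3, 4), (6, 8)], "activities": {"coffee", "museum", "park"}},
]
wanted = {"coffee", "museum"}
top_n = sorted(candidates, key=lambda t: score(t, wanted), reverse=True)[:1]
print(top_n)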

Keywords: Location-based recommendation, map-reduce, recommendation system, trajectory search.

460 Indian License Plate Detection and Recognition Using Morphological Operation and Template Matching

Authors: W. Devapriya, C. Nelson Kennedy Babu, T. Srihari

Abstract:

Automatic License Plate Recognition (ALPR) is a technology which recognizes the registration plate, number plate or license plate of a vehicle. In this paper, an Indian vehicle number plate is extracted and the characters are recognized in an efficient manner. ALPR involves four major techniques: i) pre-processing, ii) license plate location identification, iii) individual character segmentation, and iv) character recognition. The opening phase, pre-processing, helps to remove noise and enhances the quality of the image using morphological operations and image subtraction. The second phase, the most challenging stage, ascertains the location of the license plate using Canny edge detection, dilation and erosion. In the third phase, each character is segmented by a Connected Component Approach (CCA), and in the final phase, each segmented character is recognized using cross-correlation template matching, a scheme specifically appropriate for fixed formats. Major applications of ALPR are toll collection, border control, parking, stolen car detection, enforcement, access control and traffic control. A database consisting of 500 car images taken under dissimilar lighting conditions is used. The efficiency of the system is 97%. Our future focus is Indian vehicle license plate validation (whether the license plate of a vehicle conforms to the Road Transport and Highways standard).
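
A hedged OpenCV sketch of the four-stage pipeline (pre-processing, plate localization, character segmentation, template matching); thresholds, kernel sizes and file names are assumptions, not the authors' settings.

# Sketch of the ALPR stages named in the abstract; input image and template
# file names are hypothetical placeholders.
import cv2
import numpy as np

img = cv2.imread("car.jpg")                               # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# 1) Pre-processing: morphological opening + image subtraction to suppress background
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
opened = cv2.morphologyEx(gray, cv2.MORPH_OPEN, kernel)
enhanced = cv2.subtract(gray, opened)

# 2) Plate localization: Canny edges, then dilation/erosion to form a plate blob
edges = cv2.Canny(enhanced, 100, 200)
blob = cv2.erode(cv2.dilate(edges, None, iterations=3), None, iterations=1)
contours, _ = cv2.findContours(blob, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
plate = gray[y:y + h, x:x + w]

# 3) Character segmentation via connected components on the binarized plate
_, binary = cv2.threshold(plate, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
chars = [binary[t:t + hh, l:l + ww]
         for l, t, ww, hh, area in stats[1:] if area > 50]   # skip background label

# 4) Recognition by normalized cross-correlation against stored character templates
templates = {c: cv2.imread(f"templates/{c}.png", 0) for c in "0123456789"}  # hypothetical
def recognize(ch):
    ch = cv2.resize(ch, (20, 30))
    scores = {c: cv2.matchTemplate(ch, cv2.resize(t, (20, 30)),
                                   cv2.TM_CCOEFF_NORMED).max()
              for c, t in templates.items()}
    return max(scores, key=scores.get)

print("".join(recognize(c) for c in chars))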

Keywords: Automatic License plate recognition, Character recognition, Number plate Recognition, Template matching, morphological operation, canny edge detection.

459 Study of Encapsulation of Quantum Dots in Polystyrene and Poly (E-Caprolactone)Microreactors Prepared by Microvolcanic Eruption of Freeze Dried Microspheres

Authors: Deepak Kukkar, Inderpreet Kaur, Jagtar Singh, Lalit M Bharadwaj

Abstract:

Polymeric microreactors have emerged as a new generation of carriers that hold tremendous promise in the areas of cancer therapy, controlled delivery of drugs, removal of pollutants, etc. The present work reports a simple and convenient methodology for the synthesis of polystyrene and polycaprolactone microreactors. An aqueous suspension of carboxylated (1 μm) polystyrene latex particles was mixed with toluene solution, followed by freezing with liquid nitrogen. The frozen particles were incubated at -20°C and characterized for the formation of voids on the surface of the polymer microspheres by field emission scanning electron microscopy. The hollow particles were then incubated overnight at 40°C with unfunctionalized quantum dots (QDs) in a 5:1 ratio. QD-encapsulated polystyrene microcapsules were characterized by fluorescence microscopy. Likewise, poly(ε-caprolactone) microreactors were prepared by micro-volcanic rupture of freeze-dried microspheres synthesized by emulsification of the polymer with aqueous poly(vinyl alcohol) and frozen with liquid nitrogen. The microreactors were examined by field emission scanning electron microscopy for size and morphology. The current study is an attempt to create hollow polymer particles which can be employed for the microencapsulation of nanoparticles and drug molecules.

Keywords: FE-SEM, Microreactors, Microvolcanic rupture, Poly (ε-caprolactone), Polystyrene

458 Multi-Objective Evolutionary Computation Based Feature Selection Applied to Behaviour Assessment of Children

Authors: F. Jiménez, R. Jódar, M. Martín, G. Sánchez, G. Sciavicco

Abstract:

Attribute or feature selection is one of the basic strategies to improve the performance of data classification tasks and, at the same time, to reduce the complexity of classifiers, and it is a particularly fundamental one when the number of attributes is relatively high. Its application to unsupervised classification is restricted to a limited number of experiments in the literature. Evolutionary computation has already proven itself to be a very effective choice to consistently reduce the number of attributes towards a better classification rate and a simpler semantic interpretation of the inferred classifiers. We present a feature selection wrapper model composed of a multi-objective evolutionary algorithm, the clustering method Expectation-Maximization (EM), and the classifier C4.5, for the unsupervised classification of data extracted from a psychological test named BASC-II (Behavior Assessment System for Children - II ed.), with two objectives: maximizing the likelihood of the clustering model and maximizing the accuracy of the obtained classifier. We present a methodology to integrate feature selection for unsupervised classification, model evaluation, decision making (to choose the most satisfactory model according to an a posteriori process in a multi-objective context), and testing. We compare the performance of the classifiers obtained by the multi-objective evolutionary algorithms ENORA and NSGA-II, and the best solution is then validated by the psychologists that collected the data.
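
The wrapper evaluation can be sketched as follows in Python: for a candidate feature subset, fit an EM clustering model and measure both its log-likelihood and the cross-validated accuracy of a decision tree trained on the cluster labels. A random subset search stands in for ENORA/NSGA-II, the tree stands in for C4.5, and the data are synthetic.

# Two-objective evaluation of feature subsets: EM clustering log-likelihood and
# accuracy of a tree classifier on the cluster labels. Simplified stand-in only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, _ = make_classification(n_samples=300, n_features=20, random_state=0)
rng = np.random.default_rng(0)

def evaluate(mask):
    Xs = X[:, mask]
    gm = GaussianMixture(n_components=3, random_state=0).fit(Xs)   # EM clustering
    labels = gm.predict(Xs)
    loglik = gm.score(Xs)                                          # objective 1
    acc = cross_val_score(DecisionTreeClassifier(random_state=0),  # objective 2
                          Xs, labels, cv=3).mean()
    return loglik, acc

candidates = [rng.random(X.shape[1]) < 0.4 for _ in range(20)]     # random subsets
results = sorted(((evaluate(m), m) for m in candidates if m.any()),
                 key=lambda r: r[0], reverse=True)
for (loglik, acc), m in results[:3]:
    print(f"features={m.sum():2d}  loglik={loglik:7.2f}  accuracy={acc:.3f}")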

Keywords: Feature selection, multi-objective evolutionary computation, unsupervised classification, behavior assessment system for children.

457 Implementation of a Paraconsistent-Fuzzy Digital PID Controller in a Level Control Process

Authors: H. M. Côrtes, J. I. Da Silva Filho, M. F. Blos, B. S. Zanon

Abstract:

In modern society, the rising level of quality demanded in industrial production calls for new control techniques and machinery automation. In this context, this work presents the implementation of a Paraconsistent-Fuzzy Digital PID controller. The controller is based on the treatment of inconsistencies in both Paraconsistent Logic and Fuzzy Logic. A paraconsistent analysis is performed on the signals applied to the system inputs using concepts from the Paraconsistent Annotated Logic with annotation of two values (PAL2v). The signals resulting from the paraconsistent analysis are two values, defined as Dc - Degree of Certainty and Dct - Degree of Contradiction, which receive a treatment according to Fuzzy Logic theory; the resulting output of the logic actions is a single value, called the crisp value, which is used to control the dynamic system. The application of the proposed model is demonstrated through an example. Initially, the Paraconsistent-Fuzzy Digital PID controller was built and tested in an isolated MATLAB environment and then compared to the equivalent Digital PID function of this software for a standard step excitation. After this step, a level control plant was modeled to execute the controller function on a physical model, making the tests closer to actual conditions. For this, the control parameters (proportional, integral and derivative) were determined for the configuration of the conventional Digital PID controller and of the Paraconsistent-Fuzzy Digital PID, and the control loops were assembled in MATLAB with the respective transfer function of the plant. Finally, the results of the comparison of the level control process between the Paraconsistent-Fuzzy Digital PID controller and the conventional Digital PID controller are presented.
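
A minimal Python sketch of the idea: the PAL2v stage turns favorable/unfavorable evidence (mu, lambda) into a Degree of Certainty and a Degree of Contradiction (commonly Dc = mu - lambda and Dct = mu + lambda - 1), and the contradiction level is used here to de-rate a digital PID output. The fuzzy stage is reduced to this single weighting, so this is an illustration, not the authors' controller.

# PAL2v annotation pre-processing feeding a discrete PID step; the certainty/
# contradiction formulas follow the usual PAL2v convention and are assumptions
# with respect to this specific paper.
def pal2v(mu, lam):
    dc = mu - lam             # Degree of Certainty
    dct = mu + lam - 1.0      # Degree of Contradiction
    return dc, dct

class DigitalPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = DigitalPID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
setpoint, level = 1.0, 0.4                      # hypothetical level-control values
mu, lam = 0.9, 0.2                              # evidence extracted from the sensors
dc, dct = pal2v(mu, lam)
crisp = (1.0 - abs(dct)) * pid.step(setpoint - level)   # de-rate output when contradictory
print(f"Dc={dc:.2f}, Dct={dct:.2f}, control signal={crisp:.3f}")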

Keywords: Fuzzy logic, paraconsistent annotated logic, level control, digital PID.

456 An Approach to Secure Mobile Agent Communication in Multi-Agent Systems

Authors: Olumide Simeon Ogunnusi, Shukor Abd Razak, Michael Kolade Adu

Abstract:

The inter-agent communication manager facilitates communication among mobile agents via a message-passing mechanism. Until now, all Foundation for Intelligent Physical Agents (FIPA) compliant agent systems have been capable of exchanging messages following the standard format for sending and receiving messages. Previous works tend to secure messages to be exchanged among a community of collaborative agents, commissioned to perform specific tasks, using cryptosystems. However, this approach is characterized by computational complexity due to the encryption and decryption processes required at the two ends. The proposed approach to secure agent communication allows only agents that are created by the host agent server to communicate via the agent communication channel provided by the host agent platform. These agents are assumed to be harmless. Therefore, to secure the communication of legitimate agents from intrusion by external agents, a 2-phase policy enforcement system was developed. The first phase constrains the external agent to run only on the network server, while the second phase confines the activities of the external agent to its execution environment. To implement the proposed policy, a controller agent was charged with the task of screening any external agent entering the local area network and preventing it from migrating to the agent execution host where the legitimate agents are running. On arrival of the external agent at the host network server, an introspector agent was charged to monitor and restrain its activities. This approach secures legitimate agent communication from Man-in-the-Middle and replay attacks.

Keywords: Agent communication, introspective agent, isolation of agent, policy enforcement system.

455 Numerical Modeling of Determination of in situ Rock Mass Deformation Modulus Using the Plate Load Test

Authors: A. Khodabakhshi, A. Mortazavi

Abstract:

Accurate determination of the rock mass deformation modulus, as an important design parameter, is one of the most controversial issues in most engineering projects. A 3D numerical model of the standard plate load test (PLT) using the FLAC3D code was carried out to investigate the mechanism governing the test process. Five objectives were the focus of this study. The first goal was to employ 3D modeling in the interpretation of the PLT conducted at the Bazoft dam site, Iran. The second objective was to investigate the effect of the depth at which displacements are measured below the loading plates on the calculated moduli. The magnitude of the rock mass deformation modulus calculated from the PLT depends on the anchor depth, and in practice, this may be a cause of error in the selection of a realistic deformation modulus for the rock mass. The third goal of the study was to investigate the effect of the loading plate diameter on the calculated modulus. Moreover, a comparison of the modulus calculated from the ISRM formula, from the numerical modeling, and from the actual PLT carried out at the right abutment of the Bazoft dam site was another objective of the study. Finally, the effect of plastic strains on the calculated moduli in each of the loading-unloading cycles for three loading plates was investigated. The geometry, material properties, and boundary conditions of the constructed 3D model were selected based on the in-situ conditions of the PLT at the Bazoft dam site. Good agreement was achieved between the numerical model results and the field test results.

Keywords: Deformation modulus, numerical model, plate loading test, rock mass.

454 A Perceptually Optimized Wavelet Embedded Zero Tree Image Coder

Authors: A. Bajit, M. Nahid, A. Tamtaoui, E. H. Bouyakhf

Abstract:

In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that introduces a perceptual weighting of the wavelet transform coefficients prior to SPIHT encoding, in order to reach a targeted bit rate with an improvement in perceptual quality with respect to the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the HVS, which plays an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of human visual system (HVS) perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF) to achieve the perceptual decomposition weighting, and 3) the Wavelet Error Sensitivity (WES) used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique. However, the experimental results show that our coder demonstrates very good performance in terms of quality measurement.
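
A hedged Python sketch of per-subband perceptual weighting before zerotree coding, using PyWavelets' bior4.4 (a 9/7-type biorthogonal filter); the weights are placeholders rather than the CSF/WES weights derived in the paper, and SPIHT itself is not implemented here.

# Decompose an image, scale each detail subband by a CSF-like weight, and show
# that inverting the weights recovers the image. Weights are hypothetical.
import numpy as np
import pywt

image = np.random.rand(256, 256)                     # stand-in for a test image
coeffs = pywt.wavedec2(image, "bior4.4", level=3)    # 9/7-type biorthogonal filter

# Hypothetical weights per level for the (horizontal, vertical, diagonal) bands:
weights = {1: (0.9, 0.9, 0.7), 2: (1.0, 1.0, 0.9), 3: (1.1, 1.1, 1.0)}

weighted = [coeffs[0]]                               # keep the approximation band
for level, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    wH, wV, wD = weights[level]
    weighted.append((cH * wH, cV * wV, cD * wD))

# After (SPIHT) coding/decoding, inverse weights would be applied before
# reconstruction; here we simply undo the weighting and rebuild the image.
restored = [weighted[0]] + [tuple(b / w for b, w in zip(bands, weights[lvl]))
                            for lvl, bands in enumerate(weighted[1:], start=1)]
print(np.allclose(pywt.waverec2(restored, "bior4.4"), image))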

Keywords: DWT, linear-phase 9/7 filter, 9/7 Wavelets Error Sensitivity WES, CSF implementation approaches, JND Just Noticeable Difference, Luminance masking, Contrast masking, standard SPIHT, Objective Quality Measure, Probability Score PS.

453 Tourism and Urban Planning for Intermediate Cities: An Empirical Approach toward Cultural Heritage Conservation in Damavand, Iran

Authors: E. Ghabouli

Abstract:

Intermediate cities, also called medium-sized cities, have an important role in the process of globalization. It is argued that, in some cases, this type of city may be depopulated or, otherwise, may be transformed into the periphery of a metropolis, so that the identity of the city and its local cultural heritage could suffer from the neighboring metropolis. Over the last decades, the role of tourism in the development process and in cultural heritage has increased. The impact of tourism on socioeconomic growth motivates the study of tourism development in the regional and urban planning process. There is evidence that tourism has a positive impact on local development and creates economic motivation for cultural heritage protection. In this study, by considering the role of tourism in local development, especially its economic and socio-cultural impacts, we introduce a strategy for tourism development through a method of urban planning for intermediate cities called the Base plan. Damavand is an intermediate city located in Tehran province, Iran, with high tourism potential owing to its specific local characteristics such as social structure, antiquities and natural attractions. It is selected as a suitable case study for the intended strategy, which is a combination of urban planning and tourism development methods. Focusing on the recognition of the historical and cultural heritage of Damavand, this paper prepares, through the "base plan methodology", an urban planning strategy toward tourism development, in order to make tourism development a support for the cultural heritage of this city.

Keywords: Urban planning, tourism, cultural heritage, intermediate cities.

452 Hybrid Equity Warrants Pricing Formulation under Stochastic Dynamics

Authors: Teh Raihana Nazirah Roslan, Siti Zulaiha Ibrahim, Sharmila Karim

Abstract:

A warrant is a financial contract that confers the right, but not the obligation, to buy or sell a security at a certain price before expiration. The standard procedure of valuing equity warrants using call option pricing models such as the Black-Scholes model has been shown to contain many flaws, such as the assumption of constant interest rate and constant volatility. In fact, existing alternative models were found to focus more on demonstrating techniques for pricing rather than on empirical testing. Therefore, a mathematical model for pricing and analyzing equity warrants which comprises stochastic interest rate and stochastic volatility is essential to incorporate the dynamic relationships between the identified variables and to reflect the real market. Here, the aim is to develop dynamic pricing formulations for hybrid equity warrants by incorporating stochastic interest rates from the Cox-Ingersoll-Ross (CIR) model along with stochastic volatility from the Heston model. The development of the model involves the derivation of the stochastic differential equations that govern the model dynamics. The resulting equations, which involve a Cauchy problem and heat equations, are then solved using partial differential equation approaches. The analytical pricing formulas obtained in this study comply with the form of the analytical expressions embedded in the Black-Scholes model and other existing pricing models for equity warrants. This facilitates the practicality of the proposed formula for comparison purposes and further empirical study.
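
As a numerical cross-check of the kind one could run against such analytical formulas, the following Python sketch prices a call-type warrant payoff by Monte Carlo under Heston variance and CIR rates with a simple Euler scheme; all parameter values are illustrative, and warrant-specific dilution effects are ignored here.

# Euler Monte Carlo under Heston stochastic volatility and CIR stochastic rates.
# Parameters are illustrative assumptions, not calibrated values from the paper.
import numpy as np

rng = np.random.default_rng(1)
S0, v0, r0, K, T = 100.0, 0.04, 0.03, 100.0, 1.0
kappa_v, theta_v, sigma_v = 2.0, 0.04, 0.3       # Heston variance parameters
kappa_r, theta_r, sigma_r = 1.5, 0.03, 0.1       # CIR short-rate parameters
rho = -0.7                                       # spot/variance correlation
n_paths, n_steps = 50_000, 200
dt = T / n_steps

S = np.full(n_paths, S0); v = np.full(n_paths, v0); r = np.full(n_paths, r0)
discount = np.zeros(n_paths)
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    z3 = rng.standard_normal(n_paths)
    discount += r * dt
    S *= np.exp((r - 0.5 * v) * dt + np.sqrt(v * dt) * z1)
    v = np.abs(v + kappa_v * (theta_v - v) * dt + sigma_v * np.sqrt(v * dt) * z2)   # crude positivity fix
    r = np.abs(r + kappa_r * (theta_r - r) * dt + sigma_r * np.sqrt(r * dt) * z3)

price = np.mean(np.exp(-discount) * np.maximum(S - K, 0.0))
print(f"Monte Carlo warrant (call) value: {price:.3f}")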

Keywords: Cox-Ingersoll-Ross model, equity warrants, Heston model, hybrid models, stochastic.

451 Quality Management in Spice Paprika Production as a Synergy of Internal and External Quality Measures

Authors: É. Kónya, E. Szabó, I. Bata-Vidács, T. Deák, M. Ottucsák, N. Adányi, A. Székács

Abstract:

Spice paprika is a major spice commodity in the European Union (EU), produced locally and imported from non-EU countries, and reported not only for chemical and microbiological contamination but also for fraud. The effective interaction between producers' quality management practices and governmental and EU activities is described using the example of spice paprika production and control in Hungary, a leading spice paprika producer and per capita consumer in Europe. To demonstrate the importance of various contamination factors in the Hungarian production and EU trade of spice paprika, several aspects concerning the food safety of this commodity are presented. Alerts in the Rapid Alert System for Food and Feed (RASFF) of the EU between 2005 and 2013, as well as Hungarian state inspection results on spice paprika in 2004, are discussed, and quality non-compliance claims regarding spice paprika among EU member states are summarized by means of network analysis. Quality assurance measures established along the spice paprika production technology chain at the leading Hungarian spice paprika manufacturer, Kalocsai Fűszerpaprika Zrt., are surveyed, with the main critical control points identified. The structure and operation of the Hungarian state food safety inspection system are described. The concerted performance of the latter two quality management systems illustrates the effective interaction between internal (manufacturer) and external (state) quality control measures.

Keywords: Spice paprika, quality control, reporting mechanisms, RASFF, vulnerable points, HACCP, BRC Global Standard.

450 Robust Integrated Design for a Mechatronic Feed Drive System of Machine Tools

Authors: Chin-Yin Chen, Chi-Cheng Cheng

Abstract:

This paper aims to develop a robust optimization methodology for the mechatronic modules of machine tools by considering all important characteristics from the structural and control domains in one single process. The relationship between these two domains is strongly coupled. In order to reduce the disturbance caused by parameters in either one, the mechanical and controller design domains need to be integrated. Therefore, the concurrent integrated design method Design For Control (DFC) is employed in this paper. In this connection, it is applied not only to achieve minimal power consumption but also to enhance structural performance and system response at the same time. To investigate the method for integrated optimization, a mechatronic feed drive system of a machine tool is used as a design platform. Pro/ENGINEER and ANSYS are first used to build the 3D model and to analyze and design structural parameters such as elastic deformation, natural frequency and component size, based on their effects on and sensitivities to the structure. In addition, a robust controller based on Quantitative Feedback Theory (QFT) is applied to determine proper control parameters for the controller. Therefore, the overall physical properties of the machine tool are obtained in the initial stage. Finally, the design-for-control technique is carried out to modify the structural and control parameters to achieve overall system performance. Hence, the corresponding productivity is expected to be greatly improved.

Keywords: Machine tools, integrated structure and control design, design for control, multilevel decomposition, quantitative feedback theory.

449 Info-participation of the Disabled Using the Mixed Preference Data in Improving Their Travel Quality

Authors: Y. Duvarci, S. Mizokami

Abstract:

Today, the preferences and participation of transportation disadvantaged (TD) groups such as the elderly and the disabled are still lacking in the decision-making of transportation planning, and their reactions to certain types of policies are not well known. Thus, a clear methodology is needed. This study aimed to develop a method to extract the preferences of the disabled to be used in the policy-making stage that can also guide future estimations. The method utilizes a combination of cluster analysis and data filtering, using data from Arao City (Japan). The method is a process that follows: defining the TD group with the cluster analysis tool; compiling their travel preferences in tabular form from the household surveys by policy variable-impact pairs, by zones, and by trip purposes; and producing, as the final outcome, the preference probabilities of the disabled. The preferences vary by trip purpose. For work trips, accessibility and transit system quality policies are preferred, with the accompanying impacts of modal shifts towards public modes, decreasing travel costs, and an increase in trip rates; for social trips, the same accessibility and transit system policies lead to the same mode shift impact, together with the travel quality policy area leading to a trip rate increase. These results identify the policies to focus on and can be used for scenario generation in models, or for any other planning purpose, as a decision support tool.

Keywords: Transportation Disadvantaged, Disabled, Mixed Preference, Stated Preference Data.

448 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

Digital Twin has emerged as a compelling research area, capturing the attention of scholars over the past decade. It finds applications across diverse fields, including smart manufacturing and healthcare, offering significant time and cost savings. Notably, it often intersects with other cutting-edge technologies such as Data Mining, Artificial Intelligence, and Machine Learning. However, the concept of a Human Digital Twin (HDT) is still in its infancy and requires further demonstration of its practicality. HDT takes the notion of Digital Twin a step further by extending it to living entities, notably humans, who are vastly different from inanimate physical objects. The primary objective of this research was to create an HDT capable of automating real-time human responses by simulating human behavior. To achieve this, the study delved into various areas, including clustering, supervised classification, topic extraction, and sentiment analysis. The paper successfully demonstrated the feasibility of HDT for generating personalized responses in social messaging applications. Notably, the proposed approach achieved an overall accuracy of 63%, a highly promising result that could pave the way for further exploration of the HDT concept. The methodology employed Random Forest for clustering the question database and matching new questions, while K-nearest neighbor was utilized for sentiment analysis.

Keywords: Human Digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification and clustering.

447 Performance Analysis of Reconstruction Algorithms in Diffuse Optical Tomography

Authors: K. Uma Maheswari, S. Sathiyamoorthy, G. Lakshmi

Abstract:

Diffuse Optical Tomography (DOT) is a non-invasive imaging modality used in clinical diagnosis for earlier detection of carcinoma cells in brain tissue. It is a form of optical tomography which produces a reconstructed image of human soft tissue by using near-infrared light. It comprises two steps, called the forward model and the inverse model. The forward model describes the light propagation in a biological medium. The inverse model uses the scattered light to recover the optical parameters of human tissue. DOT suffers from severe ill-posedness due to its incomplete measurement data, so the accurate analysis of this modality is very complicated. To overcome this problem, optical properties of the soft tissue, such as the absorption coefficient, scattering coefficient and optical flux, are processed by the standard regularization technique called Levenberg-Marquardt regularization. The reconstruction algorithms Split Bregman and Gradient Projection for Sparse Reconstruction (GPSR) are used to reconstruct the image of human soft tissue for tumour detection. Among these algorithms, the Split Bregman method provides better performance than the GPSR algorithm. Parameters such as the signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), relative error (RE) and CPU time for reconstructing images are analyzed to assess the performance.
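
A small Python sketch of the quality metrics named above (SNR, CNR, relative error), computed for a reconstruction against a reference; the images and region masks are synthetic placeholders, not DOT data.

# Image-quality metrics for a reconstructed image versus a reference phantom.
import numpy as np

rng = np.random.default_rng(0)
truth = np.zeros((64, 64)); truth[24:40, 24:40] = 1.0        # "tumour" inclusion
recon = truth + rng.normal(0, 0.1, truth.shape)              # noisy reconstruction

def snr(recon, truth):
    return 10 * np.log10(np.sum(truth**2) / np.sum((recon - truth)**2))

def cnr(recon, roi_mask, background_mask):
    roi, bg = recon[roi_mask], recon[background_mask]
    return abs(roi.mean() - bg.mean()) / np.sqrt(roi.var() + bg.var())

def relative_error(recon, truth):
    return np.linalg.norm(recon - truth) / np.linalg.norm(truth)

roi = truth > 0.5
print(f"SNR = {snr(recon, truth):.1f} dB")
print(f"CNR = {cnr(recon, roi, ~roi):.2f}")
print(f"RE  = {relative_error(recon, truth):.3f}")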

Keywords: Diffuse optical tomography, ill-posedness, Levenberg-Marquardt method, Split Bregman, Gradient projection for sparse reconstruction.

446 On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Authors: Xiaoshu Lu, Päivi Leino-Arjas, Kustaa Piha, Akseli Aittomäki, Peppiina Saastamoinen, Ossi Rahkonen, Eero Lahelma

Abstract:

Sickness absence represents a major economic and social issue. The analysis of sick leave data is a recurrent challenge for analysts because of the complexity of the data structure, which is often time-dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided, and traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset available from the Helsinki Health Study for municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends and the occurrence and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.
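
To illustrate the general difficulty of zero-clumped, skewed count data (the paper develops its own mix-distribution approach, which this does not reproduce), the following Python sketch fits a zero-inflated Poisson model to synthetic absence-day data with statsmodels.

# Zero-inflated Poisson fit on simulated absence-day counts; the covariate,
# sample and parameters are synthetic, not Helsinki Health Study data.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(20, 60, n)
never_absent = rng.random(n) < 0.55                  # structural zeros ("clump at zero")
days = np.where(never_absent, 0, rng.poisson(np.exp(0.02 * (age - 40) + 1.0)))

X = sm.add_constant(age)                             # covariate for the count part
model = ZeroInflatedPoisson(days, X, exog_infl=sm.add_constant(age), inflation="logit")
result = model.fit(maxiter=200, disp=0)
print(result.summary().tables[1])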

Keywords: Sickness absence, longitudinal data, methodologies, mix-distribution model.

445 Diagnostic Contribution of the MMSE-2:EV in the Detection and Monitoring of the Cognitive Impairment: Case Studies

Authors: Cornelia-Eugenia Munteanu

Abstract:

The goal of this paper is to present the diagnostic contribution that the screening instrument Mini-Mental State Examination-2: Expanded Version (MMSE-2:EV) brings to detecting cognitive impairment and to monitoring the progress of degenerative disorders. The diagnostic significance is underlined by the interpretation of the MMSE-2:EV scores resulting from administering the test to patients with mild and major neurocognitive disorders. The cases were selected from current practice in order to cover a broad and significant range of neurocognitive pathology: mild cognitive impairment, Alzheimer's disease, vascular dementia, mixed dementia, Parkinson's disease, and conversion of mild cognitive impairment into Alzheimer's disease. The MMSE-2:EV version was used: it was applied one month after the initial assessment, three months after the first re-evaluation, and then every six months, alternating the blue and red forms. Correlated with age and educational level, the raw scores were converted into T scores, and then, using the mean and the standard deviation, the z scores were calculated. The differences in raw scores between the evaluations were analyzed for statistical significance in order to establish the progression of the disease over time. The results indicated that the psycho-diagnostic approach to evaluating cognitive impairment with the MMSE-2:EV is safe and that the application interval is optimal. In clinical settings with a large flux of patients, the application of the MMSE-2:EV is a safe and fast psychodiagnostic solution. Clinicians can make objective decisions, and for patients the procedure does not take much time or energy, does not bother them, and does not force them to travel frequently.
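
A minimal sketch of the score conversion mentioned above: a T score re-expressed as a z score, assuming the usual T-scale mean of 50 and standard deviation of 10 (a common convention, not the test manual's norm tables); the T values below are hypothetical.

# Convert T scores to z scores and track change between assessments.
def t_to_z(t_score, mean=50.0, sd=10.0):
    return (t_score - mean) / sd

t_initial, t_followup = 38, 33          # T scores from two assessments (assumed)
print(f"baseline z = {t_to_z(t_initial):.1f}, follow-up z = {t_to_z(t_followup):.1f}")
print(f"change in z = {t_to_z(t_followup) - t_to_z(t_initial):+.1f}")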

Keywords: MMSE-2, dementia, cognitive impairment, neuropsychology.

444 Electricity Load Modeling: An Application to Italian Market

Authors: Giovanni Masala, Stefania Marica

Abstract:

Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in the light of the recent privatization and deregulation of the power industry, forecasting future electricity load has turned out to be a very challenging problem. Empirical data about electricity load highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends or holidays) and on a daily basis (electricity load is clearly influenced by the hour). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is then to build an hourly electricity load model. The deterministic component of the model requires non-linear regression and Fourier series, while we investigate the stochastic component through econometric tools. The calibration of the model parameters is performed using data from the Italian market over a six-year period (2007-2012). Then, a Monte Carlo simulation is performed in order to compare the simulated data with the real data (both in-sample and out-of-sample inspection). The reliability of the model is confirmed by standard tests, which highlight a good fit of the simulated values.
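
A hedged Python sketch of this two-component recipe: a deterministic part built from Fourier terms (daily and weekly periodicities plus trend) fitted by least squares, and an ARMA model on the residuals; the GARCH stage is omitted and the hourly series is synthetic, so this only illustrates the modelling structure, not the paper's calibrated model.

# Deterministic Fourier/trend regression plus ARMA residual model on a
# simulated hourly load series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(24 * 7 * 8)                               # eight weeks of hourly data
load = (100 + 0.01 * t
        + 10 * np.sin(2 * np.pi * t / 24) + 5 * np.sin(2 * np.pi * t / (24 * 7))
        + rng.normal(0, 2, t.size))

def fourier_design(t, periods=(24, 24 * 7), harmonics=2):
    cols = [np.ones_like(t, dtype=float), t.astype(float)]  # intercept + trend
    for p in periods:
        for k in range(1, harmonics + 1):
            cols += [np.sin(2 * np.pi * k * t / p), np.cos(2 * np.pi * k * t / p)]
    return np.column_stack(cols)

X = fourier_design(t)
beta, *_ = np.linalg.lstsq(X, load, rcond=None)          # deterministic component
residuals = load - X @ beta
arma = ARIMA(residuals, order=(1, 0, 1)).fit()           # stochastic component
print(arma.params)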

Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression.

443 Distributed Automation System Based Remote Monitoring of Power Quality Disturbance on LV Network

Authors: Emmanuel D. Buedi, K. O. Boateng, Griffith S. Klogo

Abstract:

Electrical distribution networks are prone to power quality disturbances originating from the complexity of the distribution network, the mode of distribution (overhead or underground), and the types of loads used by customers. Data on the types of disturbances present and their frequency of occurrence are needed for economic evaluation and hence for finding a solution to the problem. Utility companies have resorted to using secondary power quality devices, such as smart meters, to help gather the required data. Even though this approach is easier to adopt, data gathered from these devices may not serve the required purpose, since the installation of these devices in the electrical network usually does not conform to available PQM placement methods. This paper presents a design of a PQM that is capable of integrating into an existing DAS infrastructure to take advantage of available placement methodologies. The monitoring component of the design is implemented and installed to monitor an existing LV network. Data from the monitor are analyzed and presented. A portion of the LV network of the Electricity Company of Ghana is modeled in MATLAB-Simulink and analyzed under various earth fault conditions. The results presented show the ability of the PQM to detect and analyze PQ disturbances such as voltage sag and overvoltage. By adopting a placement methodology and installing these nodes, utilities are assured of accurate and reliable information with respect to the quality of power delivered to consumers.
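
A simple Python sketch of sag/overvoltage detection on a sampled LV voltage waveform: compute the per-cycle RMS and flag values outside 0.9-1.1 pu (a common convention rather than the paper's thresholds); the waveform is simulated, not field data from the monitor.

# Per-cycle RMS computation and threshold-based sag/overvoltage flagging on a
# simulated 50 Hz waveform with an injected 60% sag.
import numpy as np

f, fs, v_nom = 50.0, 3200.0, 230.0                   # grid frequency, sampling rate, nominal RMS
t = np.arange(0, 0.5, 1 / fs)                        # half a second of samples
amplitude = np.where((t > 0.2) & (t < 0.3), 0.6, 1.0)  # sag between 0.2 s and 0.3 s
v = amplitude * v_nom * np.sqrt(2) * np.sin(2 * np.pi * f * t)

samples_per_cycle = int(fs / f)
usable = v[: len(v) // samples_per_cycle * samples_per_cycle]
rms = np.sqrt((usable.reshape(-1, samples_per_cycle) ** 2).mean(axis=1)) / v_nom  # per unit

for i, value in enumerate(rms):
    if value < 0.9:
        print(f"cycle {i}: voltage sag ({value:.2f} pu)")
    elif value > 1.1:
        print(f"cycle {i}: overvoltage ({value:.2f} pu)")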

Keywords: Power quality, remote monitoring, distributed automation system, economic evaluation, LV network.
