Search results for: syntactic complexity

723 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning

Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar

Abstract:

As the quantity and complexity of computing in large-scale software systems increase, distributed computing becomes increasingly important. A distributed system achieves high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, however, distributed computing can cause resource waste and high costs. Resource scheduling is usually an NP-hard problem, so no general solution exists, although optimization algorithms such as genetic algorithms and ant colony optimization are available. The large scale of distributed systems makes these traditional optimization algorithms impractical, so heuristic and machine learning algorithms are usually applied to ease the computing load. We therefore review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning (DRL) method that combines the perceptual ability of neural networks with the decision-making ability of reinforcement learning. Using machine learning, we try to identify the factors that most influence the performance of distributed system computing and help the distributed system schedule computing resources efficiently. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a DRL method that uses a recurrent neural network to optimize resource scheduling, and outlines the challenges and improvement directions for DRL-based resource scheduling algorithms.
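
To make the surveyed approach concrete, the following minimal sketch shows an RNN-based scheduling policy trained with a policy-gradient (REINFORCE) update, in the spirit of the DRL method described above; the network sizes, observation trace, and toy reward are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the authors' system): a GRU policy that
# assigns an incoming task to one of N machines, trained with REINFORCE.
import torch
import torch.nn as nn

N_MACHINES, FEAT = 4, 8                  # toy sizes: machines, features/step

class RNNPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(FEAT, 32, batch_first=True)
        self.head = nn.Linear(32, N_MACHINES)

    def forward(self, obs_seq):          # obs_seq: (1, T, FEAT)
        h, _ = self.rnn(obs_seq)
        return torch.distributions.Categorical(logits=self.head(h[:, -1]))

policy = RNNPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for episode in range(200):
    obs = torch.randn(1, 5, FEAT)        # placeholder cluster-state trace
    dist = policy(obs)
    action = dist.sample()               # machine chosen for the task
    reward = 1.0 if action.item() == 2 else -0.1   # toy: machine 2 is idle
    loss = -(dist.log_prob(action) * reward).sum() # REINFORCE estimator
    opt.zero_grad(); loss.backward(); opt.step()
```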

Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence

Procedia PDF Downloads 102
722 Towards Green(er) Cities: The Role of Spatial Planning in Realising the Green Agenda

Authors: Elizelle Juaneé Cilliers

Abstract:

The green hype is becoming stronger within various disciplines, modern practices, and academic thinking, reinforced by concepts such as eco-health, eco-tourism, eco-cities, and eco-engineering. There is also an expanded scientific understanding of the value and benefits of green infrastructure, for both communities and their host cities, linked to broader sustainability and resilience thinking. The integration and implementation of green infrastructure within spatial planning approaches and municipal planning are, however, more complex, especially in South Africa, compounded by limited budgets and human resources, development pressures, inequities in green space availability, and political legacies of the past. The prevailing approach to spatial planning contributes further to this complexity, linked to misguided perceptions of the function and value of green infrastructure. As such, green spaces are often considered a luxury and green infrastructure a costly alternative, leaving green networks susceptible to land-use changes and under-prioritized in local authority decision-making. Spatial planning, in this sense, may well be a valuable tool to realise the green agenda, encapsulating various sustainability initiatives from a range of disciplines. This paper aims to clarify the importance and value of green infrastructure planning as a component of spatial planning approaches, in order to inform and encourage local authorities to embed sustainability thinking into city planning and decision-making. It reflects on the decisive role of land-use management in guiding the green agenda and refers to some recent planning initiatives. Lastly, it calls for trans-disciplinary planning approaches to build a case towards green(er) cities.

Keywords: green infrastructure, spatial planning, transdisciplinary, integrative

Procedia PDF Downloads 245
721 Sustainable Production of Tin Oxide Nanoparticles: Exploring Synthesis Techniques, Formation Mechanisms, and Versatile Applications

Authors: Yemane Tadesse Gebreslassie, Henok Gidey Gebretnsae

Abstract:

Nanotechnology has emerged as a highly promising field of research with wide-ranging applications across various scientific disciplines. In recent years, tin oxide has garnered significant attention due to its intriguing properties, particularly when synthesized in the nanoscale range. While numerous physical and chemical methods exist for producing tin oxide nanoparticles, these approaches tend to be costly, energy-intensive, and involve the use of toxic chemicals. Given the growing concerns regarding human health and environmental impact, there has been a shift towards developing cost-effective and environmentally friendly processes for tin oxide nanoparticle synthesis. Green synthesis methods utilizing biological entities such as plant extracts, bacteria, and natural biomolecules have shown promise in successfully producing tin oxide nanoparticles. However, scaling up the production to an industrial level using green synthesis approaches remains challenging due to the complexity of biological substrates, which hinders the elucidation of reaction mechanisms and formation processes. Thus, this review aims to provide an overview of the various sources of biological entities and methodologies employed in the green synthesis of tin oxide nanoparticles, as well as their impact on nanoparticle properties. Furthermore, this research delves into the strides made in comprehending the mechanisms behind the formation of nanoparticles as documented in existing literature. It also sheds light on the array of analytical techniques employed to investigate and elucidate the characteristics of these minuscule particles.

Keywords: nanotechnology, tin oxide, green synthesis, formation mechanisms

Procedia PDF Downloads 43
720 Foreign Languages and Employability in the European Union

Authors: Paulina Pietrzyk-Kowalec

Abstract:

This paper presents the phenomenon of multilingualism becoming the norm rather than the exception in the European Union. It also seeks to describe the correlation between the command of foreign languages and employability. The challenges that today's societies face in terms of employability and the realities of the current labor market are increasingly diversified. It is therefore one of the crucial tasks of higher education to prepare its students to face this complexity, understand its nuances, and adapt effectively to situations that are common in corporations based in EU countries. From this point of view, assessing the impact that European university students' command of foreign languages could have on the numerous business sectors becomes vital. It also involves raising the awareness of future professionals so that they understand the importance of mastering communicative skills in foreign languages that meet the requirements of their prospective employers. The direct connection between higher education institutions and the world of business also allows companies to realize that they should rethink their recruitment and human resources procedures in order to take the importance of foreign languages into account. This article focuses on the objective of the multilingualism policy developed by the European Commission, which is to enable young people to master at least two foreign languages, a competence crucial to their future careers. The article emphasizes the crucial connection between the research conducted in higher education institutions and the business sector in order to reduce current qualification gaps.

Keywords: cross-cultural communication, employability, human resources, language attitudes, multilingualism

Procedia PDF Downloads 128
719 Systems Engineering Management Using Transdisciplinary Quality System Development Lifecycle Model

Authors: Mohamed Asaad Abdelrazek, Amir Taher El-Sheikh, M. Zayan, A.M. Elhady

Abstract:

The successful realization of complex systems depends not only on the technology and the process for implementing it, but on the management issues as well. Managing the systems development lifecycle requires technical management, which is what systems engineering management provides. Systems engineering management incorporates many activities; the three major ones are development phasing, the systems engineering process, and lifecycle integration, and they are performed across the system development lifecycle. Due to the ever-increasing complexity of systems, as well as the difficulty of managing and tracking development activities, new ways to achieve systems engineering management activities are required. This paper presents a systematic approach used as a design management tool applied across systems engineering management roles. In this approach, the Transdisciplinary System Development Lifecycle (TSDL) model has been modified and integrated with Quality Function Deployment (QFD); hereinafter, the systematic approach is called the Transdisciplinary Quality System Development Lifecycle (TQSDL) model. QFD translates the voice of the customer (VOC) into measurable technical characteristics. The modified TSDL model is based on Axiomatic Design, developed by Suh, which is applicable to all designs: products, processes, systems, and organizations. The TQSDL model aims to provide a robust structure and systematic thinking to support the implementation of systems engineering management roles. This approach ensures that customer requirements are fulfilled and that all the systems engineering manager's roles and activities are satisfied.
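
As a hedged illustration of the QFD step in the TQSDL model, the sketch below translates weighted customer requirements into priority scores for technical characteristics via a relationship matrix; the weights and matrix entries are invented, though the 9-3-1 rating scale is the conventional QFD one.

```python
# Illustrative QFD step (all numbers invented): translate weighted customer
# requirements (VOC) into priorities for measurable technical characteristics.
import numpy as np

voc_weights = np.array([0.5, 0.3, 0.2])   # customer importance of VOC items
# Relationship matrix: rows = VOC items, cols = technical characteristics
# (9 = strong, 3 = moderate, 1 = weak, 0 = none, the standard QFD scale).
R = np.array([[9, 3, 0],
              [3, 9, 1],
              [0, 1, 9]])

tech_priority = voc_weights @ R            # raw importance of each column
tech_priority /= tech_priority.sum()       # normalize to relative priorities
print(tech_priority.round(3))
```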

Keywords: axiomatic design, quality function deployment, systems engineering management, system development lifecycle

Procedia PDF Downloads 354
718 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts

Authors: William Michael Short

Abstract:

'Semantic Web' technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized 'semantic bioinformatics' have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called 'discourse topics'). Second, natural language processing systems tend to operate according to the principle of 'one token, one tag'. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun, a verb, or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable. But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture 'expert' technical models rather than 'folk' models of knowledge and so may not match users' common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen, the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to capture variations within technical vocabularies, rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis, they fail to capture this dimension of linguistic usage.
The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.

Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics

Procedia PDF Downloads 126
717 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experiences, especially in terms of communication reliability, high data rates, and service stability on the move. This increasing demand is saturating the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations and seems to fulfill the demands of the future spectrum. CA is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the International Mobile Telecommunication Advanced (IMT-Advanced) requirement of a 1 Gb/s peak data rate, 3GPP introduced the CA scheme, which sustains a high data rate using widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, which shows the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for a cell radius longer than 1800 m (and a PLR threshold of 2%).
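
A back-of-envelope check of the bandwidth figures quoted above (the spectral efficiency is an assumed peak value for illustration, not a result from the paper):

```python
# The IMT-Advanced 1 Gb/s peak needs roughly 10 b/s/Hz over 100 MHz, which
# CA reaches by aggregating five 20 MHz LTE component carriers.
component_carriers = [20e6] * 5            # five 20 MHz CCs, in Hz
spectral_efficiency = 10.0                 # b/s/Hz, assumed peak (MIMO, 64-QAM)

aggregate_bw = sum(component_carriers)     # 100 MHz total
peak_rate = aggregate_bw * spectral_efficiency
print(f"aggregate bandwidth: {aggregate_bw / 1e6:.0f} MHz")
print(f"peak data rate: {peak_rate / 1e9:.1f} Gb/s")   # -> 1.0 Gb/s
```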

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 193
716 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty

Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih

Abstract:

In order to cope with uncertainty and the resulting inability to meet customers' requests, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid increased holding cost due to an excessive SSL or shortage cost due to an SSL that is too low. This paper uses soft-computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, and uses dynamic fuzzy logic to obtain the best SSL as an output. In this model, demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best safety stock level. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study in the dairy industry, with a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current safety stock level, calculated using the traditional approach. The importance of the proposed model is demonstrated by the significant reduction in safety stock level.
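
The sketch below illustrates the kind of fuzzy inference the model performs, with the paper's three inputs mapped to an SSL output; the membership shapes, rule base, and defuzzification constants are illustrative assumptions, not the authors' calibrated model.

```python
# Minimal Mamdani-style sketch (illustrative assumptions): three inputs,
# demand stability, raw-material availability, on-hand inventory, mapped
# to a safety stock level (SSL) via fuzzy rules on a [0, 1] universe.

def tri(x, a, b, c):
    """Triangular membership function evaluated at point x."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def low(x):  return tri(x, 0.0, 0.0, 0.5)
def high(x): return tri(x, 0.5, 1.0, 1.0)

def ssl(demand_stability, material_availability, on_hand):
    # Two illustrative rules (a real base would cover all combinations):
    # R1: IF demand unstable AND material scarce THEN SSL high
    r1 = min(low(demand_stability), low(material_availability))
    # R2: IF demand stable AND inventory high THEN SSL low
    r2 = min(high(demand_stability), high(on_hand))
    # Weighted-average defuzzification, with 'high SSL' = 1.0 and
    # 'low SSL' = 0.1 of maximum stock (assumed output constants).
    if r1 + r2 == 0:
        return 0.5                        # neutral default
    return (r1 * 1.0 + r2 * 0.1) / (r1 + r2)

print(ssl(0.3, 0.2, 0.4))   # unstable demand, scarce material -> high SSL
```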

Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization

Procedia PDF Downloads 118
715 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level

Authors: Pedro M. Abreu, Bruno R. Mendes

Abstract:

The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increasing incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, and the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied in healthcare over the past few years. Co-payments lead to a rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of those services and products for the end-user. This analysis of hospital practices in particular and co-payment strategies in general was carried out across all the European regions and identified four reference countries that apply this tool repeatedly, each with a different approach. The structure, content, and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) achieve more compliance and effectiveness, the English models (total score of 50%) are more accessible, and the French models (total score of 50%) are better suited to the socio-economic and legal framework. Other European models did not show the same quality and/or performance and so were not taken as a standard in the future design of co-payment strategies. In this sense, co-payments can be seen as a strategy not only to moderate the consumption of healthcare products and services but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines.

Keywords: clinical pharmacy, co-payments, healthcare, medicines

Procedia PDF Downloads 247
714 INRAM-3DCNN: Multi-Scale Convolutional Neural Network Based on Residual and Attention Module Combined with Multilayer Perceptron for Hyperspectral Image Classification

Authors: Jianhong Xiang, Rui Sun, Linyu Wang

Abstract:

In recent years, owing to the continuous improvement of deep learning theory, the Convolutional Neural Network (CNN) has demonstrated superior performance in Hyperspectral Image (HSI) classification research. Since HSI contains rich spatial-spectral information, utilizing only a single-dimensional or single-size convolutional kernel limits the detailed feature information received by the CNN, which in turn limits the classification accuracy of HSI. In this paper, we design a multi-scale CNN with an MLP based on residual and attention modules (INRAM-3DCNN) for the HSI classification task. We propose using multiple 3D convolutional kernels to extract packet feature information and fully learn the spatial-spectral features of HSI, while designing residual 3D convolutional branches to avoid the decline in classification accuracy caused by network degradation. Secondly, we design a 2D Inception module with a joint channel attention mechanism to quickly extract key spatial feature information at different scales of the HSI and to reduce the complexity of the 3D model. Owing to the high parallel processing capability and nonlinear global action of the Multilayer Perceptron (MLP), we use it in combination with the preceding CNN structure for the final classification step. Experimental results on two HSI datasets show that the proposed INRAM-3DCNN method achieves superior classification performance.
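
For readers unfamiliar with the residual 3D-convolution idea the abstract relies on, here is a minimal PyTorch sketch of one such branch; the channel count, kernel size, and toy input shape are assumptions, not the INRAM-3DCNN configuration.

```python
# Sketch of one residual 3D-convolution branch (sizes assumed, not the
# authors' INRAM-3DCNN): an identity skip guards against degradation.
import torch
import torch.nn as nn

class Residual3DBlock(nn.Module):
    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm3d(channels),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm3d(channels),
        )

    def forward(self, x):
        return torch.relu(self.body(x) + x)   # skip connection

# Toy HSI patch: batch 2, 16 channels, 7 spectral bands, 9x9 spatial.
x = torch.randn(2, 16, 7, 9, 9)
print(Residual3DBlock()(x).shape)              # torch.Size([2, 16, 7, 9, 9])
```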

Keywords: INRAM-3DCNN, residual, channel attention, hyperspectral image classification

Procedia PDF Downloads 73
713 Characterization of Optical Systems for Intraocular Projection

Authors: Charles Q. Yu, Victoria H. Fan, Ahmed F. Al-Qahtani, Ibraim Viera

Abstract:

Introduction: Over 12 million people are blind due to opacity of the cornea, the clear tissue forming the front of the eye. Current methods use plastic implants to produce a clear optical pathway into the eye but are limited by a high rate of complications. New implants utilizing completely inside-the-eye projection technology can overcome blindness due to scarring of the eye by producing images on the retina without the need for a clear optical pathway, and may be free of the complications of traditional treatments. However, the interior of the eye is a challenging location for the design of optical focusing systems that can produce a sufficiently high-quality image, and no optical focusing systems have previously been characterized for this purpose. Methods: Three optical focusing systems for intraocular (inside-the-eye) projection were designed and then modeled with ray-tracing software: a pinhole system, a planoconvex system, and an achromatic system. These were then constructed using off-the-shelf components and tested in the laboratory. Weight, size, magnification, depth of focus, image quality, and brightness were characterized. Results: Image quality increased with the complexity of the system design, as did weight and size. A dual achromatic-doublet optical system produced the highest image quality; the visual acuity equivalent achieved with this system was better than 20/200, and its weight was less than that of the natural human crystalline lens. Conclusions: We demonstrate for the first time that high-quality images can be produced by optical systems sufficiently small and light to be implanted within the eye.

Keywords: focusing, projection, blindness, cornea, achromatic, pinhole

Procedia PDF Downloads 124
712 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed on the basis of video data, with the aim of uncovering so-called 'multimodal gestalts', patterns of linguistic and embodied conduct that reoccur in specific sequential positions employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive processes of manually transcribing each component from video materials. Automating these tasks requires advanced programming skills, which are often outside the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated on one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search of many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (one that can also extend to other fields using video materials). It will allow large amounts of data to be processed automatically and quantitative analyses to be implemented, combining them with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken, and grammatical information from videos, and to correlate those different levels and perform queries and analyses.

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 102
711 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (the fast Fourier transform (FFT), the inverse fast Fourier transform (iFFT), and Bessel functions) is widely applied to obtain information with high accuracy, and is most commonly implemented on general-purpose processors. Because general-purpose processors are not efficient for signal processing, our interest focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system computes acoustic spectra up to 20 times faster than the Odroid XU4 implementation, and the FPGA-based implementation of the processing algorithms is realized with an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in terms of real-time performance and energy efficiency.
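
The FFT/iFFT chain the paper offloads to the FPGA can be prototyped in a few lines of NumPy, as below; the echo model, sample rate, and record length are assumptions used purely for illustration.

```python
# Illustrative NumPy version of the signal chain (echo model and sizes
# assumed): FFT the backscattered signal, form the spectral ratio to the
# incident signal, and iFFT back to the time domain.
import numpy as np

fs, n = 1e6, 4096                        # sample rate (Hz), record length
t = np.arange(n) / fs
incident = np.sin(2 * np.pi * 150e3 * t) * np.hanning(n)
echo = 0.3 * np.roll(incident, 120)      # toy backscattered, delayed echo

spectrum_in = np.fft.rfft(incident)
spectrum_echo = np.fft.rfft(echo)
# Form-function style ratio of backscattered to incident spectra,
# regularized to avoid division by near-zero bins.
form = spectrum_echo / (spectrum_in + 1e-12)
signal_back = np.fft.irfft(spectrum_echo)    # iFFT back to the time domain

freqs = np.fft.rfftfreq(n, 1 / fs)
print(freqs[np.argmax(np.abs(spectrum_echo))])  # dominant bin ~150 kHz
```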

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 72
710 The Influence of Human Factors Education on the Irish Registered Pre-Hospital Practitioner within the National Ambulance Service

Authors: Desmond Wade, Alfredo Ormazabal

Abstract:

Background: Ever since the registration process for pre-hospital practitioners commenced in 2000 through the Irish Government Statutory Instrument (SI 109 of 2000), the approach to educating these professionals has changed drastically. The progression from the traditional behaviourist approach to the current constructivist one has been based on experiences from other sectors and industries, nationally and internationally. Today, the delivery of a safe and efficient ambulance service heavily depends on its practitioners' range of technical skills, academic knowledge, and overall competences. As these increase, so does the complexity of paramedics' everyday practice. This has made it inevitable to consider the 'human factor' as a source of potential risk and has led formative institutions like the National Ambulance Service College to include it in their curriculum. Methods: This study used a mixed-method approach, in which both an online questionnaire and a set of semi-structured interviews were the sources of primary data. The data were analysed using qualitative and quantitative methods. Conclusions: The evidence presented leads to the conclusion that there is a considerable lack of education on human factors in the National Ambulance Service and that levels of understanding of how to manage human factors in practice vary across its spectrum. Paramedic practitioners in Ireland seem to understand that the responsibility for patient care lies with the team rather than with the most hierarchically senior practitioner present at the scene.

Keywords: human factors, ergonomics, stress, decision making, pre-hospital care, paramedic, education

Procedia PDF Downloads 145
709 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for The Extraction and Recovery of Metal from Ores

Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter

Abstract:

Over the past few decades, the demand for metals has increased significantly. This has led to a decline in high-grade ore over time and an increase in mineral complexity and matrix heterogeneity. In addition, there are rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that can economically process and recover metallic values from low-grade ores, from materials with metal content locked up in industrially processed residues (tailings and slag), and from complex-matrix mineral deposits. Several methods to address these issues have been developed, among which are ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas-phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced using an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas-phase extraction has been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, the extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all promises of a 'greener' process.

Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment

Procedia PDF Downloads 122
708 A Key Parameter in Ocean Thermal Energy Conversion Plant Design and Operation

Authors: Yongjian Gu

Abstract:

Ocean thermal energy is one of the ocean energy sources; it is a renewable, sustainable, and green energy source. Ocean thermal energy conversion (OTEC) applies the ocean temperature gradient between the warmer surface seawater and the cooler deep seawater to run a heat engine and produce a useful power output. Unfortunately, the ocean temperature gradient is small. Even in the tropical and equatorial regions, the surface water temperature reaches only about 28 °C, while the deep water temperature can be as low as 4 °C. The thermal efficiency of OTEC plants is therefore low. In order to improve plant thermal efficiency within this limited ocean temperature gradient, some OTEC plants add more equipment for better heat recovery, such as heat exchangers and pumps. Obviously, this method increases the plant's complexity and cost. Its more important impact is that the additional equipment consumes power too, which may adversely affect the plant's net power output and, in turn, its thermal efficiency. In the paper, the author first describes various OTEC plants and the practice of adding more equipment to improve plant thermal efficiency. The author then proposes a parameter, the plant back work ratio ϕ, for measuring whether added equipment is appropriate for improving the plant's thermal efficiency. Finally, the author presents examples to illustrate the application of the back work ratio ϕ as a key parameter in OTEC plant design and operation.
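
A worked example of the proposed parameter, under the natural reading that ϕ compares auxiliary power consumption to gross turbine output (the numbers below are illustrative, not values from the paper):

```python
# Plant back work ratio phi = auxiliary power consumed / gross turbine
# power (assumed reading of the abstract's definition). Added heat-recovery
# equipment only pays off if its efficiency gain outweighs the extra back work.
gross_power_kw = 1000.0        # turbine output of a small OTEC plant (assumed)
pump_power_kw = 250.0          # seawater pumps; the cold-water pipe dominates
extra_equipment_kw = 80.0      # added heat-recovery pumps / exchanger fans

phi = (pump_power_kw + extra_equipment_kw) / gross_power_kw
net_power_kw = gross_power_kw * (1 - phi)
print(f"back work ratio phi = {phi:.2f}, net output = {net_power_kw:.0f} kW")
```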

Keywords: ocean thermal energy, ocean thermal energy conversion (OTEC), OTEC plant, plant back work ratio ϕ

Procedia PDF Downloads 189
707 Secure and Privacy-Enhanced Blockchain-Based Authentication System for University User Management

Authors: Ali El Ksimi

Abstract:

In today's digital academic environment, secure authentication methods are essential for managing sensitive user data, including that of students and faculty. The rise in cyber threats and data breaches has exposed the vulnerabilities of traditional authentication systems used in universities. Passwords, often the first line of defense, are particularly susceptible to hacking, phishing, and brute-force attacks. While multi-factor authentication (MFA) provides an additional layer of security, it can still be compromised and often adds complexity and inconvenience for users. As universities seek more robust security measures, blockchain technology emerges as a promising solution. Renowned for its decentralization, immutability, and transparency, blockchain has the potential to transform how user management is conducted in academic institutions. In this article, we explore a system that leverages blockchain technology specifically for managing user accounts within a university setting. The system enables the secure creation and management of accounts for different roles, such as administrators, teachers, and students. Each user is authenticated through a decentralized application (DApp) that ensures their data is securely stored and managed on the blockchain. By eliminating single points of failure and utilizing cryptographic techniques, the system enhances the security and integrity of user management processes. We will delve into the technical architecture, security benefits, and implementation considerations of this approach. By integrating blockchain into user management, we aim to address the limitations of traditional systems and pave the way for the future of digital security in education.
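
The core authentication step in such a DApp is a signature check against a registered public key. The off-chain Python sketch below illustrates that step with Ed25519 (via the `cryptography` package); the enrolment flow, challenge format, and key handling are simplifications, and the abstract's actual contract logic is not shown.

```python
# Challenge-response check behind a signature-based DApp login (simplified
# off-chain sketch; key storage and the on-chain registry are assumed away).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrolment: the university records the student's public key in the registry.
student_key = Ed25519PrivateKey.generate()
registered_pubkey = student_key.public_key()   # what the registry would store

# Login: the DApp issues a challenge; the student's wallet signs it.
challenge = b"login:alice:2024-06-01T10:00:00Z:nonce-8131"  # invented format
signature = student_key.sign(challenge)

# Verification: anyone holding the registered key can check the proof.
registered_pubkey.verify(signature, challenge)  # raises InvalidSignature if bad
print("authenticated")
```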

Keywords: blockchain, university, authentication, decentralization, cybersecurity, user management, privacy

Procedia PDF Downloads 9
706 Numerical Simulation of Free Surface Water Wave for the Flow Around NACA 0012 Hydrofoil and Wigley Hull Using VOF Method

Authors: Omar Imine, Mohammed Aounallah, Mustapha Belkadi

Abstract:

Steady three-dimensional flows and free surface waves generated by moving bodies are presented. The flow problem to be simulated is rich in complexity and poses many modeling challenges because of the existence of breaking waves around the ship hull and because of the interaction of the two-phase flow with the turbulent boundary layer. The results of several simulations are reported. The first study was performed for the NACA 0012 hydrofoil with different meshes; this section is analyzed at h/c = 1.0345 in 2D. In the second simulation, a mathematically defined Wigley hull form is used to investigate the application of a commercial CFD code to the prediction of the total resistance and its components from the tangential and normal forces on the hull's wetted surface. The computed resistance and wave profiles are used to estimate the total resistance coefficient of the Wigley hull advancing in calm water under steady conditions. The commercial CFD software FLUENT version 12 is used for the computations in the present study, and the computational grid is generated with GAMBIT 2.3.26. The k-ω SST shear stress model is used for turbulence modeling, and the volume-of-fluid (VOF) technique is employed to simulate the free-surface motion. The second-order upwind scheme is used for discretizing the convection terms in the momentum transport equations, and the modified HRIC scheme for the VOF discretization. The results obtained compare well with the experimental data.
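
The resistance-coefficient estimate mentioned above reduces to one line of arithmetic, Ct = R_T / (0.5 ρ S V²); the values below are placeholders, not the paper's computed results.

```python
# Post-processing step behind the total resistance coefficient (all
# numbers are illustrative placeholders).
rho = 998.2      # fresh water density, kg/m^3
S = 9.223        # wetted surface area, m^2 (assumed)
V = 1.452        # model speed, m/s (assumed)
R_T = 14.3       # computed total resistance, N (placeholder)

Ct = R_T / (0.5 * rho * S * V**2)
print(f"Ct = {Ct:.4e}")
```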

Keywords: free surface flows, breaking waves, boundary layer, Wigley hull, volume of fluid

Procedia PDF Downloads 369
705 Design of a CO₂-Reduced 3D Concrete Mixture Using Circular (Clay-Based) Building Materials

Authors: N. Z. van Hierden, Q. Yu, F. Gauvin

Abstract:

Cement manufacturing is, because of its production process, among the highest contributors to CO₂ emissions worldwide. As cement is one of the major components of 3D printed concrete, achieving sustainability and carbon neutrality can be particularly challenging. To improve the sustainability of 3D printed materials, different CO₂-reducing strategies can be used, each with a distinct level of impact and complexity. In this work, we focus on the development of these sustainable mixtures and on finding alternatives. Promising alternatives for cement and clinker replacement include recycled building materials, among them (calcined) bricks and roof tiles. To study the potential of recycled clay-based building materials, the application of calcined clay itself is studied as well. Compared to cement, the calcination temperature of clay-based materials is significantly lower, resulting in reduced CO₂ output. Reusing these materials is therefore a promising solution for utilizing waste streams while simultaneously reducing the cement content of 3D concrete mixtures. In addition, waste streams can be locally sourced, thereby reducing the CO₂ emitted during transportation. In this research, various alternative binders are examined, such as calcined clay blends (LC3) from recycled tiles and bricks, or locally obtained clay resources. Experiments show a high potential for mix designs including these resources with respect to material strength, while sustaining decent printability and buildability. The strategies defined here are therefore promising and can lead to a more sustainable, low-CO₂ mixture suitable for 3D printing that uses accessible materials.

Keywords: cement replacement, 3DPC, circular building materials, calcined clay, CO₂ reduction

Procedia PDF Downloads 80
704 Applying Concurrent Development Process for the Web Using Aspect-Oriented Approach

Authors: Hiroaki Fukuda

Abstract:

This paper presents a concurrent development process for modern web applications, called Rich Internet Applications (RIAs), and describes its effect using a non-trivial application development. In recent years, RIA technologies such as Ajax and Flex have become popular, based mainly on high-speed networks. RIAs provide sophisticated interfaces and user experiences; therefore, the development of an RIA requires two kinds of engineer: a developer who implements the business logic and a designer who designs the interface and experience. Although collaborative work is becoming important for the development of RIAs, shared resources such as source code make it difficult. For example, if the design of an interface is modified after the developers have finished the business logic implementation, they need to repeat the same implementation, as well as the tests that verify the application's behavior. The MVC architecture and object-oriented programming (OOP) make it possible to divide an application into modules such as interfaces and logic; however, developers and/or designers still have to write pieces of code (e.g., event handlers) that make these modules work together as an application. Aspect-oriented programming (AOP), on the other hand, is expected to address the complexity of today's application software development; AOP provides methods to separate crosscutting concerns, which are scattered pieces of code, from primary concerns. In this paper, we provide a concurrent development process for RIAs by introducing the AOP concept. This process makes it possible to reduce the resources shared between developers and designers, so they can perform their tasks concurrently. In addition, we describe our experience of developing a practical application using the proposed development process to show its applicability.
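
The separation the process depends on can be illustrated compactly; in the sketch below, Python decorators stand in for AOP advice so that event wiring (the crosscutting concern) stays out of the business logic. The event names and registry are invented for illustration; the paper's RIA setting would use an AOP framework rather than this stand-in.

```python
# Sketch: event wiring kept out of business logic, in the spirit of AOP
# (decorators stand in for aspects; all names here are invented).
_REGISTRY = {}

def bind_ui_event(event_name):
    """Aspect-style advice: handler registration is a crosscutting concern,
    so it lives here instead of inside the business logic."""
    def decorator(func):
        _REGISTRY[event_name] = func
        return func
    return decorator

@bind_ui_event("submit_clicked")
def process_order(order_id):
    """Pure business logic, with no event-handler plumbing in sight."""
    return f"order {order_id} processed"

# The designer's layer only fires named events; it never edits logic code,
# so interface changes no longer force re-implementation or re-testing.
print(_REGISTRY["submit_clicked"](42))
```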

Keywords: aspect-oriented programming, concurrent, development process, rich internet application

Procedia PDF Downloads 297
703 A Three-Dimensional (3D) Numerical Study of Roofs Shape Impact on Air Quality in Urban Street Canyons with Tree Planting

Authors: Bouabdellah Abed, Mohamed Bouzit, Lakhdar Bouarbi

Abstract:

The objective of this study is to investigate numerically the effect of roof shape on wind flow and pollutant dispersion in a street canyon with one row of trees of pore volume Pvol = 96%. A three-dimensional computational fluid dynamics (CFD) model is used to evaluate air flow and pollutant dispersion within an urban street canyon, using the Reynolds-averaged Navier-Stokes (RANS) equations with the k-epsilon EARSM turbulence model as closure of the equation system. The numerical model is implemented with the ANSYS-CFX code. Vehicle emissions were simulated as double line sources along the street, and the numerical model was validated against a wind tunnel experiment. Having established this, the wind flow and pollutant dispersion in urban street canyons with six roof shapes are simulated. The numerical simulation agrees reasonably well with the wind tunnel data. The results obtained in this work indicate that the flow in the 3D domain is more complicated, and this complexity increases with the presence of trees and the variability of the roof shapes. The results also indicate that the largest pollutant concentration level on the two walls (leeward and windward) is observed with the upwind wedge-shaped roof, while the smallest pollutant concentration level is observed with the dome-shaped roof. Finally, the corner eddies provide additional ventilation and lead to lower traffic pollutant concentrations at the street canyon ends.

Keywords: street canyon, pollutant dispersion, trees, building configuration, numerical simulation, k-Epsilon EARSM

Procedia PDF Downloads 353
702 The Effect of Sulfur and Calcium on the Formation of Dioxin in a Bubbling Fluidized Bed Incinerator

Authors: Chien-Song Chyang, Wei-Chih Wang

Abstract:

For the incineration process, the inhibition of dioxin formation is an important issue. Many investigations indicate that adding sulfur compounds to the combustion process can effectively inhibit dioxin formation; in this process, the sulfur-to-chlorine ratio plays an important role in the reduction efficiency of dioxin formation. Ca-based sorbents are also commonly used for acid gas removal and are, moreover, an indirect route to dioxin inhibition. Although sulfur and calcium can each reduce dioxin formation, some confusion still exists about their combined effect. This study aims to understand and clarify the relationship between dioxin formation and the simultaneous addition of sulfur and calcium. Experimental data obtained in a pilot-scale fluidized bed combustion system at various operating conditions are analyzed comprehensively, with the focus on the dioxin in the fly ash. The experimental data show that the PCDD/F concentration in the fly ash collected from the baghouse increases slightly with the simultaneous addition of sulfur and calcium. This work describes the CO concentration under the addition of sulfur and calcium at freeboard temperatures from 800 °C to 900 °C, which is raised by the fuel complexity. A positive correlation exists between the dioxin concentration, the CO concentration, and the carbon content of the fly ash. At the same sulfur/chlorine ratio, the toxic equivalent quantity (TEQ) can be reduced by increasing the actual concentrations of sulfur and calcium. The homologue profiles show that P₅CDD and P₅CDF were the two major sources of dioxin toxicity, and that 2,3,7,8-TCDD and 2,3,7,8-TCDF were reduced by the addition of pyrite and hydrated lime. The experimental results also show that the trend of the PCDD/F concentration in the fly ash differed across sulfur/chlorine ratios with the addition of sulfur at 800 °C.
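
The TEQ figure discussed above is a weighted sum over congeners, TEQ = Σ cᵢ × TEFᵢ. A small example follows (the concentrations are placeholders; the TEFs shown are the WHO-2005 values for these congeners):

```python
# Illustrative TEQ arithmetic: TEQ = sum(concentration_i * TEF_i).
# Concentrations are invented; the TEFs are WHO-2005 values.
tef = {"2,3,7,8-TCDD": 1.0, "2,3,7,8-TCDF": 0.1,
       "1,2,3,7,8-PeCDD": 1.0, "2,3,4,7,8-PeCDF": 0.3}
conc_ng_per_g = {"2,3,7,8-TCDD": 0.02, "2,3,7,8-TCDF": 0.15,
                 "1,2,3,7,8-PeCDD": 0.05, "2,3,4,7,8-PeCDF": 0.08}

teq = sum(conc_ng_per_g[c] * tef[c] for c in tef)
print(f"TEQ = {teq:.3f} ng-TEQ/g")   # -> 0.109 ng-TEQ/g
```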

Keywords: reduction of dioxin emissions, sulfur-to-chlorine ratio, de-chlorination, Ca-based sorbent

Procedia PDF Downloads 143
701 A Standard Operating Procedure (SOP) for Forensic Soil Analysis: Tested Using a Simulated Crime Scene

Authors: Samara A. Testoni, Vander F. Melo, Lorna A. Dawson, Fabio A. S. Salvador

Abstract:

Soil traces are useful as forensic evidence due to their potential to transfer to, and adhere to, different types of surfaces on a range of objects or persons. The great variability expressed by soil physical, chemical, biological, and mineralogical properties makes soil traces complex mixtures. Soils are continuous and variable, and no two soil samples are identical; nevertheless, the complexity of soil characteristics can provide powerful evidence for comparative forensic purposes. This work aimed to establish a Standard Operating Procedure (SOP) for forensic soil analysis in Brazil. We carried out a simulated crime scene with double-blind sampling to calibrate the sampling procedures. Samples were collected at a range of locations covering a range of soil types found in the south of Brazil: Santa Candida and Boa Vista, neighbourhoods of Curitiba (State of Parana), and Guarani and Guaraituba, neighbourhoods of Colombo (Curitiba Metropolitan Region). A previously validated sequence of chemical, physical, and mineralogical analyses was performed on around 2 g of soil. The suggested SOP and the sequential range of analyses were effective in grouping together the samples from the same place and the same parent material, and successfully discriminated samples from different locations and those originating from different rocks. In addition, modifications to the sample treatment and analytical protocol can be made depending on the context of the forensic work.

Keywords: clay mineralogy, forensic soils analysis, sequential analyses, kaolinite, gibbsite

Procedia PDF Downloads 246
700 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images

Authors: Masood Varshosaz, Kamyar Hasanpour

Abstract:

In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep data acquisition costs low, augmentation techniques can be used to create additional data from existing images. Many such techniques can generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or the complexity of implementing the techniques. To this end, it is important to evaluate the impact of data augmentation on the performance of the deep learning models. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained to recognize humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than the base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. We therefore suggest a new method that employs simulated 3D human models to generate new data for training the network.
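
The evaluated techniques map directly onto a standard torchvision pipeline, sketched below; the parameter values are assumptions for illustration, not the settings used in the paper.

```python
# The paper's augmentation menu expressed as a torchvision pipeline
# (parameter values are assumed, not the paper's settings).
import torchvision.transforms as T

augment = T.Compose([
    T.RandomRotation(degrees=15),                     # rotation
    T.RandomResizedCrop(size=224, scale=(0.8, 1.0)),  # scaling + random crop
    T.RandomHorizontalFlip(p=0.5),                    # flipping
    T.RandomAffine(degrees=0, translate=(0.1, 0.1)),  # shifting
])

# Applied per epoch to images before they reach the CNN, e.g.:
# train_set = ImageFolder("drone_frames", transform=augment)
```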

Keywords: human recognition, deep learning, drones, disaster mitigation

Procedia PDF Downloads 86
699 Hospitality and Migration within the Canadian Social Fabric: Guest and Host Factors in Manitoba

Authors: Nathalie Piquemal, Faiçal Zellama, Bathélemy Bolivar, Leyla Sall

Abstract:

Canada defines itself as a country of immigration and a multicultural nation, ideologically, politically, and programmatically (in terms of its integration practices). As such, principles of hospitality may seem, at first glance, incontestable, given the convergence of views among the majority of Canadian politicians on the need to welcome a significant number of immigrants each year and to offer them the hospitality that facilitates their transition to Canadian citizenship. However, immigrants are welcomed into a Canadian societal context in which power and resources are unevenly distributed, resulting in complex social relationships between hosts and newcomers. Qualitative data obtained from newcomers in Winnipeg, Manitoba, Canada, focus on experiences of hospitality, with special attention to host-guest social and power dynamics, contested policies on foreign credentials, and micro-spaces of belonging in a multicultural context. The act of welcoming a newcomer is inherently shaped by both macropolitical structures and everyday relational practices that can lead to experiences of belonging, marginalisation, empowerment, and/or disempowerment, depending on economic agendas and humanitarian and humanistic orientations. We first explore the extent to which immigrants experience hospitality in relation to the unequal distribution of power and resources as well as cultural discontinuities. We then examine ways in which immigrants have been able to find sanctuaries of hospitality within their own ethnocultural communities. Finally, we discuss the complexity of hospitality in a multicultural context and offer critical insights into host factors that may produce, develop, and nurture hospitable environments.

Keywords: migration, hospitality, diversity, culture, race

Procedia PDF Downloads 107
698 Efficacy of Technology for Successful Learning Experience; Technology Supported Model for Distance Learning: Case Study of Botho University, Botswana

Authors: Ivy Rose Mathew

Abstract:

The purpose of this study is to outline the efficacy of technology and the opportunities it can bring to implementing a successful delivery model in distance learning. Distance learning has proliferated across the world over the past few years. The challenges currently faced by distance education students include a lack of motivation, a sense of isolation, and a need for greater and improved communication. Hence, the author proposes a creative, technology-supported model for distance learning, mirrored exactly on traditional face-to-face learning, that can be adopted by distance learning providers. This model suggests the use of a range of technologies and social networking facilities, with the aim of creating a more engaging and sustaining learning environment to help overcome the isolation often noted by distance learners. While discussing the possibilities, the author also highlights the complexity and practical challenges of implementing such a model. Design/methodology/approach: Theoretical issues from previous research related to successful models for distance learning providers are considered, together with the analysis of a case study from one of the largest private tertiary institutions in Botswana, Botho University. This case study illustrates important aspects of the distance learning delivery model and provides insights into how curriculum development is planned, quality assurance is done, and learner support is assured for a successful distance learning experience. Research limitations/implications: While some aspects of this study may not be applicable to other contexts, a number of new providers of distance learning can adapt the key principles of this delivery model.

Keywords: distance learning, efficacy, learning experience, technology supported model

Procedia PDF Downloads 242
697 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques

Authors: Jonathan Iworiso

Abstract:

Forecasting the equity premium out-of-sample is a major concern for researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies among scholars over the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding-window method. A broad set of sophisticated regression models of varying complexity was employed. The RT models, which include Ridge, Forward-Backward (FOBA) Ridge, the Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistical and economic, relative to the historical-average benchmark, delivering significant utility gains. The models provide meaningful economic information on mean-variance portfolio investment for investors who time the market to earn future gains at minimal risk. Thus, the forecasting models appear to benefit an investor in a market setting who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using equity premium forecasts at minimal risk.
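
The recursive expanding-window scheme can be sketched in a few lines with one of the RT models (LASSO here); the data is synthetic, not the paper's, and the out-of-sample R² against the historical-average benchmark follows the usual Campbell-Thompson definition.

```python
# Expanding-window out-of-sample forecasting sketch (synthetic data,
# illustrative hyperparameters; not the paper's dataset or tuning).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 10))           # 240 months of 10 predictors
y = X @ rng.normal(size=10) * 0.05 + rng.normal(size=240)   # toy premium

start = 120                               # initial estimation window
forecasts = []
for t in range(start, len(y)):
    model = Lasso(alpha=0.01).fit(X[:t], y[:t])   # window expands monthly
    forecasts.append(model.predict(X[t:t + 1])[0])

# Out-of-sample R^2 vs the historical-average benchmark
bench = [y[:t].mean() for t in range(start, len(y))]
actual = y[start:]
r2_oos = 1 - np.sum((actual - forecasts) ** 2) / np.sum((actual - bench) ** 2)
print(f"out-of-sample R^2: {r2_oos:.3f}")
```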

Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains

Procedia PDF Downloads 98
696 Simulation Aided Life Cycle Sustainability Assessment Framework for Manufacturing Design and Management

Authors: Mijoh A. Gbededo, Kapila Liyanage, Ilias Oraifige

Abstract:

Decision making for sustainable manufacturing design and management requires critical consideration due to the complexity and partly conflicting issues of economic, social, and environmental factors. Although there are tools capable of assessing a combination of one or two of the sustainability factors, existing frameworks have not adequately integrated all three. A case study and a review of existing simulation applications also show that current approaches lack integration of the sustainability factors. In this paper, we discuss the development of a simulation-based framework to support a holistic assessment of sustainable manufacturing design and management. To achieve this, a strategic approach is introduced to investigate the strengths and weaknesses of the existing decision-support tools. The investigation reveals that Discrete Event Simulation (DES) can serve as a solid base for other life cycle analysis frameworks: the Simio DES application optimizes systems for both economic and competitive advantage; Granta CES EduPack and SimaPro collate data for material flow analysis and environmental life cycle assessment; and social and stakeholder analysis is supported by the Analytic Hierarchy Process (AHP), a multi-criteria decision analysis method. Such a common, integrated framework creates a platform for companies to build a computer simulation model of a real system and assess the impact of alternative solutions before implementing a chosen one.
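
For the AHP leg of the framework, criteria weights come from the principal eigenvector of a pairwise-comparison matrix, as in this small sketch; the judgments below are invented, while the n = 3 random index of 0.58 is Saaty's standard value.

```python
# AHP weight computation (pairwise judgments invented for illustration).
import numpy as np

# Pairwise comparisons on the Saaty 1-9 scale:
# economic vs environmental vs social criteria.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalized criteria weights
print(dict(zip(["economic", "environmental", "social"], w.round(3))))

# Consistency check: CI = (lambda_max - n) / (n - 1), RI = 0.58 for n = 3.
ci = (eigvals.real[k] - 3) / 2
print(f"CR = {ci / 0.58:.3f}")            # CR < 0.1 indicates consistency
```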

Keywords: discrete event simulation, life cycle sustainability analysis, manufacturing, sustainability

Procedia PDF Downloads 273
695 Quality Approaches for Mass-Produced Fashion: A Study in Malaysian Garment Manufacturing

Authors: N. J. M. Yusof, T. Sabir, J. McLoughlin

Abstract:

The garment manufacturing industry involves sequential processes that are subject to uncontrollable variations. The industry depends on the skill of labour in handling a variety of fabrics and accessories and operating machines, as well as on a complicated sewing operation. For these reasons, garment manufacturers have created systems to monitor and control product quality regularly, deploying quality approaches to minimize variation. The aims of this research were to ascertain the quality approaches deployed by Malaysian garment manufacturers in three key areas: quality systems and tools; quality control and types of inspection; and the sampling procedures chosen for garment inspection. This research also aimed to distinguish the quality approaches used by companies that supply finished garments to domestic and to international markets. Feedback from each company's representatives was obtained using an online survey, which comprised five sections and 44 questions on the organizational profile and the quality approaches used in the garment industry. The results revealed that almost all companies had established their own mechanisms of process control by conducting a series of quality inspections on daily production, whether formally set up or not. Quality inspection is the predominant quality control activity in garment manufacturing, and the level of complexity of these activities is substantially dictated by the customers. AQL-based sampling is utilized by companies dealing with the export market, whilst almost all companies that concentrate only on the domestic market are comfortable using their own sampling procedures for garment inspection. This research provides an insight into the implementation of quality approaches that are perceived as important and useful in the garment manufacturing sector, which is truly labour-intensive.
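
The statistics behind AQL-based sampling can be made concrete with the binomial acceptance probability of a single sampling plan; the plan below (n = 125, c = 7) is an illustrative assumption, not a value taken from the ISO 2859-1 tables.

```python
# Probability of accepting a lot under a single sampling plan: a lot is
# accepted if at most c defective items appear in a sample of n.
from math import comb

def accept_prob(n, c, p):
    """P(accept) = P(defects <= c) in a sample of n at true defect rate p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

n, c = 125, 7          # sample size and acceptance number (assumed plan)
for p in (0.025, 0.04, 0.065):
    print(f"defect rate {p:.1%}: P(accept) = {accept_prob(n, c, p):.2f}")
```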

Keywords: garment manufacturing, quality approaches, quality control, inspection, Acceptance Quality Limit (AQL), sampling

Procedia PDF Downloads 434
694 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method

Authors: Angel G. De Leon Hernandez

Abstract:

A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles, and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic, and maintainable system, considering the constraints imposed in each case. With the advances in technology during the last decades, the capability of solving engineering problems has increased enormously. Nowadays, computers play a critical role in structural analysis; unfortunately, for university students the vast majority of this software is inaccessible due to the high complexity and cost it represents, even when the software manufacturers offer student versions. This is exactly what motivated the idea of developing a more accessible and easy-to-use computing tool. The program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument for achieving a better understanding of this area and avoiding tedious calculations. The program can also be useful for graduate engineers in the field of structural design and analysis. A graphical user interface is included to make the program even simpler to operate and to clarify the information requested and the results obtained. The present document includes the theoretical basis on which the program solves the structural analysis, the logical path followed in developing the program, the theoretical results, a discussion of those results, and their validation.
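
The heart of the stiffness matrix method the program implements fits in a short script; the sketch below uses Python rather than the paper's MATLAB, and the toy two-bar truss, section, and load are assumptions for illustration.

```python
# Core of the stiffness matrix method for 2D trusses (Python sketch of the
# same method the paper implements in MATLAB; geometry and load invented):
# assemble the global matrix from bar elements and solve K u = F.
import numpy as np

def bar_stiffness(x1, y1, x2, y2, EA):
    """4x4 stiffness matrix of a 2D truss bar in global coordinates."""
    L = np.hypot(x2 - x1, y2 - y1)
    c, s = (x2 - x1) / L, (y2 - y1) / L
    return EA / L * np.array([[ c*c,  c*s, -c*c, -c*s],
                              [ c*s,  s*s, -c*s, -s*s],
                              [-c*c, -c*s,  c*c,  c*s],
                              [-c*s, -s*s,  c*s,  s*s]])

# Toy 3-node, 2-bar truss: nodes 0 and 1 pinned, node 2 loaded downward.
nodes = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.0]])
bars, EA = [(0, 2), (1, 2)], 210e9 * 1e-4      # steel, 1 cm^2 section

K = np.zeros((6, 6))                            # 2 DOFs per node
for i, j in bars:
    dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
    K[np.ix_(dofs, dofs)] += bar_stiffness(*nodes[i], *nodes[j], EA)

F = np.zeros(6); F[5] = -10e3                   # 10 kN down at node 2
free = [4, 5]                                   # only node 2 is unconstrained
u = np.linalg.solve(K[np.ix_(free, free)], F[free])
print(u)                                        # [ux2, uy2] in metres
```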

Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming

Procedia PDF Downloads 114