Search results for: turn-over time
15016 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation
Authors: Mahmut Yildirim
Abstract:
This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM was developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. Owing to its strong classification ability, Bi-LSTM is considered in this paper as an alternative to the maximum likelihood (ML) algorithm used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and classic OFDM-AIM with ML-based signal detection, using bit error rate (BER) and computational time as criteria. Simulation results show that Bi-DeepAIM achieves better BER performance than DeepAIM and lower signal-detection computation time than ML-based OFDM-AIM.
Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection
Procedia PDF Downloads 72
15015 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The motivating question of this work is how many devote themselves to discovery in a world of science where much is discerned and revealed, but at the same time much remains unknown. Methods: The building blocks of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. According to the given key, the string is divided into several substrings, each of length k characters. The next step encodes each substring in the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, k is incremented by 1 when moving to the next substring in the list; when the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in this way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the length of a substring is incompatible with the expected length. The algorithm is simple to implement, but it remains an open question whether it outperforms the other methods in execution time and storage space.
Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison
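As a sketch, the block-wise Caesar scheme described above might be implemented as follows. The random key draw, the padding convention, and the handling of non-alphabetic characters are assumptions not fixed by the abstract; here the initial key k0 doubles as the block length, and the shift grows by one per block, resetting once it exceeds b + 1:

```python
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def _shift(block, k):
    # Caesar-shift alphabetic characters by k, leaving others unchanged
    return "".join(
        ALPHABET[(ALPHABET.index(c) + k) % 26] if c in ALPHABET else c
        for c in block
    )

def encipher(text, k0, b):
    # split into blocks of k0 characters; the shift starts at k0,
    # grows by 1 per block, and resets to k0 once it exceeds b + 1
    out, k = [], k0
    for i in range(0, len(text), k0):
        out.append(_shift(text[i:i + k0], k))
        k = k0 if k + 1 > b + 1 else k + 1
    return "".join(out)

def decipher(text, k0, b):
    # mirror of encipher with the shifts negated
    out, k = [], k0
    for i in range(0, len(text), k0):
        out.append(_shift(text[i:i + k0], -k))
        k = k0 if k + 1 > b + 1 else k + 1
    return "".join(out)
```

Because the shift schedule is deterministic given k0 and b, deciphering simply replays the same schedule with negated shifts.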
Procedia PDF Downloads 103
15014 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data
Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello
Abstract:
Although inventories are often viewed as stocks of money sitting on shelves, they are needed to secure constant and continuous production. Companies therefore need control over inventory levels in order to balance excess against shortage. Classifying items by criteria such as price, usage rate, and lead time before arrival allows a company to concentrate its inventory investment according to a ranking or priority of items, which makes inventory management decisions easier and more justifiable. The purpose of this paper is to present a new approach, called Logical Analysis of Data (LAD), for classifying new items based on already existing criteria. LAD is used here to assist the process of ABC item classification based on multiple criteria. It is a data mining technique, grounded in Boolean theory, that is used for pattern recognition and has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained using the well-known AHP and ANN techniques. The results show that LAD achieved very good classification accuracy.
Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification
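For context, a conventional multi-criteria ABC split, the kind of baseline against which pattern-based methods such as LAD are typically compared, can be sketched as a weighted score over the criteria named above (price, usage rate, lead time). The weights and the 20%/50% class cut-offs below are illustrative assumptions, not the paper's method:

```python
def abc_classify(items, weights, cuts=(0.2, 0.5)):
    # items: {name: (price, usage_rate, lead_time)}; weights should sum to 1
    scores = {name: sum(w * c for w, c in zip(weights, crit))
              for name, crit in items.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    classes = {}
    for i, name in enumerate(ranked):
        frac = (i + 1) / n  # cumulative rank fraction
        classes[name] = "A" if frac <= cuts[0] else "B" if frac <= cuts[1] else "C"
    return classes
```

The top-ranked fraction of items becomes class A (highest-priority investment), the next tier B, and the remainder C.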
Procedia PDF Downloads 881
15013 Application of GIS-Based Construction Engineering: An Electronic Document Management System
Authors: Mansour N. Jadid
Abstract:
This paper describes the implementation of a GIS to provide decision support for monitoring the movement and storage of materials, ensuring that finished products travel from the point of origin to the destination construction site through a supply-chain management (SCM) system. The system supports the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination, reducing construction costs, minimizing time, and enhancing productivity. Such systems are valuable to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, mapping the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, the study illustrates the use of a GIS-based SCM as a collaborative tool, including innovative methods for implementing Web mapping services and aspects of their integration, by generating an interactive GIS platform for the construction industry.
Keywords: construction, coordinate, engineering, GIS, management, map
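The shortest-path routing mentioned above is typically computed with Dijkstra's algorithm over a road network. A minimal sketch over a toy supplier network follows; the node names and distances are illustrative, not the paper's data:

```python
import heapq

def shortest_path(graph, src, dst):
    # Dijkstra's algorithm; graph: {node: [(neighbor, distance), ...]}
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # reconstruct the route by walking predecessors back from dst
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]
```

In a GIS-backed system the edge weights would come from geodetic road distances rather than the constants used here.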
Procedia PDF Downloads 303
15012 Experimental Study of the Fiber Dispersion of Pulp Liquid Flow in Channels with Application to Papermaking
Authors: Masaru Sumida
Abstract:
This study explored the feasibility of improving the hydraulic headbox of papermaking machines by studying the flow of wood-pulp suspensions behind a flat plate inserted in parallel and convergent channels. Pulp fiber concentrations in the wake downstream of the plate were investigated by flow visualization and optical measurements. Changes in the time-averaged fiber concentration and its fluctuations along the flow direction were examined, as was the control of the flow characteristics in the two channels. The behaviors of the pulp fibers and the wake flow were found to be strongly related to the flow states in the upstream passages partitioned by the plate. The distribution of the fiber concentration was complex because of the formation of a thin water layer on the plate and the generation of Kármán vortices at the trailing edge of the plate. Compared with the flow in the parallel channel, fluctuations in the fiber concentration decreased in the convergent channel. However, at low flow velocities, the convergent channel has only a weak effect on equilibrating the time-averaged fiber concentration. This shows that a rectangular trailing edge cannot adequately disperse pulp suspensions; thus, at low flow velocities, a convergent channel is ineffective in ensuring uniform fiber concentration.
Keywords: fiber dispersion, headbox, pulp liquid, wake flow
Procedia PDF Downloads 386
15011 Leça da Palmeira Revisited: Sixty-Seven Years of Recurring Work by Álvaro Siza
Authors: Eduardo Jorge Cabral dos Santos Fernandes
Abstract:
Over the last sixty-seven years, Portuguese architect Álvaro Siza Vieira designed several interventions for the Leça da Palmeira waterfront. With this paper, we aim to analyze the history of this set of projects in a chronological approach, seeking to understand the connections that can be established between them. Born in Matosinhos, a fishing and industrial village located near Porto, Álvaro Siza built a remarkable relationship with Leça da Palmeira (a neighboring village located to the north) from a personal and professional point of view throughout his life: it was there that he got married (in the small chapel located next to the Boa Nova lighthouse) and it was there that he designed his first works of great impact, the Boa Nova Tea House and the Ocean Swimming Pool, today classified as national monuments. These two works were the subject of several projects spaced over time, including recent restoration interventions designed by the same author. However, the marks of Siza's intervention in this territory are not limited to these two cases; there were other projects designed for this territory, which we also intend to analyze: the monument to the poet António Nobre (1967-80), the unbuilt project for a restaurant next to Piscina das Marés (presented in 1966 and redesigned in 1993), the reorganization of the Avenida da Liberdade (with a first project, not carried out, in 1965-74, and a reformulation carried out between 1998 and 2006) and, finally, the project for the new APDL facilities, which completes Avenida da Liberdade to the south (1995). Altogether, these interventions are so striking in this territory, from a landscape, formal, functional, and tectonic point of view, that it is difficult to imagine this waterfront without their presence. In all cases, the relationship with the site explains many of the design options. 
Time after time, the conditions of the pre-existing territory (also affected by the previous interventions of Siza) were considered, so each project created a new circumstance, conditioning the following interventions. This paper is part of a more comprehensive project, which aims to analyze the work of Álvaro Siza in its fundamental relationship with the site.
Keywords: Álvaro Siza, contextualism, Leça da Palmeira, landscape
Procedia PDF Downloads 32
15010 Spectrophotometric Determination of Photohydroxylated Products of Humic Acid in the Presence of Salicylate Probe
Authors: Julide Hizal Yucesoy, Batuhan Yardimci, Aysem Arda, Resat Apak
Abstract:
Humic substances produce reactive oxygen species such as hydroxyl, phenoxy, and superoxide radicals through oxidation over a wide pH and reduction-potential range. Hydroxyl radicals, produced in the presence of reducing agents such as antioxidants and/or peroxides, attack a salicylate probe and form 2,3-dihydroxybenzoate, 2,4-dihydroxybenzoate, and 2,5-dihydroxybenzoate species, which are quantitatively determined using an HPLC method. Humic substances undergo photodegradation under UV radiation and, as a result of their antioxidant properties, produce hydroxyl radicals. In the presence of a salicylate probe, these hydroxyl radicals react with salicylate molecules to form hydroxylated products (dihydroxybenzoate isomers). In this study, humic acid was photodegraded in a photoreactor at 254 nm (400 W), and the hydroxyl radicals formed were captured by the salicylate probe. The total concentration of hydroxylated salicylate species was measured using the spectrophotometric CUPRAC method. In addition, using the results of time-dependent experiments, the kinetics of photohydroxylation were determined at different pH values. This method has been applied for the first time to measure the concentration of hydroxylated products, and it yields results more easily than the HPLC method.
Keywords: CUPRAC method, humic acid, photohydroxylation, salicylate probe
Procedia PDF Downloads 206
15009 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario
Authors: Adel Gurel, Ozge Ceylin Yildirim
Abstract:
Nowadays, design and architecture are affected by, and change with, rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design: integrating design into the computational environment has revolutionized architecture and opened new perspectives. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analyzing the integration between technology and the history of the architectural process makes it possible to build a consensus on how architecture should proceed. In this study, each period that arises from the integration of technology into architecture is addressed within its historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments in technology, and its use in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation via technology are supported with a detailed literature review and consolidated by examining focal points of 20th-century architecture under the titles of parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research that developments in architecture cannot keep up with advancements in technology; recent technological developments overshadow architecture, and technology even decides its direction. As a result, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.
Keywords: computer technologies, future architecture, scientific developments, transformation
Procedia PDF Downloads 192
15008 Artificial Neurons Based on Memristors for Spiking Neural Networks
Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi
Abstract:
Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power consumption, and outstanding nonlinearity, memristors have attracted increasing attention for realizing SNNs. However, fabricating a low-power, robust memristor-based spiking neuron without extra electrical components remains a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits and use it to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from an Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient route to hardware neuromorphic computing systems.
Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spike-timing-dependent plasticity
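The LIF dynamics that the TS memristor emulates can be sketched in a few lines of discrete-time simulation. The leak factor, threshold, and reset-to-zero convention below are illustrative assumptions, not the device's measured parameters:

```python
def lif_spikes(current, v_th=1.0, leak=0.9, steps=100):
    # discrete-time leaky integrate-and-fire: the membrane potential
    # decays by `leak` each step, integrates the input current, and
    # fires (recording the step index) then resets when it crosses v_th
    v, spikes = 0.0, []
    for t in range(steps):
        v = leak * v + current
        if v >= v_th:
            spikes.append(t)
            v = 0.0
    return spikes
```

A sub-threshold input (steady-state potential below v_th) never fires, while stronger inputs fire at a higher rate, which is the rate-coding behavior such neurons provide to an SNN.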
Procedia PDF Downloads 134
15007 Insight into Localized Fertilizer Placement in Major Cereal Crops
Authors: Solomon Yokamo, Dianjun Lu, Xiaoqin Chen, Huoyan Wang
Abstract:
The prevailing ‘high input-high output’ nutrient management model, based on homogeneous spreading over the entire soil surface, remains a key challenge in China’s farming systems, leading to low fertilizer use efficiency and environmental pollution. Localized placement of fertilizer (LPF) in crop root zones has been proposed as a viable approach to boost crop production while reducing environmental pollution. To assess the potential benefits of LPF for three major crops (wheat, rice, and maize), a comprehensive meta-analysis was conducted, encompassing 85 field studies published from 2002 to 2023. We further validated the practicability and feasibility of one-time root-zone N management based on LPF for the three field crops. The meta-analysis revealed that LPF significantly increased the yields of the selected crops (13.62%) and nitrogen recovery efficiency (REN) (33.09%) while reducing cumulative nitrous oxide (N₂O) emission (17.37%) and ammonia (NH₃) volatilization (60.14%) compared with conventional surface application (CSA). Higher grain yield and REN were achieved with an optimal fertilization depth (FD) of 5-15 cm, moderate N rates, combined NPK application, one-time deep fertilization, and coarse-textured, slightly acidic soils. Field validation experiments showed that localized one-time root-zone N management without topdressing increased maize (6.2%), rice (34.6%), and wheat (2.9%) yields while saving N fertilizer (3%), and also increased net economic benefits (23.71%) compared with CSA. A soil incubation study further demonstrated the potential of LPF to enhance the retention and availability of mineral N in the root zone over an extended period. Thus, LPF could be an important fertilizer management strategy and should be extended to other less-developed and developing regions to achieve the triple benefit of food security, environmental quality, and economic gains.
Keywords: grain yield, LPF, NH₃ volatilization, N₂O emission, N recovery efficiency
Procedia PDF Downloads 20
15006 Eradicating Micronutrient Deficiency through Biofortification
Authors: Ihtasham Hamza
Abstract:
In the contemporary world, where the West is afflicted by diseases of excess nutrition, much of the rest of the globe suffers from hunger. A troubling component of hunger is micronutrient deficiency, also called hidden hunger. Heavy dependence on calorie-rich diets and low dietary diversification are responsible for high malnutrition rates, especially in African and Asian countries. The dilemma is not without solutions. With the principal cause identified as sole dependence on staple foods, biofortification has emerged as a novel tool to confront the widely distributed threat of hidden hunger. Biofortification promises better nutritional accessibility for communities, overcoming various obstacles to reach the doorstep. Biofortified crops offer a rural-based intervention that, as proposed, primarily reaches the more remote populations who make up a majority of the malnourished in many countries, and then spreads to urban populations as production surpluses are marketed. Initial investments in agricultural research at a central location can generate high recurrent benefits at low cost, as adapted biofortified cultivars become widely available across countries over time. Supplementation, by contrast, is comparatively expensive and requires continued financing, which may be jeopardized by fluctuating political interest.
Keywords: biofortified crops, hunger, malnutrition, agricultural practices
Procedia PDF Downloads 288
15005 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria
Authors: Ibrahim Rabiu Darazo
Abstract:
This research was conducted on the impact of e-banking on the operational efficiency of banks in Nigeria, taking as a case study three selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc). It is a quantitative study using both primary and secondary sources of data. Questionnaires were used to obtain primary data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using the chi-square test, while secondary data were obtained from relevant textbooks, journals, and websites. The findings make clear that the use of e-banking has improved the efficiency of these banks: services are provided to customers electronically through Internet banking, telephone banking, and ATMs; the time taken to serve customers is reduced; new customers can open accounts online; and customers have access to their accounts at all times (24/7). E-banking also provides access to customer information from the database, and the costs of cheques and postage have been eliminated. The recommendations at the end of the research include: banks should keep their electronic equipment up to date; e-fraud (internal and external) should be controlled; banks should employ qualified staff; and biometric ATMs should be introduced to reduce ATM-card fraud, as is done in other countries such as the USA.
Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs
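The chi-square analysis referred to above computes, for a contingency table of responses, the statistic chi² = Σ (O − E)² / E over all cells, where E is the expected count under independence. A plain-Python sketch follows; the 2x2 table in the usage example is illustrative, not the study's data:

```python
def chi_square_stat(observed):
    # observed: contingency table as a list of rows (e.g. groups x responses)
    row_tot = [sum(r) for r in observed]
    col_tot = [sum(c) for c in zip(*observed)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, o in enumerate(row):
            e = row_tot[i] * col_tot[j] / total  # expected under independence
            stat += (o - e) ** 2 / e
    return stat
```

A perfectly independent table yields a statistic of zero; the larger the statistic relative to the critical value for the table's degrees of freedom, the stronger the evidence of association.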
Procedia PDF Downloads 333
15004 Identification System for Grading Banana in Food Processing Industry
Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan
Abstract:
In the food industry, high-quality production is required within a limited time to meet demand. In this research work, we developed a model that can replace the human operator, whose output in production is low and whose decisions are slow because of individual differences in judging defective versus healthy bananas. The model performs the visual task of the human operator in deciding whether a banana is defective or healthy for food production. The work is divided into two phases. The first is an image processing phase, in which techniques such as colour conversion, edge detection, thresholding, and morphological operations are employed to extract features for training and testing the network. These features are then used in the second, classification phase, in which a multilayer perceptron trained with backpropagation is employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained within a limited processing time, which makes the system suitable for use in the food industry.
Keywords: banana, food processing, identification system, neural network
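As an illustration of the thresholding stage of such a pipeline, a grayscale image (here a nested list of 0-255 intensities) can be binarized, and a simple dark-area feature, a crude proxy for bruised regions, extracted from it. The threshold value and the defect criterion are assumptions for the sketch, not the paper's parameters:

```python
def binarize(gray, threshold=128):
    # mark pixels at or above the threshold as 1 (bright), others as 0
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def dark_fraction(gray, threshold=128):
    # fraction of dark pixels: a crude proxy for bruised/defective area
    mask = binarize(gray, threshold)
    flat = [px for row in mask for px in row]
    return 1.0 - sum(flat) / len(flat)
```

In the full system, features like this (alongside edge and morphological descriptors) would form the input vector to the multilayer perceptron classifier.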
Procedia PDF Downloads 471
15003 Plackett-Burman Design to Evaluate the Influence of Operating Parameters on Anaerobic Orthophosphate Release from Enhanced Biological Phosphorus Removal Sludge
Authors: Reza Salehi, Peter L. Dold, Yves Comeau
Abstract:
The aim of the present study was to investigate the effect of six operating parameters, pH (X1), temperature (X2), stirring speed (X3), chemical oxygen demand (COD) (X4), volatile suspended solids (VSS) (X5), and time (X6), on anaerobic orthophosphate release from enhanced biological phosphorus removal (EBPR) sludge. An 8-run Plackett-Burman design was applied, and statistical analysis of the experimental data was performed using the Minitab 16.2.4 software package. The analysis of variance (ANOVA) results revealed that temperature, COD, VSS, and time had significant effects, with p-values below 0.05, whereas pH and stirring speed were identified as non-significant parameters that nevertheless influenced orthophosphate release. The first-order multiple linear regression model relating orthophosphate release from the EBPR sludge (Y) to the operating parameters (X1-X6) was Y = 18.59 + 1.16X1 - 3.11X2 - 0.81X3 + 3.79X4 + 9.89X5 + 4.01X6. The model p-value and coefficient of determination (R²) were 0.026 and 99.87%, respectively, indicating that the model is significant and that the predicted values of orthophosphate release correlate excellently with the observed values.
Keywords: anaerobic, operating parameters, orthophosphate release, Plackett-Burman design
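The fitted first-order model above can be evaluated directly. Assuming the X values are coded factor levels (as is usual for a Plackett-Burman design), a sketch is:

```python
def orthophosphate_release(x):
    # reported first-order model:
    # Y = 18.59 + 1.16*X1 - 3.11*X2 - 0.81*X3 + 3.79*X4 + 9.89*X5 + 4.01*X6
    coef = (1.16, -3.11, -0.81, 3.79, 9.89, 4.01)
    return 18.59 + sum(c * xi for c, xi in zip(coef, x))
```

At the center point (all factors at 0) the model returns the intercept, 18.59; raising VSS (X5) alone by one coded unit adds 9.89, the largest single-factor effect, consistent with VSS dominating the release.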
Procedia PDF Downloads 279
15002 Response Surface Methodology to Obtain Disopyramide Phosphate Loaded Controlled Release Ethyl Cellulose Microspheres
Authors: Krutika K. Sawant, Anil Solanki
Abstract:
The present study deals with the preparation and optimization of ethyl cellulose microspheres loaded with disopyramide phosphate, using the solvent evaporation technique. A central composite design, consisting of a two-level full factorial design superimposed on a star design, was employed to optimize the preparation of the microspheres. The drug:polymer ratio (X1) and stirrer speed (X2) were chosen as the independent variables, and the cumulative release of the drug at different times (2, 6, 10, 14, and 18 hr) was selected as the dependent variable. An optimal polynomial equation was generated for predicting the response variable at 10 hr. Based on the results of multiple linear regression analysis and F statistics, it was concluded that sustained action can be obtained when X1 and X2 are kept at high levels; the X1X2 interaction was found to be statistically significant. The drug release pattern fitted the Higuchi model well. The data of a selected batch were subjected to an optimization study using a Box-Behnken design, and an optimal formulation was fabricated. Good agreement was observed between the predicted and observed dissolution profiles of the optimal formulation.
Keywords: disopyramide phosphate, ethyl cellulose, microspheres, controlled release, Box-Behnken design, factorial design
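The Higuchi model mentioned above relates cumulative release Q to the square root of time, Q = k_H · √t. A least-squares fit of the rate constant through the origin can be sketched as follows; the data points in the usage example are synthetic, not the study's dissolution data:

```python
import math

def higuchi_k(times, release):
    # least-squares slope through the origin for Q = k_H * sqrt(t)
    xs = [math.sqrt(t) for t in times]
    return sum(x * q for x, q in zip(xs, release)) / sum(x * x for x in xs)
```

A release profile that is linear in √t (as here, where Q doubles when √t doubles) is the signature of Higuchi-type matrix diffusion.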
Procedia PDF Downloads 458
15001 Analysis of Residents’ Travel Characteristics and Policy Improving Strategies
Authors: Zhenzhen Xu, Chunfu Shao, Shengyou Wang, Chunjiao Dong
Abstract:
To improve the satisfaction of residents' travel, this paper analyzes the characteristics and influencing factors of urban residents' travel behavior. First, a multinomial logit (MNL) model is built to analyze the characteristics of residents' travel behavior, reveal the influence of individual attributes, family attributes, and travel characteristics on the choice of travel mode, and identify the significant factors, from which suggestions for policy improvement are put forward. Then, support vector machine (SVM) and multi-layer perceptron (MLP) models are introduced to evaluate the policy effect. Futian Street in Futian District, Shenzhen City, was selected for investigation. The results show that gender, age, education, income, number of cars owned, travel purpose, departure time, journey time, travel distance, and trip frequency all have a significant influence on residents' choice of travel mode. Based on these results, two policy improvement suggestions are put forward, aimed at reducing public transportation and non-motor-vehicle travel times, and the policy effect is evaluated. Before the evaluation, the prediction performance of the MNL, SVM, and MLP models was assessed; after parameter optimization, their prediction accuracies were 72.80%, 71.42%, and 76.42%, respectively. The MLP model, with the highest prediction accuracy, was selected to evaluate the effect of policy improvement. The results showed that after implementation of the policy, the share of public transportation in plans 1 and 2 increased by 14.04% and 9.86%, respectively, while the share of private cars decreased by 3.47% and 2.54%, respectively. The proportion of car trips decreased markedly while the proportion of public transport trips increased. The measures can therefore be considered to have a positive effect on promoting green travel and improving the satisfaction of urban residents, and they can provide a reference for relevant departments in formulating transportation policies.
Keywords: neural network, travel characteristics analysis, transportation choice, travel sharing rate, traffic resource allocation
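At the core of an MNL mode-choice model, each traveler's probability of choosing a mode is a softmax over the modes' utilities (themselves linear in the individual, family, and trip attributes). A minimal sketch, with illustrative utility values:

```python
import math

def mnl_probabilities(utilities):
    # multinomial logit: P(mode i) = exp(U_i) / sum_j exp(U_j)
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]
```

Two modes with equal utility split the probability evenly; policy measures that raise the utility of public transport (e.g., by cutting travel time) shift the predicted mode shares accordingly.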
Procedia PDF Downloads 138
15000 The Relationship between Spanish Economic Variables: Evidence from the Wavelet Techniques
Authors: Concepcion Gonzalez-Concepcion, Maria Candelaria Gil-Fariña, Celina Pestano-Gabino
Abstract:
We analyze six relevant economic and financial variables for the Spanish economy over the period 2000M1-2015M3: a financial index (IBEX35), a commodity (the crude oil price in euros), a foreign exchange rate (EUR/USD), a bond (the Spanish 10-year bond), the Spanish national debt, and the Consumer Price Index. The goal of this paper is to analyze the main relations between them by computing the wavelet power spectrum and the cross-wavelet coherency associated with Morlet wavelets. Using a specialized toolbox in MATLAB, we focus our interest on the period variable, decomposing the time-frequency effects so as to improve the interpretation of the results for users who are not experts in wavelet theory. The empirical evidence shows certain periods of instability and reveals various changes and breaks in the causality relationships over the sample. The variables were also individually analyzed with Daubechies wavelets to visualize high-frequency variance, seasonality, and trend. The results are included in the Proceedings of the 20th International Academic Conference, 2015, International Institute of Social and Economic Sciences (IISES), Madrid.
Keywords: economic and financial variables, Spain, time-frequency domain, wavelet coherency
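As a minimal illustration of the discrete wavelet decomposition behind the trend/seasonality analysis, one level of the Haar transform (the simplest member of the Daubechies family) splits a series into approximation (trend) and detail (high-frequency) coefficients. The signal below is synthetic, not the paper's data:

```python
import math

def haar_dwt(x):
    # one level of the Haar wavelet transform (Daubechies db1):
    # normalized pairwise sums (approximation) and differences (detail)
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail
```

Locally constant stretches of the series produce zero detail coefficients, and the transform preserves the signal's energy, which is what makes the variance decomposition across scales meaningful.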
Procedia PDF Downloads 240
14999 Design, Optimize the Damping System for Optical Scanning Equipment
Authors: Duy Nhat Tran, Van Tien Pham, Quang Trung Trinh, Tien Hai Tran, Van Cong Bui
Abstract:
In recent years, artificial intelligence and the Internet of Things have advanced significantly, and the collection of image data with real-time analysis and processing has become increasingly common in many aspects of life. Optical scanning devices are widely used to observe and analyze different environments, whether fixed outdoors, mounted on mobile devices, or carried by unmanned aerial vehicles. As a result, the interaction between these devices and the physical environment has become more critical in terms of safety. Two common approaches to these challenges are active and passive protection; each has its advantages and disadvantages, but combining both can yield higher efficiency. One solution is to use direct-drive motors for position control, with real-time feedback within the operational range to determine appropriate control parameters with high precision. If the maximum motor torque is smaller than the inertial torque and the rotor reaches the operational limit, a spring system absorbs the impact force. Numerous experiments have been conducted to demonstrate the effectiveness of this protection during operation.
Keywords: optical device, collision safety, collision absorption, precise mechanics
Procedia PDF Downloads 63
14998 Microstructural and Tribological Properties of Thermally Sprayed High Entropy Alloys Coating
Authors: Abhijith N. V., Abhijit Pattnayak, Deepak Kumar
Abstract:
A group of alloys known as high entropy alloys (HEAs) has recently attracted attention because of their excellent properties. However, the fabrication of HEAs typically requires multistage techniques, including milling, sieving, compaction, sintering, and processing in inert media. These processes are laborious, costly, time-consuming, and unsuitable for commercial application. This study adopted a single-stage HVOF thermal-spray process to develop an HEA coating on SS304L substrates. The wear behavior of the deposited coating was explored for different feedstock milling durations (5 h, 10 h, and 15 h). The effects of feedstock preparation, microstructure, surface chemistry, and mechanical and metallurgical properties on wear resistance were also investigated. The microstructure and composition of both coating and feedstock were evaluated by scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS), and the phase distribution was determined by X-ray diffraction (XRD) analysis. The results showed that the coating made from 15 h milled powder exhibited better tribological behavior than the base substrate and the coatings made from 5 h and 10 h milled powders. A chemically stable body-centered cubic (BCC) solid-solution phase formed in the 15 h milled powder coating, which resulted in superior tribological properties.
Keywords: high entropy alloys coating, wear mechanism, HVOF coating, microstructure
Procedia PDF Downloads 9814997 Internal Audit Function Contributions to the External Audit
Authors: Douglas F. Prawitt, Nathan Y. Sharp, David A. Wood
Abstract:
Consistent with prior experimental and survey studies, we find that IAFs that spend more time directly assisting the external auditor are associated with lower external audit fees. Interestingly, we do not find evidence that external auditors reduce fees based on work previously performed by the IAF. We also find that time spent assisting the external auditor has a greater negative effect on external audit fees than time spent performing tasks upon which the auditor may rely but that are not performed as direct assistance to the external audit. Our results also show that the proxies used in previous studies to measure this relation are either not associated with, or are negatively associated with, our direct measures of how the IAF can contribute to the external audit, and are highly positively associated with the size and complexity of the organization. Thus, we conclude that the disparate experimental and archival results may be attributable to issues surrounding the construct validity of the measures used in previous archival studies, and that when measures similar to those used in experimental studies are employed in archival tests, the archival results are consistent with experimental findings. Our research makes four primary contributions to the literature. First, we provide evidence that internal auditing contributes to a reduction in external audit fees. Second, we replicate and provide an explanation for why previous archival studies find that internal auditing has either no association with external audit fees or is associated with an increase in those fees: prior studies generally use proxies of internal audit contribution that do not adequately capture the intended construct. Third, our research expands on survey-based research by separately examining the impact on the audit fee of internal auditors' work directly assisting external auditors and internal auditors' prior work upon which external auditors can rely.
Finally, we extend prior research by using a new, independent data source to validate and extend prior studies. This data set also allows for a larger sample for examining the impact of internal auditing on the external audit fee, and for a more comprehensive external audit fee model that better controls for determinants of the external audit fee. Keywords: internal audit, contribution, external audit, function
Procedia PDF Downloads 12414996 Stage-Gate Based Integrated Project Management Methodology for New Product Development
Authors: Mert Kıranç, Ekrem Duman, Murat Özbilen
Abstract:
In order to achieve new product development (NPD) activities on time and within budgetary constraints, NPD managers need a well-designed methodology. This study intends to create an integrated project management methodology for those who focus on new product development projects. Within the scope of the study, four different management systems are combined: the schedule-oriented stage-gate method, risk management, change management, and earned value management. The term new product development is quite common in many different industries, such as the defense industry, construction, health care/dental, higher education, fast-moving consumer goods, white goods, electronic devices, marketing and advertising, and software development. All product manufacturers compete with each other to introduce a new product to the market. To produce a more competitive product, an optimum project management methodology is chosen, and this methodology is adapted to the company culture. The right methodology helps the company present the right product to customers at the right time. The benefits of the proposed methodology are demonstrated through an application at a company. As a result, how the integrated methodology improves efficiency and how it achieves project success are unfolded. Keywords: project, project management, management methodology, new product development, risk management, change management, earned value, stage-gate
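Since earned value management is one of the four systems the methodology combines, a brief sketch of its core indicators may help. The function below implements the standard EVM formulas (not code from the study; the interface is illustrative):

```python
def earned_value_metrics(bac, pct_complete, actual_cost, planned_value):
    """Standard Earned Value Management indicators.

    bac: budget at completion (total approved budget)
    pct_complete: fraction of the scope actually done, in [0, 1]
    actual_cost: cost incurred to date (AC)
    planned_value: budgeted cost of work scheduled to date (PV)
    """
    ev = pct_complete * bac          # earned value: budget for work done
    cv = ev - actual_cost            # cost variance (negative = over budget)
    sv = ev - planned_value          # schedule variance (negative = behind)
    cpi = ev / actual_cost           # cost performance index
    spi = ev / planned_value         # schedule performance index
    eac = bac / cpi                  # estimate at completion (CPI trend)
    return {"EV": ev, "CV": cv, "SV": sv,
            "CPI": cpi, "SPI": spi, "EAC": eac}
```

For example, an NPD project with a 100,000 budget, 40% complete but having spent 50,000, has CPI = 0.8 and a trend-based estimate at completion of 125,000.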
Procedia PDF Downloads 31214995 Molecular Detection and Characterization of Infectious Bronchitis Virus from Libya
Authors: Abdulwahab Kammon, Tan Sheau Wei, Abdul Rahman Omar, Abdunaser Dayhum, Ibrahim Eldghayes, Monier Sharif
Abstract:
Infectious bronchitis virus (IBV) is a very dynamic and evolving virus which causes major economic losses to the global poultry industry. Recently, the Libyan poultry industry faced a severe outbreak of respiratory distress associated with high mortality and a dramatic drop in egg production. Tracheal and cloacal swabs were analyzed for several poultry viruses. IBV was detected using SYBR Green I real-time PCR based on the nucleocapsid (N) gene. Sequence analysis of the partial N gene indicated high similarity (~94%) to IBV strain 3382/06, which was isolated from Taiwan. Even though strain 3382/06 is more similar to the Mass-type H120, the isolate has been implicated as an intertypic recombinant of three putative parental IBV strains, namely H120, Taiwan strain 1171/92, and China strain CK/CH/LDL/97I. Complete sequencing and antigenicity studies of the Libyan IBV strains are currently underway to determine the evolution of the virus and its importance for vaccine-induced immunity. In this paper, we document for the first time the presence of a possibly variant IBV strain from Libya, which requires a dramatic change in the vaccination program. Keywords: Libya, infectious bronchitis, molecular characterization, viruses, vaccine
Procedia PDF Downloads 47014994 The Effect of Supplementary Cementitious Materials on Fresh and Hardened Properties of Self-Compacting Concretes
Authors: Akram Salah Eddine Belaidi, Said Kenai, El-Hadj Kadri, Benchaâ Benabed, Hamza Soualhi
Abstract:
Self-compacting concrete (SCC) was developed in the mid-1980s in Japan. SCC flows under its own weight and consolidates itself without any additional compaction energy and without segregation. As an integral part of SCC, self-compacting mortars (SCM) may serve as a basis for the mix design of concrete, since they allow measurement of the rheological properties of SCC. This paper discusses the effect of using natural pozzolana (PZ) and marble powder (MP), in two alternative ratios (PZ/MP = 1 and 1/3), on the performance of SCC. A total of 11 SCCs were prepared with a constant water-binder (w/b) ratio of 0.40 and a total cementitious materials content of 475 kg/m3. The fresh properties were tested via mini-slump flow diameter and mini-V-funnel flow time for the SCMs, and via the slump flow test, L-box height ratio, V-funnel flow time, and sieve stability for the SCCs. Moreover, the development of compressive strength was determined at 3, 7, 28, 56, and 90 days. Test results have shown that the use of ternary blends improved the fresh properties of the mixtures. The 90-day compressive strength of SCC with 30% of PZ and MP was similar to that of ordinary concrete used in situ. Keywords: self-compacting mortar, self-compacting concrete, natural pozzolana, marble powder, rheology, compressive strength
Procedia PDF Downloads 37514993 Assisted Prediction of Hypertension Based on Heart Rate Variability and Improved Residual Networks
Authors: Yong Zhao, Jian He, Cheng Zhang
Abstract:
Cardiovascular diseases caused by hypertension are extremely threatening to human health, and early diagnosis of hypertension can save a large number of lives. Traditional hypertension detection methods require special equipment and have difficulty detecting continuous blood pressure changes. In this regard, this paper first analyzes the principle of heart rate variability (HRV) and introduces a sliding window and power spectral density (PSD) to analyze the time-domain and frequency-domain features of HRV. It then designs an HRV-based hypertension prediction network by combining ResNet, an attention mechanism, and a multilayer perceptron: frequency-domain features are extracted through a modified ResNet18, fused with time-domain features through the attention mechanism, and hypertension is predicted through the multilayer perceptron. Finally, the network was trained and tested using the publicly available SHAREE dataset on PhysioNet, and the test results showed that the network achieved 92.06% prediction accuracy for hypertension and outperformed K-Nearest Neighbor (KNN), Bayes, logistic, and traditional Convolutional Neural Network (CNN) models in prediction performance. Keywords: feature extraction, heart rate variability, hypertension, residual networks
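The HRV feature-extraction step (sliding windows plus PSD-based frequency-domain features) can be sketched as follows. This is a hedged illustration, not the paper's pipeline: it uses a plain periodogram rather than the authors' exact PSD estimator, the LF/HF band limits follow the conventional HRV definitions, and all function names are assumptions:

```python
import numpy as np

def hrv_band_powers(rr_series, fs=4.0):
    """Estimate LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) power of an
    evenly resampled RR-interval series via a simple periodogram."""
    x = np.asarray(rr_series, dtype=float)
    x = x - x.mean()                               # remove DC before the FFT
    n = len(x)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)   # one-sided periodogram
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf, hf

def sliding_windows(x, width, step):
    """Yield fixed-width windows over a sequence (the time-domain side)."""
    for start in range(0, len(x) - width + 1, step):
        yield x[start:start + width]
```

A synthetic 0.1 Hz oscillation (inside the LF band) resampled at 4 Hz should yield LF power well above HF power, which is the kind of band-ratio feature a downstream classifier would consume.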
Procedia PDF Downloads 10614992 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla
Abstract:
Society demands more reliable manufacturing processes capable of producing high-quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, among which Fault-Tolerant Control (FTC) plays a significant role: it can detect, isolate, and adapt a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is given, highlighting the properties a system must ensure to be considered fault-tolerant. In addition, research identifying the main FTC techniques is presented, with a classification based on their characteristics into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprehends the algorithms robust enough to bypass the fault without further modification. The mentioned re-configuration requires two stages: one focused on detection, isolation, and identification of the fault source, and another in charge of re-designing the control algorithm via two approaches, fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows testing the FTC algorithms and analysing how the system responds when a fault arises under conditions similar to those a machine would face on the factory floor. One AFTC approach has been selected as the methodology the system follows in the fault recovery process. In a first instance, the fault will be detected, isolated, and identified by means of a neural network.
In a second instance, the control algorithm will be re-configured to overcome the fault and continue working without human interaction. Keywords: fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time
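The two-stage AFTC recovery process (detect/isolate/identify, then re-configure) can be caricatured in a few lines. The class below is a deliberately minimal stand-in: a residual threshold test replaces the paper's neural-network detector, and a gain switch replaces the actual control re-design; none of it is the authors' implementation:

```python
class ActiveFTC:
    """Minimal active fault-tolerant control loop: a detector flags the
    fault, then the controller is re-configured (fault accommodation)."""

    def __init__(self, nominal_gain, degraded_gain, fault_threshold):
        self.nominal_gain = nominal_gain
        self.degraded_gain = degraded_gain
        self.threshold = fault_threshold
        self.fault_detected = False

    def detect(self, residual):
        # Stage 1 (detection/isolation/identification): here a residual
        # above a threshold stands in for the neural-network classifier.
        if abs(residual) > self.threshold:
            self.fault_detected = True
        return self.fault_detected

    def control(self, error):
        # Stage 2 (re-design): switch to the accommodated controller
        # once a fault has been flagged; no human interaction needed.
        gain = self.degraded_gain if self.fault_detected else self.nominal_gain
        return gain * error
```

In a real loop, `detect` would run on every sample of the plant residual, and the re-configured controller would remain active until the fault is cleared.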
Procedia PDF Downloads 17714991 Project-Bbased Learning (PBL) Taken to Extremes: Full-Year/Full-Time PBL Replacement of Core Curriculum
Authors: Stephen Grant Atkins
Abstract:
Radical use of project-based learning (PBL) in a small New Zealand business school provides an opportunity to longitudinally examine its effects over a decade of pre-Covid data. Prior to this business school's implementation of PBL, starting in 2012, the business pedagogy literature presented just one example of PBL replacing an entire core set of courses. In that instance, a British business school merged four of its 'degree Year 3' accounting courses into one PBL semester. As radical as that would have seemed to students aged 20 to 22, the PBL experiment conducted in this New Zealand business school was notably more extreme: 41 nationally approved learning outcomes (L.O.s), deriving from 8 separate core courses, were aggregated into one grand set of L.O.s and then treated as a single 'full-year'/'full-time' course. The 8 courses in question were all components of this business school's compulsory 'degree Year 1' curriculum. Thus, the students involved were notably younger (ages 17 to 19), and no part-time enrolments were allowed. Of interest are this PBL experiment's effects on subsequent performance outcomes in 'degree Years 2 & 3' (which continued to operate in their traditional ways). Of special interest is the quality of 'group project' outcomes, because traditionally 'degree Year 1' course assessments are only minimally based on group work. This PBL experiment altered that practice radically, such that PBL 'degree Year 1' alumni entered their remaining two years of business coursework with far more 'project group' experience. Timeline-wise, of interest here is, firstly, 'degree Year 2' performance outcome data from 2010-2012 and 2016-2018, and likewise 'degree Year 3' data for 2011-2013 and 2017-2019. Those years provide a pre-and-post comparative baseline for performance outcomes in students never exposed to this school's radical PBL experiment. That baseline is then compared to PBL alumni outcomes (2013-2016, including 'Student Evaluation of Course Quality' outcomes) to clarify 'radical PBL' effects. Keywords: project-based learning, longitudinal mixed-methods, student criticism, effects on learning
Procedia PDF Downloads 9714990 Long-Term Indoor Air Monitoring for Students with Emphasis on Particulate Matter (PM2.5) Exposure
Authors: Seyedtaghi Mirmohammadi, Jamshid Yazdani, Syavash Etemadi Nejad
Abstract:
One of the main indoor air parameters in classrooms is dust pollution, and its effect depends on particle size and exposure duration. However, there is a lack of data about exposure levels to PM2.5 concentrations in rural-area classrooms. The objective of the current study was exposure assessment of PM2.5 for students in classrooms. One year of monitoring was carried out in fifteen schools by time-series sampling to evaluate indoor air PM2.5 in the rural district of Sari city, Iran. A hygrometer and thermometer were used to measure psychrometric parameters (temperature, relative humidity, and wind speed), and a real-time dust monitor (MicroDust Pro, Casella, UK) was used to monitor particulate matter (PM2.5) concentrations. The results show the mean indoor PM2.5 concentration in the studied classrooms was 135 µg/m3. The regression model indicated a positive correlation between indoor PM2.5 concentration and relative humidity, as well as with distance from the city center and classroom size. Meanwhile, the regression model revealed that the indoor PM2.5 concentration, relative humidity, and dry-bulb temperature were significant at the 0.05, 0.035, and 0.05 levels, respectively. A statistical predictive model for indoor PM2.5 concentration was obtained from multiple regression modeling of the indoor psychrometric conditions. Keywords: classrooms, concentration, humidity, particulate matters, regression
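A multiple-regression predictive model of the kind described can be sketched with ordinary least squares. The code below is an illustration only: the predictors, variable names, and synthetic check are assumptions, not the authors' fitted model or data:

```python
import numpy as np

def fit_pm25_model(humidity, temperature, pm25):
    """Ordinary least squares: PM2.5 ~ b0 + b1*RH + b2*T."""
    X = np.column_stack([np.ones(len(pm25)), humidity, temperature])
    coef, *_ = np.linalg.lstsq(X, np.asarray(pm25, dtype=float), rcond=None)
    return coef  # [intercept, RH slope, T slope]

def predict_pm25(coef, humidity, temperature):
    """Point prediction from the fitted coefficients."""
    return coef[0] + coef[1] * humidity + coef[2] * temperature
```

On data generated exactly from a linear relation, the fit recovers the coefficients; on real monitoring data one would additionally check the p-values the abstract reports.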
Procedia PDF Downloads 33514989 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory
Authors: Chiung-Hui Chen
Abstract:
The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately presented in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, services with appropriate decision-making have become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for inference. The statistical analysis was conducted to achieve the following objectives. First, define user behaviors and predict user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework, and establish the sequential intensity among behaviors to characterize the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services. Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object
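The probability module of a hierarchical Hidden Markov Model rests on the standard HMM forward recursion, which scores how likely an observed behavior sequence is under the model. The sketch below shows that flat-HMM building block (the hierarchical variant nests a sub-model inside each state); the matrices are illustrative, not taken from the study:

```python
import numpy as np

def forward(pi, A, B, observations):
    """Forward algorithm: total probability of an observation sequence.

    pi: initial state distribution, shape (n,)
    A:  state transition matrix, shape (n, n)
    B:  emission matrix, shape (n, m), B[s, o] = P(obs o | state s)
    observations: sequence of observation indices
    """
    alpha = pi * B[:, observations[0]]        # initialise with first emission
    for o in observations[1:]:
        alpha = (alpha @ A) * B[:, o]         # propagate, then emit
    return alpha.sum()
```

With two hidden "behaviors" emitting two observable "object uses", the recursion reduces to sums over hidden paths: e.g. starting surely in state 0 with P(obs 0 | state 0) = 0.9, a single observation 0 scores 0.9.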
Procedia PDF Downloads 23314988 Broadening Attentional Scope by Seeing Happy Faces
Authors: John McDowall, Crysta Derham
Abstract:
The broaden-and-build theory of emotion describes how experiencing positive emotions, such as happiness, broadens our 'thought-action repertoire', making us more likely to go out and act on our positive emotions. This results in the building of new relationships, resources, and skills, which we can draw on in times of need throughout life. In contrast, the experience of negative emotion is thought to narrow our 'thought-action repertoire', leading to specific actions that aid survival. Three experiments aimed to explore the effect of briefly presented schematic faces (happy, sad, and neutral) on attentional scope using the flanker task. Based on the broaden-and-build theory, it was hypothesised that reaction times would increase in trials primed with a happy face, due to a broadening of attention leading to increased flanker interference. A decrease in reaction time was predicted for trials primed with a sad face, due to a narrowing of attention leading to less flanker interference. Results lent partial support to the broaden-and-build hypothesis, with reaction times being slower following happy primes in incongruent flanker trials. Recent research is discussed with regard to potential mediators of the relationship between emotion and attention. Keywords: emotion, attention, broaden and build, flanker task
Procedia PDF Downloads 47814987 A Hybrid Data Mining Algorithm Based System for Intelligent Defence Mission Readiness and Maintenance Scheduling
Authors: Shivam Dwivedi, Sumit Prakash Gupta, Durga Toshniwal
Abstract:
It is a challenging task today to keep defence forces in the highest state of combat readiness under budgetary constraints. A huge amount of time and money is squandered on unnecessary and expensive traditional maintenance activities. To overcome this limitation, the Defence Intelligent Mission Readiness and Maintenance Scheduling System has been proposed, which ameliorates the maintenance system by diagnosing equipment condition and predicting maintenance requirements. Based on new data mining algorithms, this system intelligently optimises mission readiness for imminent operations and maintenance scheduling in repair echelons. With modified data mining algorithms such as the Weighted Feature Ranking Genetic Algorithm and an SVM-Random Forest linear ensemble, it improves reliability, availability, and safety, while reducing maintenance cost and Equipment Out of Action (EOA) time. The results clearly show that the introduced algorithms have an edge over conventional data mining algorithms. The system, utilizing the intelligent condition-based maintenance approach, improves the operational and maintenance decision strategy of the defence force. Keywords: condition based maintenance, data mining, defence maintenance, ensemble, genetic algorithms, maintenance scheduling, mission capability
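The abstract does not detail how the SVM-Random Forest ensemble combines its members, but the general voting-ensemble idea can be sketched without any ML library. The functions below combine base-model predictions by plain and weighted majority vote; treating the combination rule as voting is an assumption for illustration only:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine base-model predictions (one list of labels per model)
    by plain majority vote, sample by sample."""
    combined = []
    for votes in zip(*predictions):
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

def weighted_vote(predictions, weights):
    """Weighted variant: each model's vote counts in proportion to a
    weight (e.g. its cross-validated accuracy)."""
    combined = []
    for votes in zip(*predictions):
        tally = {}
        for label, w in zip(votes, weights):
            tally[label] = tally.get(label, 0.0) + w
        combined.append(max(tally, key=tally.get))
    return combined
```

With three models predicting 'fail'/'ok' readiness for three equipment items, weighting can flip a sample where a strong model is outvoted by two weak ones.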
Procedia PDF Downloads 297