Search results for: time constraint
14981 Comparative Assessment of the Thermal Tolerance of Spotted Stemborer, Chilo partellus Swinhoe (Lepidoptera: Crambidae) and Its Larval Parasitoid, Cotesia sesamiae Cameron (Hymenoptera: Braconidae)
Authors: Reyard Mutamiswa, Frank Chidawanyika, Casper Nyamukondiwa
Abstract:
Under stressful thermal environments, insects adjust their behaviour and physiology to maintain key life-history activities and improve survival. For interacting species, whether mutualistic or antagonistic, thermal stress may affect the participants in differing ways, which may in turn affect the outcome of the ecological relationship. In agroecosystems, this may be the fate of relationships between insect pests and their antagonistic parasitoids under acute and chronic thermal variability. Against this background, we investigated the thermal tolerance of different developmental stages of Chilo partellus Swinhoe (Lepidoptera: Crambidae) and its larval parasitoid Cotesia sesamiae Cameron (Hymenoptera: Braconidae) using both dynamic and static protocols. In laboratory experiments, we determined lethal temperatures (upper and lower) using direct plunge protocols in programmable water baths (Systronix Scientific, South Africa); the effects of ramping rate on critical thermal limits following standardized protocols, using insulated double-jacketed chambers (‘organ pipes’) connected to a programmable water bath (Lauda Eco Gold, Lauda DR. R. Wobser GmbH and Co. KG, Germany); supercooling points (SCPs) following dynamic protocols, using a Pico logger connected to a programmable water bath; and heat knock-down time (HKDT) and chill-coma recovery time (CCRT) following static protocols, in climate chambers (HPP 260, Memmert GmbH + Co. KG, Germany) connected to a camera (HD Covert Network Camera, DS-2CD6412FWD-20, Hikvision Digital Technology Co., Ltd, China). When organisms were exposed for two hours to a static temperature, lower lethal temperatures ranged from -9 to 6 ºC, -14 to -2 ºC and -1 to 4 ºC, while upper lethal temperatures ranged from 37 to 48 ºC, 41 to 49 ºC and 36 to 39 ºC for C. partellus eggs, larvae and C. sesamiae adults, respectively. Faster heating rates improved critical thermal maxima (CTmax) in C. partellus larvae and in adult C. partellus and C. sesamiae.
Lower cooling rates improved critical thermal minima (CTmin) in C. partellus and C. sesamiae adults while compromising CTmin in C. partellus larvae. The mean SCPs for C. partellus larvae, pupae and adults were -11.82±1.78, -10.43±1.73 and -15.75±2.47 ºC respectively, with adults having the lowest SCPs. Heat knock-down time and chill-coma recovery time varied significantly between C. partellus larvae and adults. Larvae had a higher HKDT than adults, while the latter recovered significantly faster following chill-coma. The current results suggest developmental-stage differences in C. partellus thermal tolerance (with respect to lethal temperatures and critical thermal limits) and a compromised temperature tolerance of the parasitoid C. sesamiae relative to its host, suggesting potential asynchrony between host and parasitoid population phenology, and consequently reduced biocontrol efficacy, under global change. These results have broad implications for biological pest management and insect-natural enemy interactions under rapidly changing thermal environments.
Keywords: chill-coma recovery time, climate change, heat knock-down time, lethal temperatures, supercooling point
Procedia PDF Downloads 238
14980 Valuation of Entrepreneurship Education (EE) Curriculum and Self-Employment Generation among Graduates of Tertiary Institutions in Edo State, Nigeria
Authors: Angela Obose Oriazowanlan
Abstract:
Despite the introduction of entrepreneurship education into the Nigerian university curriculum to prepare graduates for self-employment roles and thereby abate employment challenges, the graduate unemployment rate still soars. The study therefore examined the relevance of the curriculum contents and of the delivery mechanism for equipping graduates with appropriate entrepreneurial skills prior to graduation. Four research questions and two hypotheses guided the study. The survey research design was adopted. From an effectively infinite population of graduates spanning a five-year period, a sample of 200 was drawn using the simple random sampling technique. A 45-item structured questionnaire was used for data gathering. The data gathered were analysed using the descriptive statistics of mean and standard deviation, while the formulated hypotheses were tested with the Z-score at the 0.05 level of significance. The findings revealed, among others, that graduates' acquisition of appropriate entrepreneurial skills for self-employment generation is low due to curriculum deficiencies, insufficient time allotment, and the delivery mechanism. It was recommended, among others, that the curriculum be reviewed to improve its relevance and that sufficient time be allotted to enable an adequate teaching and learning process.
Keywords: evaluation of entrepreneurship education (EE) curriculum, self-employment generation, graduates of tertiary institutions, Edo state, Nigeria
Procedia PDF Downloads 99
14979 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EV), low-energy-consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several attractive features, such as high temporal resolution (up to 1 Mframe/s) and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity: the sensor only captures pixels whose intensity changes, so no signal is produced in areas without intensity change. This makes the sensor more energy efficient than conventional sensors such as RGB cameras, because redundant data are removed at the source. The flip side of these advantages is that the data are difficult to handle, because the format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal consists of an x-y coordinate, a polarity (+1 or -1) and a timestamp; it carries no intensity such as RGB values. Existing algorithms therefore cannot be used directly, and a new processing algorithm must be designed to cope with DVS data. To overcome the differences in data format, most prior work builds frame data and feeds it to deep learning models such as convolutional neural networks (CNN) for object detection and recognition. However, even with frame data, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as a stand-in for intensity, polarity information is clearly not rich enough. In this context, we propose to use the timestamp information as the data representation fed to deep learning.
Concretely, we first build frame data by dividing the event stream into fixed time periods, then assign an intensity value to each pixel according to the timestamps of its events within each frame; for example, a high value is given to a recent signal. We expected this data representation to capture features of moving objects in particular, because the timestamps encode movement direction and speed. Using this proposed method, we built our own dataset with a DVS fixed on a parked car, to develop an application for a surveillance system that can detect persons around the car. We consider the DVS an ideal sensor for surveillance purposes because it can run for a long time with low energy consumption in a largely static scene. For comparison, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and of our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
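The timestamp-based representation described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the event tuple layout (x, y, polarity, timestamp) follows the abstract, while the linear recency weighting and the `events_to_frame` helper are assumptions made for the example.

```python
def events_to_frame(events, width, height, t_start, t_end):
    """Build one frame from DVS events falling in [t_start, t_end).

    Each pixel stores a recency-weighted intensity in [0, 1]: the most
    recent event at a pixel maps to a value near 1, and older events
    decay linearly toward 0, so the resulting image encodes movement
    direction and speed rather than brightness."""
    frame = [[0.0] * width for _ in range(height)]
    span = float(t_end - t_start)
    for x, y, polarity, t in events:
        if t_start <= t < t_end:
            value = (t - t_start) / span      # newer event -> higher value
            if value > frame[y][x]:           # keep the most recent contribution
                frame[y][x] = value
    return frame

# Two events hit pixel (2, 1); only the newer one (t=90) determines its value.
events = [(2, 1, +1, 10), (2, 1, -1, 90), (0, 0, +1, 50)]
frame = events_to_frame(events, width=4, height=3, t_start=0, t_end=100)
```

A frame built this way can be fed to a CNN in place of a polarity image.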
Procedia PDF Downloads 97
14978 3D Codes for Unsteady Interaction Problems of Continuous Mechanics in Euler Variables
Authors: M. Abuziarov
Abstract:
The designed code complex is intended for the numerical simulation of fast dynamic processes of interaction between heterogeneous media subject to significant deformation. The main challenges in solving such problems are associated with the construction of numerical meshes. Currently, there are two basic approaches. One uses a Lagrangian or Lagrangian-Eulerian grid tied to the boundaries of the media; the other uses a fixed Eulerian mesh whose boundary cells cut across the medium boundaries, requiring the calculation of the cut volumes. Both approaches require complex grid generators and significant time for preparing the code's input data. In these codes, the problems are solved using two grids: a regular fixed Eulerian grid and a mobile local Lagrangian-Eulerian grid (the ALE approach) tracking the contact and free boundaries, the surfaces of shock waves and phase transitions, and other possible features of the solution, with mutual interpolation of the integrated parameters. For modeling liquids, gases, and deformable solids alike, a Godunov scheme of increased accuracy is used in Lagrangian-Eulerian variables, the same for the Euler equations and for the Euler-Cauchy equations describing the deformation of the solid. The increased accuracy of the scheme is achieved by using a 3D space-time-dependent solution of the discontinuity problem (a 3D space-time-dependent Riemann problem solver). The same solution is used to calculate the interaction at the liquid-solid surface (the fluid-structure interaction problem). The codes do not require complex 3D mesh generators: the user supplies only the surfaces of the objects to be calculated, as STL files created with engineering graphics software, which greatly simplifies task preparation and makes the codes convenient for direct use by the designer at the design stage.
The results of test solutions and of applications related to the generation and propagation of detonation and shock waves and the loading of structures are presented.
Keywords: fluid structure interaction, Riemann's solver, Euler variables, 3D codes
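The Godunov approach described above solves a Riemann problem at every cell interface. As a minimal, hedged illustration (1D scalar linear advection rather than the paper's 3D Euler and Euler-Cauchy systems), the exact interface Riemann solution reduces to taking the upwind state:

```python
def godunov_advection(u, a, dt, dx, steps):
    """First-order Godunov scheme for the linear advection equation
    u_t + a u_x = 0 with a > 0 and periodic boundaries. For this
    equation the exact Riemann solution at each cell interface is
    simply the upwind state, so the flux there is a * (upwind u)."""
    n = len(u)
    for _ in range(steps):
        flux = [a * u[i] for i in range(n)]   # flux through the right face of cell i
        u = [u[i] - dt / dx * (flux[i] - flux[i - 1]) for i in range(n)]
    return u

# With CFL = a*dt/dx = 1 the scheme transports the profile exactly, so
# after n steps on a periodic grid of n cells the pulse returns home.
n = 8
u0 = [1.0 if 2 <= i < 4 else 0.0 for i in range(n)]
u1 = godunov_advection(u0, a=1.0, dt=1.0, dx=1.0, steps=n)
```

In the paper's setting, the same idea is applied with a full 3D space-time Riemann solver at each face of the Lagrangian-Eulerian grid.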
Procedia PDF Downloads 439
14977 Water Ingress into Underground Mine Voids in the Central Rand Goldfields Area, South Africa-Fluid Induced Seismicity
Authors: Artur Cichowicz
Abstract:
The last active mine in the Central Rand Goldfields area (50 km x 15 km) ceased operations in 2008. This resulted in the closure of the pumping stations, which had previously maintained the underground water level in the mining voids. As a direct consequence of the water being allowed to flood the mine voids, seismic activity has increased directly beneath the populated area of Johannesburg. Seismicity in the area has been monitored for over five years using a network of 17 strong ground motion sensors. The objective of the project is to improve strategies for mine closure. The evolution of the seismicity pattern was investigated in detail, with special attention given to seismic source parameters such as magnitude, scalar seismic moment and static stress drop. Most events are located within historical mine boundaries. The seismicity pattern shows a strong relationship between the presence of the mining void and high levels of seismicity; no seismicity migration was observed outside the areas of old mining. Seven years after pumping stopped, the evolution of the seismicity indicates that the area is not yet in equilibrium. The level of seismicity does not appear to be decreasing over time, since the number of strong events, with Mw magnitudes above 2, is still as high as it was when monitoring began over five years ago. The average rate of seismic deformation is 1.6x10^13 Nm/year. Constant seismic deformation was not observed over the last five years: the deviation from the average is on the order of 6x10^13 Nm/year, which is significant. The variation of cumulative seismic moment indicates that a constant-deformation-rate model is not suitable. Over the most recent five-year period, the total cumulative seismic moment released in the Central Rand Basin was 9.0x10^14 Nm, equivalent to a single earthquake of magnitude 3.9. This is significantly less than what was experienced during mining operations.
Characterization of seismicity triggered by a rising water level in the area can be achieved through the estimation of source parameters. Static stress drop heavily influences ground motion amplitude, which plays an important role in risk assessments of potential seismic hazards in inhabited areas. The observed static stress drop in this study varied from 0.05 MPa to 10 MPa. It was found that large static stress drops could be associated with both small and large events. The temporal evolution of the inter-event time provides an understanding of the physical mechanisms of earthquake interaction. Changes in the characteristics of the inter-event time are produced when a stress change is applied to a group of faults in the region. Results from this study indicate that the fluid-induced source has a shorter inter-event time in comparison to a random distribution. This behaviour corresponds to a clustering of events, in which short recurrence times tend to be close to each other, forming clusters of events.
Keywords: inter-event time, fluid induced seismicity, mine closure, spectral parameters of seismic source
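The abstract's equivalence between the five-year cumulative moment and a single magnitude-3.9 earthquake can be checked with the standard Hanks-Kanamori moment-magnitude relation (the relation itself is standard seismology; that the authors used exactly this form is an assumption):

```python
import math

def moment_magnitude(m0_newton_metres):
    """Moment magnitude Mw from scalar seismic moment M0 (in N*m),
    using the standard Hanks-Kanamori relation:
        Mw = (2/3) * (log10(M0) - 9.1)"""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Cumulative moment reported for the Central Rand Basin over five years
mw_total = moment_magnitude(9.0e14)   # close to the Mw 3.9 quoted in the abstract
```

Evaluating the formula gives Mw ≈ 3.90, matching the figure in the text.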
Procedia PDF Downloads 285
14976 Impact of Job Crafting on Work Engagement and Well-Being among Indian Working Professionals
Authors: Arjita Jhingran
Abstract:
The pandemic was a turning point for flexible employment. In today's market, employees prefer companies that are flexible and provide the autonomy to change their work environment. After working from home for a long time, employees have become accustomed to modifying, re-designing, and re-aligning their work environment, their tasks, and the way they interact with co-workers based on their preferences. In this scenario, the concept of job crafting has come to the forefront, and research on the subject has expanded, particularly during COVID-19. Managers who provide opportunities to craft the job are driving enhanced engagement and well-being. The current study will examine the impact of job crafting on work engagement and psychological well-being among 385 working professionals aged 21-39 years (M age = 30 years). The study will also draw comparisons between freelancers and full-time employees, as freelancers are considered to have more autonomy over their jobs. A comparison between employees of MNCs and startups will also be studied, as for the majority of startups autonomy is a primary motivator. Moreover, differences based on level of experience will be observed, which will add to the body of knowledge. The data will be collected through the Job Crafting Questionnaire, the Utrecht Work Engagement Scale, and the Psychological Well-Being Scale. To infer the findings, correlation analysis will be used to study the relationships among variables, and a three-way ANOVA will be used to draw comparisons.
Keywords: job crafting, work engagement, well-being, freelancers, start-ups
Procedia PDF Downloads 105
14975 Enhancement of Transaction's Authentication for the Europay, MasterCard, and Visa Contactless Card Payments
Authors: Ossama Al-Maliki
Abstract:
Europay, MasterCard, and Visa (EMV) is one of the most popular payment protocols in the world. The EMV protocol supports chip-and-PIN transactions, chip-and-signature transactions, and contactless transactions. The protocol suffers from tens of millions of pounds in losses per year due to fraudulent payments, caused by several reported vulnerabilities in the protocols used for such payments that allow skimming, replay, cloning, mole point-of-sale (POS), relay, and other attacks. In this paper, we focus on the EMV contactless specification and propose two solutions that add a localization factor to enhance the payment authentication of such transactions, designed to prevent relay, cloning, and mole-POS attacks. Our proposed solution is a back-end localization scheme that helps the issuer bank compare the location of the genuine cardholder with that of the POS being used. The scheme uses 'something you have', namely the cardholder smartphone (CSP), to provide the location of the cardholder at the time of the transaction without impacting the contactless payment time or protocol. The issuer bank obtains the CSP location using tried and tested localization techniques, independently of the cardholder. Neither of our proposed solutions requires infrastructure changes, and both use existing EMV/SP protocol messages to communicate the scheme's information.
Keywords: NFC, RFID, contactless card, authentication, location, EMV
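A back-end proximity check of the kind described could look like the following sketch. The haversine distance is a standard great-circle formula; the `authorize` helper and its 0.5 km threshold are illustrative assumptions, not part of the EMV specification or the paper's protocol messages.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def authorize(csp_location, pos_location, max_km=0.5):
    """Back-end check: approve the transaction only if the cardholder's
    smartphone (CSP) location is within max_km of the POS location,
    defeating a relay attack mounted from a distant terminal."""
    return haversine_km(*csp_location, *pos_location) <= max_km

# Cardholder next to the POS vs. a relayed transaction from another city
ok_nearby = authorize((51.5074, -0.1278), (51.5075, -0.1279))
relay_far = authorize((51.5074, -0.1278), (48.8566, 2.3522))
```

Because the check runs at the issuer bank after the tap, it adds no latency to the contactless exchange itself, matching the design goal stated above.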
Procedia PDF Downloads 242
14974 Analysis of Inventory Control, Lot Size and Reorder Point for Engro Polymers and Chemicals
Authors: Ali Akber Jaffri, Asad Naseem, Javeria Khan
Abstract:
The purpose of this study is to determine the safety stock, maximum inventory level, reorder point, and reorder quantity by rearranging lot sizes for supplier and customer in the MRO (maintenance, repair, and operations) warehouse of Engro Polymers & Chemicals. To achieve this aim, physical analysis and Excel were used to process the customer and supplier data provided by the company. Initially, we rearranged the current lot sizes and MOUs (measure of units) in the SAP software. Because of the change in lot sizes, the new quantities for safety stock, maximum inventory, reorder point and reorder quantity had to be determined according to the company's demand. The proposed system saves cost by reducing the time of receiving from the vendor and of issuance to the customer, eases material handling in the MRO warehouse, and reduces human effort. The information requirements identified in this study can be utilized in calculating the Economic Order Quantity.
Keywords: carrying cost, economic order quantity, fast moving, lead time, lot size, MRO, maximum inventory, ordering cost, physical inspection, reorder point
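The quantities the study determines are related through the classical inventory-control formulas. A minimal sketch follows, using the textbook EOQ and reorder-point expressions with invented example figures, not Engro's actual data:

```python
import math

def eoq(annual_demand, ordering_cost, holding_cost):
    """Economic Order Quantity: Q* = sqrt(2 * D * S / H), where D is
    annual demand, S the cost per order, H the holding cost per unit-year."""
    return math.sqrt(2 * annual_demand * ordering_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock):
    """Reorder when on-hand stock falls to lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

# Illustrative figures: 1200 units/year demand, 50 per order, 6 per unit-year holding
q_star = eoq(annual_demand=1200, ordering_cost=50, holding_cost=6)
rop = reorder_point(daily_demand=4, lead_time_days=10, safety_stock=20)
```

Rearranging lot sizes, as done in the study, changes the inputs to these formulas, which is why the safety stock, maximum inventory and reorder point all had to be recomputed.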
Procedia PDF Downloads 239
14973 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality
Authors: Peregrine James Dalziel, Philip Vu Tran
Abstract:
Objective / Introduction: Demand for radiology services from emergency departments (ED) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queuing theory indicates that wide variability in process time, combined with the random nature of request arrivals, increases the probability of significant queues. This can lead to delays in the time-to-availability of radiology reports (TTA-RR) and potentially impaired ED patient flow. In addition, the greater cognitive workload of higher volumes may reduce productivity and increase errors. We sought to quantify the potential ED flow improvements obtainable from additional radiology providers serving three public hospitals in Melbourne, Australia, and to assess the potential productivity gains, quality improvement and cost-effectiveness of the increased labor input. Methods & Materials: The Western Health Medical Imaging Department moved from single-resident coverage on weekend days (8:30 am-10:30 pm) to two-resident coverage for a limited period (1 pm-6 pm) on both weekend days. The TTA-RR for weekend CT scans was calculated from the PACS database for the 8-month period symmetrically around the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two 4-month periods. Daily and hourly scan volumes at the time of each CT scan were calculated to assess the impact of varying department workload. To assess any improvement in report quality and errors, a random sample of 200 studies was assessed to compare the average number of clinically significant over-read addendums to reports between the two periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy.
Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R2 = 0.29). Increasing daily and hourly volume was associated with increased TTA-RR: 1.5 min (p<0.01) and 4.8 min (p<0.01), respectively, per additional scan ordered within each time frame. Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p<0.01), with an additional 23.6-minute improvement when two residents were on site concomitantly (p<0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents increased decision-making time available to ED physicians and a potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs 7% in the single-resident period, but this difference was not statistically significant (p=0.7). The marginal cost was less than the anticipated economic benefit, assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. The increased labor utilisation is cost-effective compared with the potential improved productivity for ED cases requiring CT imaging.
Keywords: workflow, quality, administration, CT, staffing
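The queuing-theory point made in the introduction, that wide variability of process time increases queues, can be made concrete with the Pollaczek-Khinchine formula for an M/G/1 queue. This is a standard result used here purely for illustration; the arrival rate, mean and variance figures below are invented, not the hospitals' data.

```python
def mg1_mean_wait(arrival_rate, mean_service, service_variance):
    """Mean time spent waiting in queue for an M/G/1 queue, by the
    Pollaczek-Khinchine formula:
        Wq = lambda * (sigma^2 + E[S]^2) / (2 * (1 - rho)),  rho = lambda * E[S].
    For a fixed mean service time, a larger service-time variance
    (sigma^2) strictly increases the mean wait."""
    rho = arrival_rate * mean_service
    if rho >= 1:
        raise ValueError("queue is unstable (utilisation >= 1)")
    return arrival_rate * (service_variance + mean_service ** 2) / (2 * (1 - rho))

# Same request rate and the same 20-minute mean reporting time; only the
# variability of the reporting time differs between the two scenarios.
low_var = mg1_mean_wait(arrival_rate=1 / 25, mean_service=20, service_variance=25)
high_var = mg1_mean_wait(arrival_rate=1 / 25, mean_service=20, service_variance=400)
```

With identical means and utilisation, the high-variance scenario roughly doubles the expected queueing delay, which is exactly why complex, variable-duration reports drive TTA-RR queues.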
Procedia PDF Downloads 112
14972 Non-Parametric Changepoint Approximation for Road Devices
Authors: Loïc Warscotte, Jehan Boreux
Abstract:
The scientific literature on changepoint detection is vast. Many methods are available today to detect abrupt changes or slight drift in a signal, based for example on CUSUM or EWMA charts. However, these methods rely on strong assumptions, such as stationarity of the underlying stochastic process, or independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals or the nature of the changepoint. Despite its accurate mathematical description, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection if the characteristics of the process are completely unknown. In this paper, we address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes; both abrupt changes and drift are tested. Finally, the relaxed methodology is tested on real signals coming from an HS-WIM device in Belgium, collected over several months.
Keywords: changepoint, weigh-in-motion, process, non-parametric
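For contrast with the non-parametric approach, a classical CUSUM chart of the kind cited above fits in a few lines. This is the textbook one-sided CUSUM, not the paper's bounded approximation; the threshold, slack and data values are illustrative.

```python
def cusum_detect(signal, target_mean, threshold, slack):
    """One-sided (upward) CUSUM chart: accumulate excursions above
    target_mean + slack and flag the first index at which the
    cumulative sum crosses the threshold."""
    s = 0.0
    for i, x in enumerate(signal):
        s = max(0.0, s + (x - target_mean - slack))
        if s > threshold:
            return i          # index at which the change is declared
    return None               # no changepoint detected

# The mean shifts upward at index 4; the chart flags it shortly after.
data = [0.1, -0.2, 0.0, 0.1, 2.0, 1.0, 2.0, 2.0]
alarm_index = cusum_detect(data, target_mean=0.0, threshold=3.0, slack=0.5)
```

The chart's reliance on a known target mean and a noise model is precisely the kind of assumption the locally stationary framework removes, at the cost of the complexity the paper sets out to tame.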
Procedia PDF Downloads 78
14971 Anaerobic Digestion of Coffee Wastewater from a Fast Inoculum Adaptation Stage: Replacement of Complex Substrate
Authors: D. Lepe-Cervantes, E. Leon-Becerril, J. Gomez-Romero, O. Garcia-Depraect, A. Lopez-Lopez
Abstract:
In this study, raw coffee wastewater (CWW) was used as a complex substrate for anaerobic digestion. An inoculum adaptation stage, microbial diversity analysis and biomethane potential (BMP) tests were performed. A fast inoculum adaptation stage was achieved by the gradual replacement of vinasse with CWW in an anaerobic sequential batch reactor (AnSBR) operated under mesophilic conditions. Illumina MiSeq sequencing was used to analyze the microbial diversity, while BMP tests using the CWW-adapted inoculum were carried out at different inoculum-to-substrate (I/S) ratios (2:1, 3:1 and 4:1, on a VS basis). Results show that the adaptability percentage increased gradually until it reached the highest theoretical value in a short time of 10 d, with a methane yield of 359.10 NmL CH4/g COD-removed; Methanobacterium beijingense was the most abundant microorganism (75%), and the greatest specific methane production was achieved at an I/S ratio of 4:1, whereas the lowest was obtained at 2:1, with BMP values of 320 NmL CH4/g VS and 151 NmL CH4/g VS, respectively. In conclusion, gradual replacement of the substrate was a feasible method to adapt the inoculum in a short time, even with complex raw substrates, whereas in the BMP tests the specific methane production was proportional to the initial amount of inoculum.
Keywords: anaerobic digestion, biomethane potential test, coffee wastewater, fast inoculum adaptation
Procedia PDF Downloads 381
14970 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The auto-regressive integrated moving average (ARIMA) model was used to simulate these series and forecast future values. For the analysis, annual extreme streamflow data from the Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958-2005 were used. A visual inspection of the time plot shows a slight increasing trend; the series is therefore not stationary. The non-stationarity observed in the auto-correlation function (ACF) and partial auto-correlation function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) prior to development of the ARIMA model. Interestingly, the ARIMA(4,1,1) model was found to be the most suitable for simulating annual extreme streamflow for the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and can assist decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) packages were used to determine the best model for this series.
Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river
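The d=1 step used above is plain first differencing. The following minimal sketch shows that operation in pure Python (fitting the full ARIMA(4,1,1) model would be done in SAS/SPSS, as in the study, or in a library such as statsmodels; the example numbers are invented):

```python
def difference(series, order=1):
    """Apply first-order differencing `order` times; this is the 'd'
    step of ARIMA(p, d, q), used to remove trend-induced
    non-stationarity before fitting the AR and MA terms."""
    for _ in range(order):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

# A series with a linear upward trend becomes constant after one pass,
# mirroring the d=1 choice made for the annual extreme streamflow series.
trended = [100, 110, 120, 130, 140]
diffed = difference(trended, order=1)
```

After the model is fit on the differenced series, forecasts are cumulatively summed back to the original scale.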
Procedia PDF Downloads 148
14969 Geovisualization of Human Mobility Patterns in Los Angeles Using Twitter Data
Authors: Linna Li
Abstract:
The capability to move around places is doubtless very important for individuals to maintain good health and social functions. People’s activities in space and time have long been a research topic in behavioral and socio-economic studies, particularly focusing on the highly dynamic urban environment. By analyzing groups of people who share similar activity patterns, many socio-economic and socio-demographic problems and their relationships with individual behavior preferences can be revealed. Los Angeles, known for its large population, ethnic diversity, cultural mixing, and entertainment industry, faces great transportation challenges such as traffic congestion, parking difficulties, and long commuting. Understanding people’s travel behavior and movement patterns in this metropolis sheds light on potential solutions to complex problems regarding urban mobility. This project visualizes people’s trajectories in Greater Los Angeles (L.A.) Area over a period of two months using Twitter data. A Python script was used to collect georeferenced tweets within the Greater L.A. Area including Ventura, San Bernardino, Riverside, Los Angeles, and Orange counties. Information associated with tweets includes text, time, location, and user ID. Information associated with users includes name, the number of followers, etc. Both aggregated and individual activity patterns are demonstrated using various geovisualization techniques. Locations of individual Twitter users were aggregated to create a surface of activity hot spots at different time instants using kernel density estimation, which shows the dynamic flow of people’s movement throughout the metropolis in a twenty-four-hour cycle. In the 3D geovisualization interface, the z-axis indicates time that covers 24 hours, and the x-y plane shows the geographic space of the city. Any two points on the z axis can be selected for displaying activity density surface within a particular time period. 
In addition, daily trajectories of Twitter users were created using space-time paths that show the continuous movement of individuals throughout the day. When a personal trajectory is overlaid on top of ancillary layers including land use and road networks in 3D visualization, the vivid representation of a realistic view of the urban environment boosts situational awareness of the map reader. A comparison of the same individual's paths on different days shows some regular patterns on weekdays for some Twitter users, but for some other users, their daily trajectories are more irregular and sporadic. This research makes contributions in two major areas: geovisualization of spatial footprints to understand travel behavior using the big data approach and dynamic representation of activity space in the Greater Los Angeles Area. Unlike traditional travel surveys, social media (e.g., Twitter) provides an inexpensive way of data collection on spatio-temporal footprints. The visualization techniques used in this project are also valuable for analyzing other spatio-temporal data in the exploratory stage, thus leading to informed decisions about generating and testing hypotheses for further investigation. The next step of this research is to separate users into different groups based on gender/ethnic origin and compare their daily trajectory patterns.
Keywords: geovisualization, human mobility pattern, Los Angeles, social media
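The kernel density estimation used to build the activity hot-spot surfaces can be sketched as follows. This is a generic 2-D Gaussian KDE in planar coordinates (in practice projected coordinates and a carefully tuned bandwidth would be used); the sample points are invented, not actual tweet locations.

```python
import math

def kde_at(point, samples, bandwidth):
    """2-D Gaussian kernel density estimate at `point`, given a list of
    sample locations (x, y); evaluating this on a grid of points yields
    the smooth hot-spot surface used for the activity maps."""
    px, py = point
    norm = 1.0 / (2 * math.pi * bandwidth ** 2 * len(samples))
    total = 0.0
    for sx, sy in samples:
        d2 = (px - sx) ** 2 + (py - sy) ** 2
        total += math.exp(-d2 / (2 * bandwidth ** 2))
    return norm * total

# Density is high near a cluster of georeferenced points and near zero far away.
tweets = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)]
near_cluster = kde_at((0.05, 0.0), tweets, bandwidth=0.5)
far_away = kde_at((10.0, 10.0), tweets, bandwidth=0.5)
```

Restricting `samples` to the tweets within a chosen time window, as described above, produces one density surface per time slice of the twenty-four-hour cycle.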
Procedia PDF Downloads 119
14968 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation
Authors: Mahmut Yildirim
Abstract:
This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM was developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, owing to its strong classification ability, Bi-LSTM is considered as an alternative to the maximum likelihood (ML) algorithm used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with that of the LSTM network-aided DL-based OFDM-AIM (DeepAIM) and the classic OFDM-AIM with ML-based signal detection, using bit error rate (BER) performance and computation time as criteria. Simulation results show that Bi-DeepAIM achieves better BER performance than DeepAIM and lower computation time in signal detection than ML-AIM.
Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection
Procedia PDF Downloads 72
14967 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The motivating question of this work is how many devote themselves to discovery in a world of science where much has been discerned and revealed, yet much remains unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between a and b = a+3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. According to the given key, the string is divided into several groups of substrings, each of length k characters. The next step encodes each substring in the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, k is incremented by 1 when moving to the next substring in the list, and when the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than the other methods in terms of execution time and storage space.
Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison
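A minimal Python sketch of the ciphering and deciphering steps described above can be assembled from the abstract. Details the abstract leaves open are assumptions here: a lowercase alphabet, non-letter characters passing through unchanged, and no padding of the final short group.

```python
import random
import string

ALPHABET = string.ascii_lowercase

def _shift(s, n):
    # Caesar shift over lowercase letters; other characters pass through
    return "".join(
        ALPHABET[(ALPHABET.index(c) + n) % 26] if c in ALPHABET else c
        for c in s
    )

def latin_djokovic_encrypt(text, a, k=None):
    # k is drawn once from [a, a+3] and kept constant as the group length
    b = a + 3
    if k is None:
        k = random.randint(a, b)
    groups = [text[i:i + k] for i in range(0, len(text), k)]
    shift, out = k, []
    for g in groups:
        out.append(_shift(g, shift))
        shift += 1            # next group shifts one further
        if shift > b + 1:     # wrap back to the initial key
            shift = k
    return "".join(out), k

def latin_djokovic_decrypt(cipher, a, k0):
    # mirror the encryption schedule with negated shifts
    b = a + 3
    groups = [cipher[i:i + k0] for i in range(0, len(cipher), k0)]
    shift, out = k0, []
    for g in groups:
        out.append(_shift(g, -shift))
        shift += 1
        if shift > b + 1:
            shift = k0
    return "".join(out)
```

With a fixed k the round trip is deterministic, which makes the polyalphabetic schedule easy to verify.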
Procedia PDF Downloads 103
14966 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data
Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello
Abstract:
Although inventories are considered stocks of money sitting on shelves, they are needed to secure constant and continuous production. Therefore, companies need control over the amount of inventory in order to find the balance between excess and shortage. Classifying items according to criteria such as price, usage rate and lead time before arrival allows a company to concentrate its inventory investment according to a ranking or priority of items. This makes the decision-making process for inventory management easier and more justifiable. The purpose of this paper is to present a new approach for the classification of new items based on already existing criteria. This approach, called Logical Analysis of Data (LAD), is used here to assist the process of ABC item classification based on multiple criteria. LAD is a data mining technique based on Boolean theory that is used for pattern recognition. It has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained using the well-known AHP and ANN techniques. The results show that LAD achieved very good classification accuracy.
Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification
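The LAD patterns themselves need a trained Boolean model and are beyond an abstract-length sketch, but the multi-criteria ABC ranking they feed can be illustrated with a minimal weighted-score baseline (an AHP-style stand-in, not LAD). The criteria order, weights, and class cut-off fractions below are illustrative assumptions.

```python
def abc_classify(items, weights, a_frac=0.2, b_frac=0.5):
    # items: {name: (price, usage_rate, lead_time)} on comparable scales
    # weights: relative importance of each criterion, summing to 1
    scores = {name: sum(w * c for w, c in zip(weights, crit))
              for name, crit in items.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    # top a_frac of items -> class A, next up to b_frac -> B, rest -> C
    return {name: ("A" if (i + 1) / n <= a_frac
                   else "B" if (i + 1) / n <= b_frac else "C")
            for i, name in enumerate(ranked)}
```

LAD would replace the weighted score with learned Boolean patterns separating the A/B/C classes; the ranking-to-class step stays the same.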
Procedia PDF Downloads 881
14965 Application of GIS-Based Construction Engineering: An Electronic Document Management System
Authors: Mansour N. Jadid
Abstract:
This paper describes the implementation of a GIS to provide decision support for successfully monitoring the movement and storage of materials, hence ensuring that finished products travel from the point of origin to the destination construction site through the supply-chain management (SCM) system. This system ensures the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination to reduce construction costs, minimize time, and enhance productivity. These systems are essential to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, which maps the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, this study illustrates the use of a GIS-based SCM as a collaborative tool in innovative methods for implementing Web mapping services, as well as aspects of their integration, by generating an interactive GIS platform for the construction industry.
Keywords: construction, coordinate, engineering, GIS, management, map
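The shortest-path computation at the core of such a GIS-based SCM is typically a graph search over the road network. A minimal Dijkstra sketch over a toy network is shown below; the node names and distances are illustrative assumptions, not data from the study.

```python
import heapq

def shortest_path(graph, origin, dest):
    # Dijkstra over a weighted adjacency dict {node: {neighbor: distance}}
    dist = {origin: 0.0}
    prev = {}
    pq = [(0.0, origin)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dest:
            break
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # walk predecessors back from the destination to recover the route
    path, node = [dest], dest
    while node != origin:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dest]
```

In a real deployment the graph would be built from the GIS road-network layer, with edge weights as travel distance or time.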
Procedia PDF Downloads 303
14964 Experimental Study of the Fiber Dispersion of Pulp Liquid Flow in Channels with Application to Papermaking
Authors: Masaru Sumida
Abstract:
This study explored the feasibility of improving the hydraulic headbox of papermaking machines by studying the flow of wood-pulp suspensions behind a flat plate inserted in parallel and convergent channels. Pulp fiber concentrations of the wake downstream of the plate were investigated by flow visualization and optical measurements. Changes in the time-averaged and fluctuation of the fiber concentration along the flow direction were examined. In addition, the control of the flow characteristics in the two channels was investigated. The behaviors of the pulp fibers and the wake flow were found to be strongly related to the flow states in the upstream passages partitioned by the plate. The distribution of the fiber concentration was complex because of the formation of a thin water layer on the plate and the generation of Karman’s vortices at the trailing edge of the plate. Compared with the flow in the parallel channel, fluctuations in the fiber concentration decreased in the convergent channel. However, at low flow velocities, the convergent channel has a weak effect on equilibrating the time-averaged fiber concentration. This shows that a rectangular trailing edge cannot adequately disperse pulp suspensions; thus, at low flow velocities, a convergent channel is ineffective in ensuring uniform fiber concentration.
Keywords: fiber dispersion, headbox, pulp liquid, wake flow
Procedia PDF Downloads 387
14963 Leça da Palmeira Revisited: Sixty-Seven Years of Recurring Work by Álvaro Siza
Authors: Eduardo Jorge Cabral dos Santos Fernandes
Abstract:
Over the last sixty-seven years, Portuguese architect Álvaro Siza Vieira designed several interventions for the Leça da Palmeira waterfront. With this paper, we aim to analyze the history of this set of projects in a chronological approach, seeking to understand the connections that can be established between them. Born in Matosinhos, a fishing and industrial village located near Porto, Álvaro Siza built a remarkable relationship with Leça da Palmeira (a neighboring village located to the north) from a personal and professional point of view throughout his life: it was there that he got married (in the small chapel located next to the Boa Nova lighthouse) and it was there that he designed his first works of great impact, the Boa Nova Tea House and the Ocean Swimming Pool, today classified as national monuments. These two works were the subject of several projects spaced over time, including recent restoration interventions designed by the same author. However, the marks of Siza's intervention in this territory are not limited to these two cases; there were other projects designed for this territory, which we also intend to analyze: the monument to the poet António Nobre (1967-80), the unbuilt project for a restaurant next to Piscina das Marés (presented in 1966 and redesigned in 1993), the reorganization of the Avenida da Liberdade (with a first project, not carried out, in 1965-74, and a reformulation carried out between 1998 and 2006) and, finally, the project for the new APDL facilities, which completes Avenida da Liberdade to the south (1995). Altogether, these interventions are so striking in this territory, from a landscape, formal, functional, and tectonic point of view, that it is difficult to imagine this waterfront without their presence. In all cases, the relationship with the site explains many of the design options. 
Time after time, the conditions of the pre-existing territory (also affected by the previous interventions of Siza) were considered, so each project created a new circumstance, conditioning the following interventions. This paper is part of a more comprehensive project, which aims to analyze the work of Álvaro Siza in its fundamental relationship with the site.
Keywords: Álvaro Siza, contextualism, Leça da Palmeira, landscape
Procedia PDF Downloads 32
14962 Spectrophotometric Determination of Photohydroxylated Products of Humic Acid in the Presence of Salicylate Probe
Authors: Julide Hizal Yucesoy, Batuhan Yardimci, Aysem Arda, Resat Apak
Abstract:
Humic substances produce reactive oxygen species such as hydroxyl, phenoxy and superoxide radicals by oxidizing over a wide pH and reduction potential range. Hydroxyl radicals, produced in the presence of reducing agents such as antioxidants and/or peroxides, attack the salicylate probe and form 2,3-dihydroxybenzoate, 2,4-dihydroxybenzoate and 2,5-dihydroxybenzoate species. These species are quantitatively determined using the HPLC method. Humic substances undergo photodegradation by UV radiation and, as a result of their antioxidant properties, produce hydroxyl radicals. In the presence of the salicylate probe, these hydroxyl radicals react with salicylate molecules to form hydroxylated products (dihydroxybenzoate isomers). In this study, humic acid was photodegraded in a photoreactor at 254 nm (400 W), and the hydroxyl radicals formed were trapped by the salicylate probe. The total concentration of hydroxylated salicylate species was measured using the spectrophotometric CUPRAC method. In addition, using the results of time-dependent experiments, the kinetics of photohydroxylation were determined at different pH values. This method has been applied for the first time to measure the concentration of hydroxylated products, and it yields results more easily than the HPLC method.
Keywords: CUPRAC method, humic acid, photohydroxylation, salicylate probe
Procedia PDF Downloads 206
14961 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario
Authors: Adel Gurel, Ozge Ceylin Yildirim
Abstract:
Nowadays, design and architecture are being affected and transformed by rapid advancements in technology, economics, politics, society and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. Integration of design into the computational environment has revolutionized architecture and opened new perspectives. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analyzing the integration between technology and the historical architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period arising from the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture via technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation via technology are supported with a detailed literature review and consolidated with an examination of the focal points of 20th-century architecture under the titles parametric design, genetic architecture, simulation, and biomimicry. It is concluded from the historical research between past and present that developments in architecture cannot keep up with advancements in technology, that recent developments in technology overshadow architecture, and that technology even decides the direction of architecture.
As a result, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.
Keywords: computer technologies, future architecture, scientific developments, transformation
Procedia PDF Downloads 192
14960 Artificial Neurons Based on Memristors for Spiking Neural Networks
Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi
Abstract:
Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power, and outstanding nonlinearity, memristors have attracted growing attention for achieving SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits, used to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spiking-time-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method to construct hardware neuromorphic computing systems.
Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spiking-time-dependent-plasticity
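The leaky integrate-and-fire dynamics that the TS memristor emulates can be sketched in a few lines: the membrane potential leaks each step, integrates the input, and fires and resets on crossing a threshold. The discrete-time form and parameter values below are illustrative assumptions, not device measurements.

```python
def lif_neuron(input_current, v_th=1.0, v_reset=0.0, leak=0.9, gain=0.5):
    # leaky integrate-and-fire: the potential decays by `leak` each step,
    # accumulates `gain * input`, and emits a spike (then resets) at v_th
    v = v_reset
    spikes = []
    for i in input_current:
        v = leak * v + gain * i
        if v >= v_th:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes
```

A constant input produces a regular spike train whose rate rises with the input amplitude, the basic rate-coding behavior expected of an LIF neuron.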
Procedia PDF Downloads 134
14959 Insight into Localized Fertilizer Placement in Major Cereal Crops
Authors: Solomon Yokamo, Dianjun Lu, Xiaoqin Chen, Huoyan Wang
Abstract:
The current ‘high input-high output’ nutrient management model, based on homogeneous spreading over the entire soil surface, remains a key challenge in China’s farming systems, leading to low fertilizer use efficiency and environmental pollution. Localized placement of fertilizer (LPF) in crop root zones has been proposed as a viable approach to boost crop production while reducing environmental pollution. To assess the potential benefits of LPF for three major crops (wheat, rice, and maize), a comprehensive meta-analysis was conducted, encompassing 85 field studies published from 2002 to 2023. We further validated the practicability and feasibility of one-time root zone N management based on LPF for the three field crops. The meta-analysis revealed that LPF significantly increased the yields of the selected crops (13.62%) and nitrogen recovery efficiency (REN) (33.09%) while reducing cumulative nitrous oxide (N₂O) emission (17.37%) and ammonia (NH₃) volatilization (60.14%) compared to conventional surface application (CSA). Higher grain yield and REN were achieved with an optimal fertilization depth (FD) of 5-15 cm, moderate N rates, combined NPK application, one-time deep fertilization, and coarse-textured and slightly acidic soils. Field validation experiments showed that localized one-time root zone N management without topdressing increased maize (6.2%), rice (34.6%), and wheat (2.9%) yields while saving N fertilizer (3%), and also increased the net economic benefits (23.71%) compared to CSA. A soil incubation study further proved the potential of LPF to enhance the retention and availability of mineral N in the root zone over an extended period. Thus, LPF could be an important fertilizer management strategy and should be extended to other less-developed and developing regions to achieve the triple benefit of food security, environmental quality, and economic gains.
Keywords: grain yield, LPF, NH₃ volatilization, N₂O emission, N recovery efficiency
Procedia PDF Downloads 20
14958 Eradicating Micronutrient Deficiency through Biofortification
Authors: Ihtasham Hamza
Abstract:
In the contemporary world, where the West is afflicted by diseases of excess nutrition, much of the rest of the globe suffers at the hands of hunger. A troubling constituent of hunger is micronutrient deficiency, also called hidden hunger. Heavy dependence on calorie-rich diets and low diet diversification are responsible for high malnutrition rates, especially in African and Asian countries. But the dilemma is not without solutions. Since a substantial cause is sole dependence on staples for food, biofortification has emerged as a novel tool to confront the widely distributed threat of hidden hunger. Biofortification promises better nutritional accessibility to communities, overcoming various difficulties and reaching the doorstep. Biofortified crops offer a rural-based intervention that first reaches these more remote populations, which comprise a majority of the malnourished in many countries, and then penetrates to urban populations as surpluses are marketed. Initial investments in agricultural research at a central location can generate high recurrent benefits at low cost as adapted biofortified cultivars become widely available across countries over time. Supplementation, by contrast, is comparatively expensive and requires continued financing over time, which may be imperilled by fluctuating political interest.
Keywords: biofortified crops, hunger, malnutrition, agricultural practices
Procedia PDF Downloads 288
14957 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria
Authors: Ibrahim Rabiu Darazo
Abstract:
This research examined the impact of e-banking on the operational efficiency of banks in Nigeria, a case study of selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc). It is a quantitative study using both primary and secondary sources of data. Questionnaires were used to obtain accurate data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using chi-square, whereas the secondary data were obtained from relevant textbooks, journals and web sites. It is clear from the findings that the use of e-banking has improved the efficiency of these banks in providing efficient services to customers electronically: internet banking, telephone banking and ATMs reduce the time taken to serve customers; e-banking allows new customers to open an account online; and customers have access to their accounts at all times (24/7). E-banking also provides access to customer information from the database, and the cost of cheques and postage is eliminated. The recommendations at the end of the research include: the banks should update their electronic gadgets; e-fraud (internal and external) should be controlled; banks should employ qualified manpower; and biometric ATMs should be introduced to reduce fraud with ATM cards, as is done in other countries such as the USA.
Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs
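The chi-square analysis used on the questionnaire data can be sketched for a generic contingency table of response counts: expected counts come from the row and column margins, and the statistic sums the squared deviations. The example table in the test is illustrative, not the study's data.

```python
def chi_square(table):
    # table: rows of observed counts, e.g. response category x bank
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # margin-based expectation
            stat += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof
```

The statistic is then compared against the critical value for the computed degrees of freedom (3.84 at the 5% level for one degree of freedom).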
Procedia PDF Downloads 333
14956 Identification System for Grading Banana in Food Processing Industry
Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan
Abstract:
In the food industry, high-quality production is required within a limited time to meet demand. In this research work, we developed a model that can replace the human operator, whose output in production is low and whose decisions on whether a banana is defective or healthy are slow and subject to individual differences. This model performs the vision tasks of human operators in deciding whether a banana is defective or healthy for food production. The work is divided into two phases. The first is the image processing phase, in which several techniques such as colour conversion, edge detection, thresholding and morphological operations were employed to extract features for training and testing the network in the second phase. The features extracted in the first phase were used in the second phase, the classification phase, where a multilayer perceptron trained with the backpropagation algorithm was employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained within a limited time, which makes the system suitable for use in the food industry.
Keywords: banana, food processing, identification system, neural network
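A minimal stand-in for the first (image processing) phase might threshold a grayscale image and grade on the fraction of dark (defect) pixels; the thresholds and cut-off values below are illustrative assumptions, not the paper's trained classifier, which uses a multilayer perceptron on richer features.

```python
def defect_ratio(gray, dark_threshold=60):
    # gray: 2D list of 0-255 intensities; defect spots read as dark pixels
    dark = sum(1 for row in gray for p in row if p < dark_threshold)
    total = sum(len(row) for row in gray)
    return dark / total

def grade_banana(gray, max_defect=0.05):
    # grade by the share of the surface covered by dark spots
    return "defective" if defect_ratio(gray) > max_defect else "healthy"
```

In the paper's pipeline this hand-set cut-off is replaced by the MLP's learned decision boundary over the extracted features.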
Procedia PDF Downloads 471
14955 Plackett-Burman Design to Evaluate the Influence of Operating Parameters on Anaerobic Orthophosphate Release from Enhanced Biological Phosphorus Removal Sludge
Authors: Reza Salehi, Peter L. Dold, Yves Comeau
Abstract:
The aim of the present study was to investigate the effect of a total of 6 operating parameters, namely pH (X1), temperature (X2), stirring speed (X3), chemical oxygen demand (COD) (X4), volatile suspended solids (VSS) (X5) and time (X6), on anaerobic orthophosphate release from enhanced biological phosphorus removal (EBPR) sludge. An 8-run Plackett-Burman design was applied, and the statistical analysis of the experimental data was performed using the Minitab 16.2.4 software package. The analysis of variance (ANOVA) results revealed that temperature, COD, VSS and time had a significant effect, with p-values of less than 0.05, whereas pH and stirring speed were identified as non-significant parameters that nevertheless influenced orthophosphate release from the EBPR sludge. The mathematical expression obtained by the first-order multiple linear regression model between orthophosphate release from the EBPR sludge (Y) and the operating parameters (X1-X6) was Y = 18.59 + 1.16X1 - 3.11X2 - 0.81X3 + 3.79X4 + 9.89X5 + 4.01X6. The model p-value and coefficient of determination (R²) were 0.026 and 99.87%, respectively, which indicates that the model is significant and that the predicted values of orthophosphate release from the EBPR sludge correlate excellently with the observed values.
Keywords: anaerobic, operating parameters, orthophosphate release, Plackett-Burman design
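The fitted first-order model reported above can be evaluated directly. The coefficients are taken from the abstract; interpreting the X values as coded factor levels (typically -1/+1 in a Plackett-Burman design) is an assumption.

```python
# coefficients of Y = 18.59 + 1.16*X1 - 3.11*X2 - 0.81*X3
#                   + 3.79*X4 + 9.89*X5 + 4.01*X6 (from the fitted model)
INTERCEPT = 18.59
COEFFS = [1.16, -3.11, -0.81, 3.79, 9.89, 4.01]

def predicted_release(x):
    # x: the six factor levels (X1..X6), assumed coded
    return INTERCEPT + sum(c * xi for c, xi in zip(COEFFS, x))
```

The signs reproduce the ANOVA reading: X4 (COD), X5 (VSS) and X6 (time) raise predicted release, while X2 (temperature) lowers it.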
Procedia PDF Downloads 279
14954 Response Surface Methodology to Obtain Disopyramide Phosphate Loaded Controlled Release Ethyl Cellulose Microspheres
Authors: Krutika K. Sawant, Anil Solanki
Abstract:
The present study deals with the preparation and optimization of ethyl cellulose microspheres loaded with disopyramide phosphate using the solvent evaporation technique. A central composite design, consisting of a two-level full factorial design superimposed on a star design, was employed for optimizing the preparation of the microspheres. The drug:polymer ratio (X1) and stirrer speed (X2) were chosen as the independent variables. The cumulative release of the drug at different times (2, 6, 10, 14, and 18 hr) was selected as the dependent variable. An optimum polynomial equation was generated for the prediction of the response variable at 10 hr. Based on the results of multiple linear regression analysis and F statistics, it was concluded that sustained action can be obtained when X1 and X2 are kept at high levels. The X1X2 interaction was found to be statistically significant. The drug release pattern fitted the Higuchi model well. The data of a selected batch were subjected to an optimization study using a Box-Behnken design, and an optimal formulation was fabricated. Good agreement was observed between the predicted and observed dissolution profiles of the optimal formulation.
Keywords: disopyramide phosphate, ethyl cellulose, microspheres, controlled release, Box-Behnken design, factorial design
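The Higuchi model that the release data fitted states that cumulative release grows with the square root of time, Q = k·√t. A minimal sketch fits k by least squares through the origin; the data in the test are synthetic and illustrative.

```python
import math

def higuchi_release(k, t):
    # Higuchi model: cumulative release Q = k * sqrt(t)
    return k * math.sqrt(t)

def fit_higuchi_k(times, releases):
    # least-squares slope through the origin against sqrt(t):
    # k = sum(sqrt(t_i) * Q_i) / sum(t_i), since (sqrt t)^2 = t
    num = sum(math.sqrt(t) * q for t, q in zip(times, releases))
    den = sum(times)
    return num / den
```

With the fitted k, release at an unobserved time point can be interpolated, which is how the model supports the 10 hr response prediction.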
Procedia PDF Downloads 458
14953 Analysis of Residents’ Travel Characteristics and Policy Improving Strategies
Authors: Zhenzhen Xu, Chunfu Shao, Shengyou Wang, Chunjiao Dong
Abstract:
To improve the satisfaction of residents' travel, this paper analyzes the characteristics and influencing factors of urban residents' travel behavior. First, a Multinomial Logit Model (MNL) is built to analyze the characteristics of residents' travel behavior, reveal the influence of individual attributes, family attributes and travel characteristics on the choice of travel mode, and identify the significant factors; suggestions for policy improvement are then put forward. Finally, Support Vector Machine (SVM) and Multi-Layer Perceptron (MLP) models are introduced to evaluate the policy effect. This paper selects Futian Street in Futian District, Shenzhen City, for investigation and research. The results show that gender, age, education, income, number of cars owned, travel purpose, departure time, journey time, travel distance and trip frequency all have a significant influence on residents' choice of travel mode. Based on the above results, two policy improvement suggestions are put forward, aimed at reducing public transportation and non-motor vehicle travel time, and the policy effect is evaluated. Before the evaluation, the prediction performance of the MNL, SVM and MLP models was assessed; after parameter optimization, the prediction accuracies of the three models were 72.80%, 71.42%, and 76.42%, respectively. The MLP model, with the highest prediction accuracy, was selected to evaluate the effect of policy improvement. The results showed that after implementation of the policy, the proportion of public transportation in plans 1 and 2 increased by 14.04% and 9.86%, respectively, while the proportion of private cars decreased by 3.47% and 2.54%, respectively. The proportion of car trips decreased markedly, while the proportion of public transport trips increased.
It can be concluded that the measures have a positive effect on promoting green travel and improving the satisfaction of urban residents, and can provide a reference for relevant departments in formulating transportation policies.
Keywords: neural network, travel characteristics analysis, transportation choice, travel sharing rate, traffic resource allocation
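The mode-choice probabilities underlying the MNL analysis take the familiar logit form P_i = exp(V_i) / Σ_j exp(V_j), where V_i is the systematic utility of mode i built from the significant factors. A minimal sketch is below; the utility values in the test are illustrative assumptions.

```python
import math

def mnl_probabilities(utilities):
    # multinomial logit: P_i = exp(V_i) / sum_j exp(V_j)
    # subtracting the max keeps exp() numerically stable
    m = max(utilities)
    exps = [math.exp(u - m) for u in utilities]
    s = sum(exps)
    return [e / s for e in exps]
```

A travel-time-reduction policy enters through the utilities: lowering the time term of the public-transport mode raises its V and hence its predicted share, which is what the evaluation compares before and after.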
Procedia PDF Downloads 138
14952 The Relationship between Spanish Economic Variables: Evidence from the Wavelet Techniques
Authors: Concepcion Gonzalez-Concepcion, Maria Candelaria Gil-Fariña, Celina Pestano-Gabino
Abstract:
We analyze six relevant economic and financial variables for the period 2000M1-2015M3 in the context of the Spanish economy: a financial index (IBEX35), a commodity (crude oil price in euros), a foreign exchange rate (EUR/USD), a bond (Spanish 10-year bond), the Spanish national debt and the consumer price index. The goal of this paper is to analyze the main relations between them by computing the wavelet power spectrum and the cross wavelet coherency associated with Morlet wavelets. Using a special toolbox in MATLAB, we focus our interest on the period variable. We decompose the time-frequency effects and improve the interpretation of the results for non-expert users of wavelet theory. The empirical evidence shows certain instability periods and reveals various changes and breaks in the causality relationships for the sample data. These variables were individually analyzed with Daubechies wavelets to visualize high-frequency variance, seasonality, and trend. The results are included in the Proceedings of the 20th International Academic Conference, 2015, International Institute of Social and Economic Sciences (IISES), Madrid.
Keywords: economic and financial variables, Spain, time-frequency domain, wavelet coherency
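The wavelet power spectrum named above can be sketched at a single scale with a complex Morlet mother wavelet; the naive O(n²) transform below is a minimal illustration (the cited work uses a MATLAB toolbox), with w0 = 6 as a conventional choice and normalization constants omitted.

```python
import cmath
import math

def morlet(t, w0=6.0):
    # complex Morlet mother wavelet (normalization constant omitted)
    return cmath.exp(1j * w0 * t) * math.exp(-t * t / 2.0)

def cwt_power(signal, scale, dt=1.0, w0=6.0):
    # naive continuous wavelet transform at one scale;
    # wavelet power at position b is |W(scale, b)|^2
    n = len(signal)
    out = []
    for b in range(n):
        w = sum(signal[k] * morlet((k - b) * dt / scale, w0).conjugate()
                for k in range(n)) * (dt / math.sqrt(scale))
        out.append(abs(w) ** 2)
    return out
```

Power concentrates at scales whose wavelet frequency (w0/scale) matches an oscillation in the series, which is how the monthly series reveal their dominant periods; cross-wavelet coherency extends this by comparing two series' transforms.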
Procedia PDF Downloads 240