Search results for: time domain reflectometry (TDR)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19375

14815 Water Ingress into Underground Mine Voids in the Central Rand Goldfields Area, South Africa-Fluid Induced Seismicity

Authors: Artur Cichowicz

Abstract:

The last active mine in the Central Rand Goldfields area (50 km x 15 km) ceased operations in 2008. This resulted in the closure of the pumping stations, which previously maintained the underground water level in the mining voids. As a direct consequence of the water being allowed to flood the mine voids, seismic activity has increased directly beneath the populated area of Johannesburg. Monitoring of seismicity in the area has been ongoing for over five years using a network of 17 strong ground motion sensors. The objective of the project is to improve strategies for mine closure. The evolution of the seismicity pattern was investigated in detail. Special attention was given to seismic source parameters such as magnitude, scalar seismic moment and static stress drop. Most events are located within historical mine boundaries. The seismicity pattern shows a strong relationship between the presence of the mining void and high levels of seismicity; no seismicity migration patterns were observed outside the areas of old mining. Seven years after the pumping stopped, the evolution of the seismicity has indicated that the area is not yet in equilibrium. The level of seismicity in the area appears not to be decreasing over time, since the number of strong events, with Mw magnitudes above 2, is still as high as it was when monitoring began over five years ago. The average rate of seismic deformation is 1.6x10^13 Nm/year. Constant seismic deformation was not observed over the last five years. The deviation from the average is in the order of 6x10^13 Nm/year, which is a significant deviation. The variation of cumulative seismic moment indicates that a constant deformation rate model is not suitable. Over the most recent five-year period, the total cumulative seismic moment released in the Central Rand Basin was 9.0x10^14 Nm. This is equivalent to one earthquake of magnitude 3.9 and is significantly less than what was experienced during the mining operation. Characterization of seismicity triggered by a rising water level in the area can be achieved through the estimation of source parameters. Static stress drop heavily influences ground motion amplitude, which plays an important role in risk assessments of potential seismic hazards in inhabited areas. The observed static stress drop in this study varied from 0.05 MPa to 10 MPa. It was found that large static stress drops could be associated with both small and large events. The temporal evolution of the inter-event time provides an understanding of the physical mechanisms of earthquake interaction. Changes in the characteristics of the inter-event time are produced when a stress change is applied to a group of faults in the region. Results from this study indicate that the fluid-induced source has a shorter inter-event time in comparison to a random distribution. This behaviour corresponds to a clustering of events, in which events with short recurrence times tend to occur close to each other, forming clusters.
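
For reference, the quoted equivalence between the five-year cumulative moment of 9.0x10^14 Nm and a single magnitude 3.9 event follows from the standard Hanks-Kanamori moment-magnitude relation, Mw = (2/3)(log10 M0 - 9.1) with M0 in Nm. A minimal sketch reproducing that figure (the relation is the standard one, not a formula taken from the paper):

```python
import math

def moment_magnitude(m0_nm: float) -> float:
    """Hanks-Kanamori moment magnitude from scalar seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# Cumulative moment quoted for the most recent five-year period.
cumulative_moment = 9.0e14  # N*m
print(round(moment_magnitude(cumulative_moment), 1))  # ~3.9, matching the abstract
```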

Keywords: inter-event time, fluid induced seismicity, mine closure, spectral parameters of seismic source

Procedia PDF Downloads 285
14814 Impact of Job Crafting on Work Engagement and Well-Being among Indian Working Professionals

Authors: Arjita Jhingran

Abstract:

The pandemic was a turning point for flexible employment. In today’s market, employees prefer companies that provide the autonomy to change their work environment and are flexible. Post-pandemic, employees have become accustomed to modifying, re-designing, and re-aligning their work environment, tasks, and the way they interact with co-workers based on their preferences after working from home for a long time. In this scenario, the concept of job crafting has come to the forefront, and research on the subject has expanded, particularly during COVID-19. Managers who provide opportunities to craft the job are driving enhanced engagement and well-being. The current study will aim to examine the impact of job crafting on work engagement and psychological well-being among 385 working professionals aged 21-39 years (M age = 30 years). The study will also draw comparisons between freelancers and full-time employees, as freelancers have been considered to have more autonomy over their job. A comparison between employees of MNCs and startups will also be studied, as autonomy is a primary motivator in the majority of startups. Moreover, a difference based on the level of experience will also be observed, which will add to the body of knowledge. The data will be collected through the Job Crafting Questionnaire, the Utrecht Work Engagement Scale, and the Psychological Well-Being Scale. To infer the findings, correlation analysis will be used to study the relationship among variables, and a three-way ANOVA will be used to draw comparisons.

Keywords: job crafting, work engagement, well-being, freelancers, start-ups

Procedia PDF Downloads 105
14813 Enhancement of Transaction's Authentication for the Europay, MasterCard, and Visa Contactless Card Payments

Authors: Ossama Al-Maliki

Abstract:

Europay, MasterCard, and Visa (EMV) is one of the most popular payment protocols in the world. The EMV protocol supports Chip and PIN transactions, Chip and Signature transactions, and contactless transactions. The protocol suffers losses of tens of millions of pounds per year due to fraudulent payments, owing to several reported vulnerable points in the protocols used for such payments that allow skimming, replay, cloning, Mole Point of Sale (POS), relay, and other attacks to be conducted. In this paper, we focus on the EMV contactless specification and propose two solutions that add a localization factor to enhance the payment authentication of such transactions, designed to prevent relay, cloning, and Mole-POS attacks. Our proposed solution is a back-end localization scheme that helps the Issuer-Bank compare the location of the genuine cardholder with that of the POS used. Our scheme uses 'something you have', namely the Cardholder Smartphone (CSP), to provide the location of the cardholder at the time of the transaction without impacting the contactless payment time/protocol. The Issuer-Bank obtains the CSP location using tried and tested localization techniques, independently of the cardholder. Neither of our proposed solutions requires infrastructure changes, and both use existing EMV/SP protocol messages to communicate the scheme's information.
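
The core of the described back-end check is a comparison between the CSP-reported cardholder position and the POS location at transaction time. A minimal sketch of such an issuer-side rule is given below; the haversine distance and the 200 m acceptance radius are illustrative assumptions, not parameters from the paper.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def issuer_location_check(csp_pos, pos_terminal_pos, max_distance_m=200.0):
    """Hypothetical issuer-side rule: approve only if the CSP is near the POS."""
    return haversine_m(*csp_pos, *pos_terminal_pos) <= max_distance_m

# True: the reported phone position lies within the illustrative 200 m radius of the terminal.
print(issuer_location_check((51.5007, -0.1246), (51.5014, -0.1270)))
```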

Keywords: NFC, RFID, contactless card, authentication, location, EMV

Procedia PDF Downloads 242
14812 Analysis of Inventory Control, Lot Size and Reorder Point for Engro Polymers and Chemicals

Authors: Ali Akber Jaffri, Asad Naseem, Javeria Khan

Abstract:

The purpose of this study is to determine the safety stock, maximum inventory level, reorder point, and reorder quantity by rearranging lot sizes for supplier and customer in the MRO (maintenance, repair and operations) warehouse of Engro Polymers & Chemicals. To achieve this aim, a physical analysis method and Excel functions were used to process the customer and supplier data provided by the company. Initially, we rearranged the current lot sizes and units of measure (UoM) in the SAP software. Due to the change in lot sizes, new quantities for safety stock, maximum inventory, reorder point and reorder quantity had to be determined according to the company's demand. The proposed system saves cost by reducing the time taken to receive goods from vendors and to issue them to customers, easing material handling in the MRO warehouse and reducing human effort. The information requirements identified in this study can be utilized in calculating the Economic Order Quantity.
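
The abstract notes that the extracted information can feed an Economic Order Quantity calculation; a minimal sketch of the textbook EOQ, safety-stock and reorder-point formulas follows (the demand, cost and lead-time figures are illustrative, not Engro data).

```python
from math import sqrt

def eoq(annual_demand, ordering_cost, holding_cost_per_unit):
    """Classic Economic Order Quantity."""
    return sqrt(2 * annual_demand * ordering_cost / holding_cost_per_unit)

def safety_stock(z, sigma_daily_demand, lead_time_days):
    """Safety stock for a target service level z under daily demand variability."""
    return z * sigma_daily_demand * sqrt(lead_time_days)

def reorder_point(daily_demand, lead_time_days, safety_stock):
    """Reorder when on-hand inventory falls to expected lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

ss = safety_stock(z=1.65, sigma_daily_demand=4, lead_time_days=9)            # ~95% service level
print(round(eoq(annual_demand=1200, ordering_cost=50, holding_cost_per_unit=2), 1))   # 244.9 units
print(round(reorder_point(daily_demand=1200 / 365, lead_time_days=9, safety_stock=ss), 1))
```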

Keywords: carrying cost, economic order quantity, fast moving, lead time, lot size, MRO, maximum inventory, ordering cost, physical inspection, reorder point

Procedia PDF Downloads 239
14811 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality

Authors: Peregrine James Dalziel, Philip Vu Tran

Abstract:

Objective / Introduction: Demand for radiology services from Emergency Departments (ED) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queuing theory indicates that wide variability of process time with the random nature of request arrival increases the probability of significant queues. This can lead to delays in the time-to-availability of radiology reports (TTA-RR) and potentially impaired ED patient flow. In addition, the greater “cognitive workload” of higher volumes may lead to reduced productivity and increased errors. We sought to quantify the potential ED flow improvements obtainable from increased radiology providers serving 3 public hospitals in Melbourne, Australia. We sought to assess the potential productivity gains, quality improvement and the cost-effectiveness of increased labor inputs. Methods & Materials: The Western Health Medical Imaging Department moved from single resident coverage on weekend days 8:30 am-10:30 pm to a limited period of 2-resident coverage 1 pm-6 pm on both weekend days. The TTA-RR for weekend CT scans was calculated from the PACS database for the 8-month period centred on the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two 4-month periods. Daily and hourly scan volume at the time of each CT scan was calculated to assess the impact of varying department workload. To assess any improvement in report quality/errors, a random sample of 200 studies was assessed to compare the average number of clinically significant over-read addendums to reports between the 2 periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy. Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R²=0.29). Increasing daily volume and hourly volume were associated with increased TTA-RR (1.5 min (p<0.01) and 4.8 min (p<0.01), respectively, per additional scan ordered within each time frame). Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p<0.01), with an additional 23.6-minute improvement when 2 residents were on-site concomitantly (p<0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents the increased decision-making time available to ED physicians and a potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs. 7% in the single-resident period, but this difference was not statistically significant (p=0.7). The marginal cost was less than the anticipated economic benefit, assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. Increased labor utilisation is cost-effective when compared with the potential productivity improvement for ED cases requiring CT imaging.
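
A regression of the kind described (isolating the staffing effect on TTA-RR while controlling for the reporting resident, scan type and workload) can be sketched with ordinary least squares; the data file and column names below are assumptions for illustration, not the study's actual variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per weekend CT scan with (hypothetical) columns
# tta_rr_min, resident, scan_type, daily_volume, hourly_volume, double_cover (0/1).
df = pd.read_csv("weekend_ct_reports.csv")  # placeholder file name

model = smf.ols(
    "tta_rr_min ~ C(resident) + C(scan_type) + daily_volume + hourly_volume + double_cover",
    data=df,
).fit()

# The coefficient on double_cover estimates the change in report availability time
# attributable to the second resident, net of workload and case-mix effects.
print(model.params["double_cover"], model.pvalues["double_cover"])
print(model.rsquared)
```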

Keywords: workflow, quality, administration, CT, staffing

Procedia PDF Downloads 112
14810 Non-Parametric Changepoint Approximation for Road Devices

Authors: Loïc Warscotte, Jehan Boreux

Abstract:

The scientific literature on changepoint detection is vast. Today, many methods are available to detect abrupt changes or slight drift in a signal, based on CUSUM or EWMA charts, for example. However, these methods rely on strong assumptions, such as stationarity of the underlying stochastic process, or even independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals or the nature of the changepoint. Despite the accurate description of the mathematical aspects, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection if the characteristics of the process are completely unknown. In this paper, we therefore address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes. Both abrupt changes and drift are tested. Finally, this relaxed methodology is tested on real signals coming from an HS-WIM device in Belgium, collected over several months.
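
For contrast with the non-parametric approach developed in the paper, the classical CUSUM chart mentioned above can be stated compactly; the reference value k and threshold h below are generic textbook choices, not values from this work.

```python
import numpy as np

def tabular_cusum(x, target, k=0.5, h=5.0):
    """One-sided upper/lower CUSUM statistics; returns the first index where either exceeds h."""
    s_hi = s_lo = 0.0
    for i, xi in enumerate(x):
        s_hi = max(0.0, s_hi + (xi - target - k))
        s_lo = max(0.0, s_lo + (target - xi - k))
        if s_hi > h or s_lo > h:
            return i  # first alarm
    return None

rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 200)])  # mean shift at t=200
print(tabular_cusum(signal, target=0.0))  # alarm shortly after the change
```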

Keywords: changepoint, weigh-in-motion, process, non-parametric

Procedia PDF Downloads 78
14809 Anaerobic Digestion of Coffee Wastewater from a Fast Inoculum Adaptation Stage: Replacement of Complex Substrate

Authors: D. Lepe-Cervantes, E. Leon-Becerril, J. Gomez-Romero, O. Garcia-Depraect, A. Lopez-Lopez

Abstract:

In this study, raw coffee wastewater (CWW) was used as a complex substrate for anaerobic digestion. An inoculum adaptation stage, microbial diversity analysis and biomethane potential (BMP) tests were performed. A fast inoculum adaptation stage was achieved by the gradual replacement of vinasse with CWW in an anaerobic sequential batch reactor (AnSBR) operated at mesophilic conditions. Illumina MiSeq sequencing was used to analyze the microbial diversity, while BMP tests using the inoculum adapted to CWW were carried out at different inoculum-to-substrate (I/S) ratios (2:1, 3:1 and 4:1, on a VS basis). Results show that the adaptability percentage increased gradually until it reached the highest theoretical value in a short time of 10 d, with a methane yield of 359.10 NmL CH4/g COD-removed; Methanobacterium beijingense was the most abundant microorganism (75%), and the greatest specific methane production was achieved at an I/S ratio of 4:1, whereas the lowest was obtained at 2:1, with BMP values of 320 NmL CH4/g VS and 151 NmL CH4/g VS, respectively. In conclusion, gradual replacement of the substrate was a feasible method to adapt the inoculum in a short time, even using complex raw substrates, whereas in the BMP tests, the specific methane production was proportional to the initial amount of inoculum.

Keywords: anaerobic digestion, biomethane potential test, coffee wastewater, fast inoculum adaptation

Procedia PDF Downloads 381
14808 Application of Stochastic Models to Annual Extreme Streamflow Data

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and forecast them into the future. For the analysis, annual extreme streamflow data of the Jelogir Majin station (upstream of the Karkheh dam reservoir) for the years 1958-2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) in order to develop the ARIMA model. Interestingly, the ARIMA(4,1,1) model developed was found to be the most suitable for simulating annual extreme streamflow for the Karkheh River. The model was found to be appropriate to forecast ten years of annual extreme streamflow and to assist decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) packages were used to determine the best model for this series.
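
A minimal statsmodels sketch of the modelling workflow described (first-order differencing and an ARIMA(4,1,1) fit with a ten-step forecast); the data loading is a placeholder, and the SAS/SPSS-based model selection reported in the paper is not reproduced here.

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Annual extreme streamflow at Jelogir Majin, 1958-2005 (placeholder file/column names).
series = pd.read_csv("jelogir_majin_peaks.csv", index_col="year")["peak_flow"]

# d=1 handles the slight increasing trend noted from the time plot and ACF/PACF.
model = ARIMA(series, order=(4, 1, 1)).fit()
print(model.summary())

# Ten-year-ahead forecast of annual extreme streamflow.
print(model.forecast(steps=10))
```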

Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river

Procedia PDF Downloads 148
14807 Geovisualization of Human Mobility Patterns in Los Angeles Using Twitter Data

Authors: Linna Li

Abstract:

The capability to move around places is doubtless very important for individuals to maintain good health and social functions. People’s activities in space and time have long been a research topic in behavioral and socio-economic studies, particularly focusing on the highly dynamic urban environment. By analyzing groups of people who share similar activity patterns, many socio-economic and socio-demographic problems and their relationships with individual behavior preferences can be revealed. Los Angeles, known for its large population, ethnic diversity, cultural mixing, and entertainment industry, faces great transportation challenges such as traffic congestion, parking difficulties, and long commuting. Understanding people’s travel behavior and movement patterns in this metropolis sheds light on potential solutions to complex problems regarding urban mobility. This project visualizes people’s trajectories in Greater Los Angeles (L.A.) Area over a period of two months using Twitter data. A Python script was used to collect georeferenced tweets within the Greater L.A. Area including Ventura, San Bernardino, Riverside, Los Angeles, and Orange counties. Information associated with tweets includes text, time, location, and user ID. Information associated with users includes name, the number of followers, etc. Both aggregated and individual activity patterns are demonstrated using various geovisualization techniques. Locations of individual Twitter users were aggregated to create a surface of activity hot spots at different time instants using kernel density estimation, which shows the dynamic flow of people’s movement throughout the metropolis in a twenty-four-hour cycle. In the 3D geovisualization interface, the z-axis indicates time that covers 24 hours, and the x-y plane shows the geographic space of the city. Any two points on the z axis can be selected for displaying activity density surface within a particular time period. In addition, daily trajectories of Twitter users were created using space-time paths that show the continuous movement of individuals throughout the day. When a personal trajectory is overlaid on top of ancillary layers including land use and road networks in 3D visualization, the vivid representation of a realistic view of the urban environment boosts situational awareness of the map reader. A comparison of the same individual’s paths on different days shows some regular patterns on weekdays for some Twitter users, but for some other users, their daily trajectories are more irregular and sporadic. This research makes contributions in two major areas: geovisualization of spatial footprints to understand travel behavior using the big data approach and dynamic representation of activity space in the Greater Los Angeles Area. Unlike traditional travel surveys, social media (e.g., Twitter) provides an inexpensive way of data collection on spatio-temporal footprints. The visualization techniques used in this project are also valuable for analyzing other spatio-temporal data in the exploratory stage, thus leading to informed decisions about generating and testing hypotheses for further investigation. The next step of this research is to separate users into different groups based on gender/ethnic origin and compare their daily trajectory patterns.
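
The hourly activity-density surfaces described can be outlined with a standard kernel density estimate over georeferenced tweet locations; the input file and column names below are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from scipy.stats import gaussian_kde

# tweets.csv is assumed to hold georeferenced tweets with columns: lon, lat, hour (0-23).
tweets = pd.read_csv("tweets.csv")

def hourly_density_surface(df, hour, grid_size=200):
    """Kernel density surface of tweet locations for one hour of the day."""
    pts = df[df["hour"] == hour][["lon", "lat"]].to_numpy().T
    kde = gaussian_kde(pts)
    xs = np.linspace(pts[0].min(), pts[0].max(), grid_size)
    ys = np.linspace(pts[1].min(), pts[1].max(), grid_size)
    xx, yy = np.meshgrid(xs, ys)
    return xx, yy, kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

xx, yy, density = hourly_density_surface(tweets, hour=8)  # morning-commute hot spots
```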

Keywords: geovisualization, human mobility pattern, Los Angeles, social media

Procedia PDF Downloads 119
14804 Self-Serving Anchoring of Self-Judgments

Authors: Elitza Z. Ambrus, Bjoern Hartig, Ryan McKay

Abstract:

Individuals' self-judgments might be malleable and influenced by comparison with a random value. On the one hand, self-judgments reflect our self-image, which is typically considered to be stable in adulthood. Indeed, people also strive hard to maintain a fixed, positive moral image of themselves. On the other hand, research has shown the robustness of the so-called anchoring effect on judgments and decisions. The anchoring effect refers to the influence of a previously considered comparative value (anchor) on a consecutive absolute judgment and reveals that individuals' estimates of various quantities are flexible and can be influenced by a salient random value. The present study extends the anchoring paradigm to the domain of the self. We also investigate whether participants are more susceptible to self-serving anchors, i.e., anchors that enhance participants' self-image, especially their moral self-image. In a pre-registered study conducted via the online platform Prolific, 249 participants (156 females, 89 males, 3 other and 1 who preferred not to specify their gender; M = 35.88, SD = 13.91) ranked themselves on eight personality characteristics. In the anchoring conditions, respondents were asked to first indicate whether they thought they would rank higher or lower than a given anchor value before providing their estimated rank in comparison to 100 other anonymous participants. A high and a low anchor value were employed to differentiate between anchors in a desirable (self-serving) direction and anchors in an undesirable (self-diminishing) direction. In the control treatment, there was no comparison question. Subsequently, participants provided their self-rankings on the eight personality traits, with two personal characteristics for each combination of the factors desirable/undesirable and moral/non-moral. We found evidence of an anchoring effect for self-judgments. Moreover, anchoring was more effective when people were anchored in a self-serving direction: the anchoring effect was enhanced when supporting a more favorable self-view and mitigated (even reversed) when implying a deterioration of the self-image. The self-serving anchoring was more pronounced for moral than for non-moral traits. The data also provided evidence in support of a better-than-average effect in general as well as a magnified better-than-average effect for moral traits. Taken together, these results suggest that self-judgments might not be as stable in adulthood as previously thought. In addition, considerations of constructing and maintaining a positive self-image might interact with the anchoring effect on self-judgments. Potential implications of our results concern the construction and malleability of self-judgments as well as the psychological mechanisms shaping anchoring.

Keywords: anchoring, better-than-average effect, self-judgments, self-serving anchoring

Procedia PDF Downloads 180
14805 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM is developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, owing to its strong classification ability, Bi-LSTM is considered an alternative to the maximum likelihood (ML) algorithm, which is used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with that of LSTM network-aided DL-based OFDM-AIM (DeepAIM) and the classic OFDM-AIM that uses ML-based signal detection, in terms of BER performance and computational time. Simulation results show that Bi-DeepAIM obtains better bit error rate (BER) performance than DeepAIM and lower computation time in signal detection than ML-AIM.
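
A compact sketch of a Bi-LSTM classifier of the general kind evaluated in the paper; the input shape, layer widths and number of AIM subblock classes are illustrative assumptions rather than the Bi-DeepAIM architecture itself.

```python
import tensorflow as tf

# Each training example is assumed to be a sequence of received-symbol features
# (e.g. real/imaginary parts per subcarrier); the label indexes the transmitted AIM subblock.
seq_len, feat_dim, num_classes = 16, 2, 64

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_len, feat_dim)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(128)),  # forward + backward pass over the block
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),  # pick the most likely subblock
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, ...) would replace the exhaustive ML search used in classical detection.
```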

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 72
14804 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivating question of this work is how many devote themselves to discovering something in the world of science, where much has been discerned and revealed but, at the same time, much remains unknown. Methods: The elements underlying this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable with the aim of being constant during the run of the algorithm. In accordance with the given key, the string is divided into several groups of substrings, and each substring has a length of k characters. The next step involves encoding each substring from the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, that k is incremented by 1 when moving to the next substring in that list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm is executed, following the same procedure, until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used unless the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than the other methods from the point of view of execution time and storage space.
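
One possible reading of the procedure described (fixed substring length k, a Caesar shift that starts at k, grows by one per substring and wraps once it exceeds b+1, with 'x' reserved for padding a short final substring) is sketched below; where the abstract is ambiguous, the choices made here are assumptions.

```python
import random
import string

ALPHABET = string.ascii_lowercase

def latin_djokovic_encrypt(plaintext: str, a: int = 3) -> tuple[str, int]:
    """Sketch of the described polyalphabetic scheme; returns ciphertext and the key k."""
    b = a + 3
    k = random.randint(a, b)               # random key between a and b = a + 3
    padded = plaintext + "x" * (-len(plaintext) % k)      # 'x' pads a short final substring
    blocks = [padded[i:i + k] for i in range(0, len(padded), k)]
    shift, out = k, []
    for block in blocks:
        out.append("".join(ALPHABET[(ALPHABET.index(c) + shift) % 26] for c in block))
        shift += 1                          # shift grows by one per substring...
        if shift > b + 1:                   # ...and wraps back to the initial key
            shift = k
    return "".join(out), k

def latin_djokovic_decrypt(ciphertext: str, k: int, a: int = 3) -> str:
    b, shift, out = a + 3, k, []
    for i in range(0, len(ciphertext), k):
        block = ciphertext[i:i + k]
        out.append("".join(ALPHABET[(ALPHABET.index(c) - shift) % 26] for c in block))
        shift = k if shift + 1 > b + 1 else shift + 1
    return "".join(out)

cipher, key = latin_djokovic_encrypt("attackatdawn")
print(cipher, latin_djokovic_decrypt(cipher, key))  # padding 'x' may remain on the tail
```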

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 103
14803 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data

Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello

Abstract:

Although inventories are considered as stocks of money sitting on shelves, they are needed in order to secure constant and continuous production. Therefore, companies need to have control over the amount of inventory in order to find the balance between excess and shortage of inventory. The classification of items according to criteria such as price, usage rate and lead time before arrival allows a company to concentrate its investment in inventory according to a ranking or priority of items. This makes the decision-making process for inventory management easier and more justifiable. The purpose of this paper is to present a new approach for the classification of new items based on already existing criteria. This approach is called Logical Analysis of Data (LAD). It is used in this paper to assist the process of ABC item classification based on multiple criteria. LAD is a data mining technique based on Boolean theory that is used for pattern recognition. This technique has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained when using the well-known AHP technique and the ANN technique. The results show that LAD presents very good classification accuracy.

Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification

Procedia PDF Downloads 881
14802 Application of GIS-Based Construction Engineering: An Electronic Document Management System

Authors: Mansour N. Jadid

Abstract:

This paper describes the implementation of a GIS to provide decision support for successfully monitoring the movement and storage of materials, hence ensuring that finished products travel from the point of origin to the destination construction site through the supply-chain management (SCM) system. The system ensures the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination, reducing construction costs, minimizing time, and enhancing productivity. Such systems are essential to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, mapping the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, this study illustrates the use of a GIS-based SCM as a collaborative tool with innovative methods for implementing Web mapping services, as well as aspects of their integration, by generating an interactive GIS platform for the construction industry.
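
The shortest-path component of the described supply-chain model can be sketched with a standard graph library; the node names and edge weights below are placeholders rather than data from the implemented system.

```python
import networkx as nx

# Hypothetical supply network: nodes are suppliers/depots/sites, edge weights are road distances (km).
G = nx.Graph()
G.add_weighted_edges_from([
    ("supplier_A", "depot_1", 12.0),
    ("supplier_B", "depot_1", 20.0),
    ("depot_1", "site_X", 7.5),
    ("supplier_B", "site_X", 31.0),
])

# Dijkstra's algorithm picks the cheapest supplier-to-site route for material dispatch.
route = nx.shortest_path(G, source="supplier_A", target="site_X", weight="weight")
length = nx.shortest_path_length(G, source="supplier_A", target="site_X", weight="weight")
print(route, length)  # ['supplier_A', 'depot_1', 'site_X'] 19.5
```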

Keywords: construction, coordinate, engineering, GIS, management, map

Procedia PDF Downloads 303
14801 Experimental Study of the Fiber Dispersion of Pulp Liquid Flow in Channels with Application to Papermaking

Authors: Masaru Sumida

Abstract:

This study explored the feasibility of improving the hydraulic headbox of papermaking machines by studying the flow of wood-pulp suspensions behind a flat plate inserted in parallel and convergent channels. Pulp fiber concentrations of the wake downstream of the plate were investigated by flow visualization and optical measurements. Changes in the time-averaged and fluctuation of the fiber concentration along the flow direction were examined. In addition, the control of the flow characteristics in the two channels was investigated. The behaviors of the pulp fibers and the wake flow were found to be strongly related to the flow states in the upstream passages partitioned by the plate. The distribution of the fiber concentration was complex because of the formation of a thin water layer on the plate and the generation of Karman’s vortices at the trailing edge of the plate. Compared with the flow in the parallel channel, fluctuations in the fiber concentration decreased in the convergent channel. However, at low flow velocities, the convergent channel has a weak effect on equilibrating the time-averaged fiber concentration. This shows that a rectangular trailing edge cannot adequately disperse pulp suspensions; thus, at low flow velocities, a convergent channel is ineffective in ensuring uniform fiber concentration.

Keywords: fiber dispersion, headbox, pulp liquid, wake flow

Procedia PDF Downloads 385
14800 Leça da Palmeira Revisited: Sixty-Seven Years of Recurring Work by Álvaro Siza

Authors: Eduardo Jorge Cabral dos Santos Fernandes

Abstract:

Over the last sixty-seven years, Portuguese architect Álvaro Siza Vieira designed several interventions for the Leça da Palmeira waterfront. With this paper, we aim to analyze the history of this set of projects in a chronological approach, seeking to understand the connections that can be established between them. Born in Matosinhos, a fishing and industrial village located near Porto, Álvaro Siza built a remarkable relationship with Leça da Palmeira (a neighboring village located to the north) from a personal and professional point of view throughout his life: it was there that he got married (in the small chapel located next to the Boa Nova lighthouse) and it was there that he designed his first works of great impact, the Boa Nova Tea House and the Ocean Swimming Pool, today classified as national monuments. These two works were the subject of several projects spaced over time, including recent restoration interventions designed by the same author. However, the marks of Siza's intervention in this territory are not limited to these two cases; there were other projects designed for this territory, which we also intend to analyze: the monument to the poet António Nobre (1967-80), the unbuilt project for a restaurant next to Piscina das Marés (presented in 1966 and redesigned in 1993), the reorganization of the Avenida da Liberdade (with a first project, not carried out, in 1965-74, and a reformulation carried out between 1998 and 2006) and, finally, the project for the new APDL facilities, which completes Avenida da Liberdade to the south (1995). Altogether, these interventions are so striking in this territory, from a landscape, formal, functional, and tectonic point of view, that it is difficult to imagine this waterfront without their presence. In all cases, the relationship with the site explains many of the design options. Time after time, the conditions of the pre-existing territory (also affected by the previous interventions of Siza) were considered, so each project created a new circumstance, conditioning the following interventions. This paper is part of a more comprehensive project, which aims to analyze the work of Álvaro Siza in its fundamental relationship with the site.

Keywords: Álvaro Siza, contextualism, Leça da Palmeira, landscape

Procedia PDF Downloads 31
14799 Spectrophotometric Determination of Photohydroxylated Products of Humic Acid in the Presence of Salicylate Probe

Authors: Julide Hizal Yucesoy, Batuhan Yardimci, Aysem Arda, Resat Apak

Abstract:

Humic substances produce reactive oxygen species such as hydroxyl, phenoxy and superoxide radicals by oxidizing over a wide pH and reduction potential range. Hydroxyl radicals, produced by reducing agents such as antioxidants and/or peroxides, attack the salicylate probe and form 2,3-dihydroxybenzoate, 2,4-dihydroxybenzoate and 2,5-dihydroxybenzoate species. These species are quantitatively determined using the HPLC method. Humic substances undergo photodegradation under UV radiation. As a result of their antioxidant properties, they produce hydroxyl radicals. In the presence of a salicylate probe, these hydroxyl radicals react with salicylate molecules to form hydroxylated products (dihydroxybenzoate isomers). In this study, humic acid was photodegraded in a photoreactor at 254 nm (400 W), and the hydroxyl radicals formed were trapped by the salicylate probe. The total concentration of hydroxylated salicylate species was measured using the spectrophotometric CUPRAC method. In addition, using the results of time-dependent experiments, the kinetics of photohydroxylation were determined at different pH values. This method has been applied for the first time to measure the concentration of hydroxylated products. It allows the results to be obtained more easily than with the HPLC method.
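
The time-dependent CUPRAC measurements lend themselves to a simple kinetic fit; the pseudo-first-order form below is an assumption for illustration (the abstract does not state the rate law), and the data points are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_first_order(t, c_max, k):
    """Hydroxylated-product concentration approaching a plateau c_max with rate constant k."""
    return c_max * (1.0 - np.exp(-k * t))

# Placeholder series: irradiation time (min) vs. total dihydroxybenzoate concentration (uM).
t = np.array([0, 10, 20, 40, 60, 90, 120], dtype=float)
c = np.array([0.0, 3.1, 5.4, 8.2, 9.6, 10.5, 10.8])

(c_max, k), _ = curve_fit(pseudo_first_order, t, c, p0=(10.0, 0.02))
print(round(c_max, 2), round(k, 4))  # fitted plateau and rate constant at one pH
```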

Keywords: CUPRAC method, humic acid, photohydroxylation, salicylate probe

Procedia PDF Downloads 206
14798 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario

Authors: Adel Gurel, Ozge Ceylin Yildirim

Abstract:

Nowadays, design and architecture are being affected and transformed by rapid advancements in technology, economics, politics, society and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. The integration of design into the computational environment has revolutionized architecture, and new perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analyzing the integration between technology and the historical architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period that emerges with the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation through technology are supported with a detailed literature review and consolidated by examining focal points of 20th-century architecture under the titles parametric design, genetic architecture, simulation, and biomimicry. It is concluded from the historical research between past and present that developments in architecture cannot keep up with advancements in technology; recent developments in technology overshadow architecture, and technology even decides the direction of architecture. As a result, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.

Keywords: computer technologies, future architecture, scientific developments, transformation

Procedia PDF Downloads 191
14797 Artificial Neurons Based on Memristors for Spiking Neural Networks

Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi

Abstract:

Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits, used to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spiking-time-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method to construct hardware neuromorphic computing systems.
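
The leaky integrate-and-fire behaviour that the TS memristor is used to emulate can be stated compactly in software; the sketch below is the textbook LIF model with illustrative parameters, not a model of the TiO₂ device physics.

```python
import numpy as np

def lif_spike_train(i_input, dt=1e-4, tau=20e-3, r=1e7, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: the membrane potential integrates input, leaks, and resets on a spike."""
    v, spikes = v_rest, []
    for i in i_input:
        v += dt / tau * (-(v - v_rest) + r * i)   # leaky integration of the input current
        if v >= v_th:                             # threshold crossing -> emit spike, reset
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

current = np.full(2000, 2e-7)                     # constant 200 nA drive for 0.2 s
spikes = lif_spike_train(current)
print(spikes.sum(), "spikes in 0.2 s")            # regular firing, as in an LIF neuron
```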

Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spiking-time-dependent-plasticity

Procedia PDF Downloads 134
14796 Insight into Localized Fertilizer Placement in Major Cereal Crops

Authors: Solomon Yokamo, Dianjun Lu, Xiaoqin Chen, Huoyan Wang

Abstract:

The current ‘high input-high output’ nutrient management model based on homogenous spreading over the entire soil surface remains a key challenge in China’s farming systems, leading to low fertilizer use efficiency and environmental pollution. Localized placement of fertilizer (LPF) to crop root zones has been proposed as a viable approach to boost crop production while protecting environmental pollution. To assess the potential benefits of LPF on three major crops—wheat, rice, and maize—a comprehensive meta-analysis was conducted, encompassing 85 field studies published from 2002-2023. We further validated the practicability and feasibility of one-time root zone N management based on LPF for the three field crops. The meta-analysis revealed that LPF significantly increased the yields of the selected crops (13.62%) and nitrogen recovery efficiency (REN) (33.09%) while reducing cumulative nitrous oxide (N₂O) emission (17.37%) and ammonia (NH₃) volatilization (60.14%) compared to the conventional surface application (CSA). Higher grain yield and REN were achieved with an optimal fertilization depth (FD) of 5-15 cm, moderate N rates, combined NPK application, one-time deep fertilization, and coarse-textured and slightly acidic soils. Field validation experiments showed that localized one-time root zone N management without topdressing increased maize (6.2%), rice (34.6%), and wheat (2.9%) yields while saving N fertilizer (3%) and also increased the net economic benefits (23.71%) compared to CSA. A soil incubation study further proved the potential of LPF to enhance the retention and availability of mineral N in the root zone over an extended period. Thus, LPF could be an important fertilizer management strategy and should be extended to other less-developed and developing regions to win the triple benefit of food security, environmental quality, and economic gains.

Keywords: grain yield, LPF, NH₃ volatilization, N₂O emission, N recovery efficiency

Procedia PDF Downloads 19
14795 Hard and Soft Skills in Marketing Education: Using Serious Games to Engage Higher Order Processing

Authors: Ann Devitt, Mairead Brady, Markus Lamest, Stephen Gomez

Abstract:

This study set out to explore the use of an online collaborative serious game for student learning in a postgraduate introductory marketing module. The simulation game aimed to bridge the theory-practice divide in marketing by allowing students to apply theory in a safe, simulated marketplace. This study addresses the following research questions: Does an online marketing simulation game engage students' higher-order cognitive skills? Does the required collaborative activity develop students' “soft” skills, such as communication and negotiation? What specific affordances of the online simulation promote learning? This qualitative case study took place in 2014 with 40 postgraduate students on a Business Masters Programme. The two-week intensive module combined lectures with collaborative activity on a marketing simulation game, MMX from Pearsons. The game requires student teams to compete against other teams in a marketplace and design a marketing plan to maximize key performance indicators. The data for this study comprise essays written by students after the module reflecting on their learning in the module. A thematic analysis was conducted of the essays using the following a priori theme sets: 6 levels of the cognitive domain of Bloom's taxonomy; 5 principles of Cooperative Learning; affordances of simulation environments including experiential learning; motivation and engagement; goal orientation. Preliminary findings would strongly suggest that the game facilitated students identifying the value of theory in practice, in particular for future employment; enhanced their understanding of group dynamics and their role within that; and impacted motivation very strongly, both positively and negatively. In particular, the game mechanics of MMX, which hinge on the correct identification of a target consumer group, were identified as a key determinant of extrinsic and intrinsic motivation for learners. The findings also suggest that situating the simulation game within a broader module that required post-game reflection was valuable in identifying key learning of marketing concepts from both the positive and the negative experiences of the game.

Keywords: simulation, marketing, serious game, cooperative learning, bloom's taxonomy

Procedia PDF Downloads 551
14794 Representation and Reality: Media Influences on Japanese Attitudes towards China

Authors: Shuk Ting Kinnia Yau

Abstract:

As China has become more and more influential in the global and geo-political arena, mutual understanding between Japan and China has also become a topic of paramount importance. There have always been tensions between the two countries, but unfortunately, each country tends to blame the other for fanning emotions. This research will investigate portrayals of China and the Chinese people in Japanese media such as newspapers, TV news, TV drama, and cinema over this period, focusing on media sources that have particularly wide viewership or readership. By doing so, it attempts to detect any general trends in the positive or negative character of such portrayals and to see if they correlate with the results of surveys of attitudes among the general population. To the degree that correlations may be found, the question arises as to whether the media portrayals are a reflection of societal attitudes towards the Chinese, on one hand, or may be playing a role in promoting such attitudes, on the other. The relationship here is, without doubt, more complex than a simple one-way relationship of cause and effect, but indications of some direction of causality may be suggested by trends in one occurring before or after the other. Evidence will also be sought of possible longer-term trends in media portrayals of China and the Chinese people in Japan during the post-2012 period, i.e., Abe Shinzo’s second term as prime minister, in comparison to earlier periods. Perceptions of Japan’s view of China and the Chinese, both inside and outside the scholarly world, tend to be oversimplified and are often incomprehensive. This research calls attention to the role played by the media in promoting or de-promoting Sino-Japanese relations. By analyzing the nature and background of images of China and the Chinese people presented in the Japanese media, especially under the new Abe Regime, this research seeks to promote a more balanced and comprehensive understanding of attitudes in Japanese society towards its gigantic neighbor. Scholars have seen the increasingly fragile Sino-Japanese relationship as inseparable from the real-world political conflicts that have become more frequent in recent years and have sought to draw a correlation between the two. The influence of the media, however, remains a mostly under-explored domain in the academic world. Against this background, this research aims to provide an enriched scholarly understanding of Japan’s perception of China by investigating to what extent such perception can be seen to be affected by subjective or selective forms of presentation of China found in the Japanese media, or vice versa.

Keywords: Abe Shinzo, China, Japan, media

Procedia PDF Downloads 309
14793 Eradicating Micronutrient Deficiency through Biofortification

Authors: Ihtasham Hamza

Abstract:

In the contemporary world, where the West is afflicted by the diseases of excess nutrition, much of the rest of the globe suffers from hunger. A troubling constituent of hunger is micronutrient deficiency, also called hidden hunger. Heavy dependence on calorie-rich diets and low dietary diversification are responsible for high malnutrition rates, especially in African and Asian countries. But the dilemma is not without solutions. With the principal cause being sole dependence on staple foods, biofortification has emerged as a novel tool to confront the widespread threat of hidden hunger. Biofortification promises better nutritional access for ordinary communities, overcoming various barriers and reaching the doorstep. Biofortified crops offer a rural-based intervention that, by design, primarily reaches the more remote populations, which comprise a majority of the malnourished in many countries, and then reaches urban populations as production surpluses are marketed. Initial investments in agricultural research at a central location can generate high recurrent benefits at low cost as adapted biofortified cultivars become widely available across countries over time, in contrast to supplementation, which is comparatively expensive and requires continued financing over time that may be imperilled by fluctuating political interest.

Keywords: biofortified crops, hunger, malnutrition, agricultural practices

Procedia PDF Downloads 288
14792 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria

Authors: Ibrahim Rabiu Darazo

Abstract:

This research was conducted on the impact of e-banking on the operational efficiency of banks in Nigeria, taking as a case study some selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc). The research is quantitative and uses both primary and secondary sources of data. Questionnaires were used to obtain the primary data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using the chi-square test, whereas the secondary data were obtained from relevant textbooks, journals and websites. It is clear from the findings that the use of e-banking has improved the efficiency of these banks in terms of providing efficient services to customers electronically through internet banking, telephone banking and ATMs, reducing the time taken to serve customers. E-banking allows new customers to open an account online, and customers have access to their accounts at all times (24/7). E-banking provides access to customer information from the database, and the costs of cheques and postage were eliminated. The recommendations at the end of the research include the following: the banks should keep their electronic equipment up to date; e-fraud (internal and external) should be controlled; banks should employ qualified manpower; and biometric ATMs should be introduced to reduce fraud with ATM cards, as is done in other countries such as the USA.
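
The chi-square analysis mentioned can be outlined with a contingency table of questionnaire responses; the counts below are purely illustrative, not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = bank, columns = "e-banking improved efficiency" (agree / neutral / disagree).
observed = [
    [38, 8, 4],   # Diamond Bank Plc
    [41, 6, 3],   # GTBank Plc
    [36, 9, 5],   # Fidelity Bank Plc
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")  # p < 0.05 would indicate responses differ by bank
```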

Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs

Procedia PDF Downloads 332
14791 A Multiple Perspectives Approach on the Well-Being of Students with Autism Spectrum Disorder

Authors: Joanne Danker, Iva Strnadová, Therese Cumming

Abstract:

As a consequence of the increased evidence of the bi-directional relationship between student well-being and positive educational outcomes, there has been a surge in the number of research studies dedicated to understanding the notion of student well-being and the ways to enhance it. In spite of these efforts, the concept of student well-being remains elusive. Additionally, studies on student well-being mainly consulted adults' perspectives and failed to take into account students' views, which if considered, could contribute to a clearer understanding of the complex concept of student well-being. Furthermore, there is a lack of studies focusing on the well-being of students with autism spectrum disorder (ASD), and these students continue to fare worse in post-school outcomes as compared to students without disabilities, indicating a significant gap in the current research literature. Findings from research conducted on students without disabilities may not be applicable to students with ASD as their educational experiences may differ due to the characteristics associated with ASD. Thus, the purpose of this study was to explore how students with ASD, their parents, and teachers conceptualise student well-being. It also aims to identify the barriers and assets of the well-being of these students. To collect data, 19 teachers and 11 parents participated in interviews while 16 high school students with ASD were involved in a photovoice project regarding their well-being in school. Grounded theory approaches such as open and axial coding, memo-writing, diagramming, and making constant comparisons were adopted to analyse the data. All three groups of participants conceptualised student well-being as a multidimensional construct consisting of several domains. These domains were relationships, engagement, positive/negative emotions, and accomplishment. Three categories of barriers were identified. These were environmental, attitudes and behaviours of others, and impact of characteristics associated with ASD. The identified internal assets that could contribute to student well-being were acceptance, resilience, self-regulation, and ability to work with others. External assets were knowledgeable and inclusive school community, and having access to various school programs and resources. It is crucial that schools and policymakers provide ample resources and programs to adequately support the development of each identified domain of student well-being. This could in turn enhance student well-being and lead to more successful educational outcomes for students with ASD.

Keywords: autism spectrum disorder, grounded theory approach, school experiences, student well-being

Procedia PDF Downloads 288
14790 Identification System for Grading Banana in Food Processing Industry

Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan

Abstract:

In the food industry, high-quality production is required within a limited time to meet demand. In this research work, we have developed a model which can be used to replace the human operator, whose output in production is low and whose decisions on whether a banana is defective or healthy are slow and subject to individual differences. This model can perform the visual task of the human operator in deciding whether a banana is defective or healthy for food production. The work is divided into two phases. The first phase is image processing, where several image processing techniques such as colour conversion, edge detection, thresholding and morphological operations were employed to extract features for training and testing the network in the second phase. The features extracted in the first phase were used in the second phase, the classification phase, where a multilayer perceptron trained with the backpropagation algorithm was employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained, and the processing time was short, which makes the system suitable for use in the food industry.
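
The two-phase pipeline described (image-processing features followed by a backpropagation-trained multilayer perceptron) can be sketched as follows; the chosen features, thresholds and file names are placeholders, not the authors' exact design.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def banana_features(path):
    """Phase 1: colour conversion, edge detection and thresholding -> a small feature vector."""
    img = cv2.imread(path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    _, dark = cv2.threshold(gray, 90, 255, cv2.THRESH_BINARY_INV)  # dark blemishes
    return [hsv[:, :, 0].mean(), hsv[:, :, 1].mean(), edges.mean(), dark.mean()]

# Phase 2: MLP trained with backpropagation on labelled examples (0 = healthy, 1 = defective).
paths, labels = ["healthy_01.jpg", "defective_01.jpg"], [0, 1]   # placeholder file names
X = np.array([banana_features(p) for p in paths])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, labels)
print(clf.predict(X))
```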

Keywords: banana, food processing, identification system, neural network

Procedia PDF Downloads 471
14789 Plackett-Burman Design to Evaluate the Influence of Operating Parameters on Anaerobic Orthophosphate Release from Enhanced Biological Phosphorus Removal Sludge

Authors: Reza Salehi, Peter L. Dold, Yves Comeau

Abstract:

The aim of the present study was to investigate the effect of a total of 6 operating parameters, including pH (X1), temperature (X2), stirring speed (X3), chemical oxygen demand (COD) (X4), volatile suspended solids (VSS) (X5) and time (X6), on anaerobic orthophosphate release from enhanced biological phosphorus removal (EBPR) sludge. An 8-run Plackett-Burman design was applied, and the statistical analysis of the experimental data was performed using the Minitab 16.2.4 software package. The analysis of variance (ANOVA) results revealed that temperature, COD, VSS and time had a significant effect, with p-values of less than 0.05, whereas pH and stirring speed were identified as non-significant parameters that nevertheless influenced orthophosphate release from the EBPR sludge. The mathematical expression obtained by the first-order multiple linear regression model relating orthophosphate release from the EBPR sludge (Y) to the operating parameters (X1-X6) was Y = 18.59 + 1.16X1 - 3.11X2 - 0.81X3 + 3.79X4 + 9.89X5 + 4.01X6. The model p-value and coefficient of determination (R²) were 0.026 and 99.87%, respectively, which indicates that the model is significant and that the predicted values of orthophosphate release from the EBPR sludge correlate excellently with the observed values.
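
The fitted first-order model reported above can be used directly to predict orthophosphate release at given factor levels; the helper below simply evaluates that equation (the -1/+1 coding of the Plackett-Burman levels is an assumption).

```python
def predicted_po4_release(x1, x2, x3, x4, x5, x6):
    """First-order Plackett-Burman model from the study:
    pH, temperature, stirring speed, COD, VSS, time (coded levels)."""
    return 18.59 + 1.16 * x1 - 3.11 * x2 - 0.81 * x3 + 3.79 * x4 + 9.89 * x5 + 4.01 * x6

# All six factors at their high (+1) coded level.
print(predicted_po4_release(1, 1, 1, 1, 1, 1))     # 33.52
# Signs chosen to maximise the predicted response (temperature and stirring speed low).
print(predicted_po4_release(1, -1, -1, 1, 1, 1))   # 41.36
```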

Keywords: anaerobic, operating parameters, orthophosphate release, Plackett-Burman design

Procedia PDF Downloads 279
14788 Assessment of Hypersaline Outfalls via Computational Fluid Dynamics Simulations: A Case Study of the Gold Coast Desalination Plant Offshore Multiport Brine Diffuser

Authors: Mitchell J. Baum, Badin Gibbes, Greg Collecutt

Abstract:

This study details a three-dimensional field-scale numerical investigation conducted for the Gold Coast Desalination Plant (GCDP) offshore multiport brine diffuser. Quantitative assessment of diffuser performance with regard to trajectory, dilution and mapping of seafloor concentration distributions was conducted for 100% plant operation. The quasi-steady Computational Fluid Dynamics (CFD) simulations were performed using the Reynolds-averaged Navier-Stokes equations with a k-ω shear stress transport turbulence closure scheme. The study complements a field investigation, which measured brine plume characteristics under similar conditions. CFD models used an iterative mesh in a domain with dimensions 400 m long, 200 m wide and an average depth of 24.2 m. Acoustic Doppler current profiler measurements conducted in the companion field study exhibited considerable variability over the water column. The effect of this vertical variability on simulated discharge outcomes was examined. Seafloor slope was also accommodated in the model. Ambient currents varied predominantly in the longshore direction, perpendicular to the diffuser structure. Under these conditions, the alternating port orientation of the GCDP diffuser resulted in simultaneous subjection to co-propagating and counter-propagating ambient regimes. Results from quiescent ambient simulations suggest broad agreement with the empirical scaling arguments traditionally employed in design and regulatory assessments. Simulated dynamic ambient regimes showed that the influence of ambient crossflow upon jet trajectory, dilution and seafloor concentration is significant. The effect of ambient flow structure and its subsequent influence on jet dynamics is discussed, along with the implications for using these different simulation approaches to inform regulatory decisions.

Keywords: computational fluid dynamics, desalination, field-scale simulation, multiport brine diffuser, negatively buoyant jet

Procedia PDF Downloads 214
14787 Response Surface Methodology to Obtain Disopyramide Phosphate Loaded Controlled Release Ethyl Cellulose Microspheres

Authors: Krutika K. Sawant, Anil Solanki

Abstract:

The present study deals with the preparation and optimization of ethyl cellulose microspheres loaded with disopyramide phosphate using the solvent evaporation technique. A central composite design consisting of a two-level full factorial design superimposed on a star design was employed for optimizing the preparation of the microspheres. The drug:polymer ratio (X1) and the stirring speed (X2) were chosen as the independent variables. The cumulative release of the drug at different times (2, 6, 10, 14, and 18 hr) was selected as the dependent variable. An optimum polynomial equation was generated for the prediction of the response variable at 10 hr. Based on the results of multiple linear regression analysis and F statistics, it was concluded that sustained action can be obtained when X1 and X2 are kept at high levels. The X1X2 interaction was found to be statistically significant. The drug release pattern fitted the Higuchi model well. The data of a selected batch were subjected to an optimization study using a Box-Behnken design, and an optimal formulation was fabricated. Good agreement was observed between the predicted and the observed dissolution profiles of the optimal formulation.
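
The Higuchi fit mentioned for the release data can be sketched as a one-parameter regression, Q(t) = kH * sqrt(t); the release values below are placeholders, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def higuchi(t, k_h):
    """Cumulative release proportional to the square root of time."""
    return k_h * np.sqrt(t)

t_hr = np.array([2, 6, 10, 14, 18], dtype=float)        # sampling times used in the design
release = np.array([21.0, 37.0, 48.0, 56.0, 64.0])      # placeholder cumulative % released

(k_h,), _ = curve_fit(higuchi, t_hr, release)
ss_res = np.sum((release - higuchi(t_hr, k_h)) ** 2)
ss_tot = np.sum((release - release.mean()) ** 2)
print(round(k_h, 2), round(1 - ss_res / ss_tot, 3))      # Higuchi constant and R^2 of the fit
```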

Keywords: disopyramide phosphate, ethyl cellulose, microspheres, controlled release, Box-Behnken design, factorial design

Procedia PDF Downloads 458
14786 Analysis of Residents’ Travel Characteristics and Policy Improving Strategies

Authors: Zhenzhen Xu, Chunfu Shao, Shengyou Wang, Chunjiao Dong

Abstract:

To improve the satisfaction of residents' travel, this paper analyzes the characteristics and influencing factors of urban residents' travel behavior. First, a Multinomial Logit Model (MNL) is built to analyze the characteristics of residents' travel behavior, reveal the influence of individual attributes, family attributes and travel characteristics on the choice of travel mode, and identify the significant factors. Suggestions for policy improvement are then put forward. Finally, Support Vector Machine (SVM) and Multi-Layer Perceptron (MLP) models are introduced to evaluate the policy effect. This paper selects Futian Street in Futian District, Shenzhen City, for investigation and research. The results show that gender, age, education, income, number of cars owned, travel purpose, departure time, journey time, travel distance and number of trips all have a significant influence on residents' choice of travel mode. Based on the above results, two policy improvement suggestions are put forward from the perspective of reducing public transportation and non-motor vehicle travel times, and the policy effect is evaluated. Before the evaluation, the prediction performance of the MNL, SVM and MLP models was assessed. After parameter optimization, the prediction accuracies of the three models were 72.80%, 71.42%, and 76.42%, respectively. The MLP model, with the highest prediction accuracy, was selected to evaluate the effect of the policy improvements. The results showed that after the implementation of the policies, the proportion of public transportation in plan 1 and plan 2 increased by 14.04% and 9.86%, respectively, while the proportion of private cars decreased by 3.47% and 2.54%, respectively. The proportion of car trips decreased obviously, while the proportion of public transport trips increased. It can be considered that the measures have a positive effect on promoting green travel and improving the satisfaction of urban residents, and they can provide a reference for relevant departments when formulating transportation policies.
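
The model comparison reported (MNL vs. SVM vs. MLP on mode choice) can be outlined with scikit-learn; the survey file and feature list are assumptions for illustration.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Assumed survey layout: one row per trip, categorical/numeric attributes and the chosen mode.
df = pd.read_csv("futian_travel_survey.csv")
features = ["gender", "age", "education", "income", "cars_owned",
            "purpose", "departure_hour", "journey_time", "distance", "trips_per_day"]
X = pd.get_dummies(df[features], drop_first=True)
y = df["mode"]  # e.g. walk / bike / bus / metro / car

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    # LogisticRegression fits a multinomial logit for multi-class targets with the default solver.
    "MNL": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000)),
}
for name, model in models.items():
    print(name, round(model.fit(X_tr, y_tr).score(X_te, y_te), 4))  # hold-out prediction accuracy
```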

Keywords: neural network, travel characteristics analysis, transportation choice, travel sharing rate, traffic resource allocation

Procedia PDF Downloads 138