Search results for: step count
1896 A Collaborative Teaching and Learning Model between Academy and Industry for Multidisciplinary Engineering Education
Authors: Moon-Soo Kim
Abstract:
In order to cope with the increasing demand for multidisciplinary learning between academy and industry, a collaborative teaching and learning model and related operational tools enabling applications to engineering education are essential. This study proposes a web-based collaborative framework for interactive teaching and learning between academy and industry as an initial step for the development of a web- and mobile-based integrated system for both engineering students and industrial practitioners. The proposed web-based collaborative teaching and learning framework defines several entities, such as learner, solver and supporter or sponsor for industrial problems, and also has a systematic architecture to build an information system including diverse functions enabling effective interaction among the defined entities regardless of time and place. Furthermore, the framework, which includes a knowledge and information self-reinforcing mechanism, focuses on previous problem-solving records as well as their creative reuse by subsequent learners in the solving process of new problems.
Keywords: collaborative teaching and learning model, academy and industry, web-based collaborative framework, self-reinforcing mechanism
Procedia PDF Downloads 326
1895 Ragging and Sludging Measurement in Membrane Bioreactors
Authors: Pompilia Buzatu, Hazim Qiblawey, Albert Odai, Jana Jamaleddin, Mustafa Nasser, Simon J. Judd
Abstract:
Membrane bioreactor (MBR) technology is challenged by the tendency for the membrane permeability to decrease due to ‘clogging’. Clogging includes ‘sludging’, the filling of the membrane channels with sludge solids, and ‘ragging’, the aggregation of short filaments to form long rag-like particles. Both sludging and ragging demand manual intervention to clear out the solids, which is time-consuming, labour-intensive and potentially damaging to the membranes. These factors impact on costs more significantly than membrane surface fouling which, unlike clogging, is largely mitigated by the chemical clean. However, practical evaluation of MBR clogging has thus far been limited. This paper presents the results of recent work attempting to quantify sludging and clogging based on simple bench-scale tests. Results from a novel ragging simulation trial indicated that rags can be formed within 24-36 hours from dispersed < 5 mm-long filaments at concentrations of 5-10 mg/L under gently agitated conditions. Rag formation occurred for both a cotton wool standard and samples taken from an operating municipal MBR, with between 15% and 75% of the added fibrous material forming a single rag. The extent of rag formation depended both on the material type or origin – lint from laundering operations forming zero rags – and the filament length. Sludging rates were quantified using a bespoke parallel-channel test cell representing the membrane channels of an immersed flat sheet MBR. Sludge samples were provided from two local MBRs, one treating municipal and the other industrial effluent. Bulk sludge properties measured comprised mixed liquor suspended solids (MLSS) concentration, capillary suction time (CST), particle size, soluble COD (sCOD) and rheology (apparent viscosity μₐ vs shear rate γ). The fouling and sludging propensity of the sludge was determined using the test cell, ‘fouling’ being quantified as the pressure increase rate against flux via the flux-step test (for which clogging was absent) and sludging by photographing the channel and processing the image to determine the ratio of the clogged to unclogged regions. A substantial difference in rheological and fouling behaviour was evident between the two sludge sources, the industrial sludge having a higher viscosity but less shear-thinning than the municipal. Fouling, as manifested by the pressure increase Δp/Δt as a function of flux from classic flux-step experiments (where no clogging was evident), was more rapid for the industrial sludge. Across all samples of both sludge origins the expected trend of increased fouling propensity with increased CST and sCOD was demonstrated, whereas no correlation was observed between clogging rate and these parameters. The relative contribution of fouling and clogging was appraised by adjusting the clogging propensity via increasing the MLSS both with and without a commensurate increase in the COD. Results indicated that whereas for the municipal sludge the fouling propensity was affected by the increased sCOD, there was no associated increase in the sludging propensity (or cake formation). The clogging rate actually decreased on increasing the MLSS. Against this, for the industrial sludge the clogging rate dramatically increased with solids concentration despite a decrease in the soluble COD. From this it was surmised that sludging did not relate to fouling.
Keywords: clogging, membrane bioreactors, ragging, sludge
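The sludging metric above, the ratio of clogged to unclogged channel regions extracted from a photograph, lends itself to a short image-processing sketch. A minimal Python version is given below; the grayscale threshold and function name are illustrative assumptions, since the abstract does not describe the authors' actual image-processing pipeline.

```python
import numpy as np
from PIL import Image

def sludging_ratio(image_path, threshold=100):
    """Estimate the clogged-to-unclogged area ratio of a membrane channel.

    Pixels darker than `threshold` (0-255 grayscale) are treated as
    sludge-covered; the threshold value is assumed, not taken from the paper.
    """
    gray = np.asarray(Image.open(image_path).convert("L"), dtype=np.uint8)
    clogged = np.count_nonzero(gray < threshold)
    unclogged = gray.size - clogged
    return clogged / max(unclogged, 1)
```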
Procedia PDF Downloads 186
1894 Quantum Dot Biosensing for Advancing Precision Cancer Detection
Authors: Sourav Sarkar, Manashjit Gogoi
Abstract:
In the evolving landscape of cancer diagnostics, optical biosensing has emerged as a promising tool due to its sensitivity and specificity. This study explores the potential of CdS/ZnS core-shell quantum dots (QDs) capped with 3-Mercaptopropionic acid (3-MPA), which aids in the linking chemistry of QDs to various cancer antibodies. The QDs, with their unique optical and electronic properties, have been integrated into the biosensor design. Their high quantum yield and size-dependent emission spectra have been exploited to improve the sensor’s detection capabilities. The study presents the design of this QD-enhanced optical biosensor. The use of these QDs can also aid multiplexed detection, enabling simultaneous monitoring of different cancer biomarkers. This innovative approach holds significant potential for advancing cancer diagnostics, contributing to timely and accurate detection. Future work will focus on optimizing the biosensor design for clinical applications and exploring the potential of QDs in other biosensing applications. This study underscores the potential of integrating nanotechnology and biosensing for cancer research, paving the way for next-generation diagnostic tools. It is a step forward in our quest for achieving precision oncology.
Keywords: quantum dots, biosensing, cancer, device
Procedia PDF Downloads 61
1893 An Adaptive Virtual Desktop Service in Cloud Computing Platform
Authors: Shuen-Tai Wang, Hsi-Ya Chang
Abstract:
Cloud computing has become increasingly mature over the last few years, and consequently the demand for better cloud services is increasing rapidly. One of the research topics for improving cloud services is desktop computing in a virtualized environment. This paper aims at the development of an adaptive virtual desktop service in a cloud computing platform, based on our previous research on virtualization technology. We implement cloud virtual desktop and application software streaming technology that make it possible to provide Virtual Desktop as a Service (VDaaS). Given the development of remote desktop virtualization, the user’s desktop can be shifted from the traditional PC environment to the cloud-enabled environment, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient and elastic environment for online cloud service. Users no longer need to bear the burden of platform maintenance, which drastically reduces the overall cost of hardware and software licenses. Moreover, this flexible remote desktop service represents the next significant step toward the mobile workplace, letting users access their desktop environments from virtually anywhere.
Keywords: cloud computing, virtualization, virtual desktop, VDaaS
Procedia PDF Downloads 288
1892 Seismic Microzonation of El-Fayoum New City, Egypt
Authors: Suzan Salem, Heba Moustafa, Abd El-Aziz Abd El-Aal
Abstract:
Seismic microhazard zonation for urban areas is the first step towards a seismic risk analysis and mitigation strategy. Essential here is to obtain a proper understanding of the local subsurface conditions and to evaluate ground-shaking effects. In the present study, an attempt has been made to evaluate the seismic hazard considering local site effects by carrying out detailed geotechnical and geophysical site characterization in El-Fayoum New City. Seismic hazard analysis and microzonation of El-Fayoum New City are addressed in three parts: in the first part, estimation of seismic hazard is done using seismotectonic and geological information. The second part deals with site characterization using geotechnical and shallow geophysical techniques. In the last part, local site effects are assessed by carrying out one-dimensional (1-D) ground response analysis using the equivalent linear method with the program SHAKE 2000. Finally, microzonation maps have been prepared. The detailed methodology, along with experimental details, collected data, results and maps, is presented in this paper.
Keywords: El-Fayoum, microzonation, seismotectonic, Egypt
Procedia PDF Downloads 385
1891 Storage Assignment Strategies to Reduce Manual Picking Errors with an Emphasis on an Ageing Workforce
Authors: Heiko Diefenbach, Christoph H. Glock
Abstract:
Order picking, i.e., the order-based retrieval of items in a warehouse, is an important time- and cost-intensive process for many logistic systems. Despite the ongoing trend of automation, most order picking systems are still manual picker-to-parts systems, where human pickers walk through the warehouse to collect ordered items. Human work in warehouses is not free from errors, and order pickers may at times pick the wrong item or the incorrect number of items. Errors can cause additional costs and significant correction efforts. Moreover, age might increase a person’s likelihood to make mistakes. Hence, the negative impact of picking errors might increase for the aging workforce currently witnessed in many regions globally. A significant amount of research has focused on making order picking systems more efficient. Among other factors, storage assignment, i.e., the assignment of items to storage locations (e.g., shelves) within the warehouse, has been subject to optimization. Usually, the objective is to assign items to storage locations such that order picking times are minimized. Surprisingly, there is a lack of research concerned with picking errors and respective prevention approaches. This paper hypothesizes that the storage assignment of items can affect the probability of picking errors. For example, storing similar-looking items apart from one another might reduce confusion. Moreover, storing items that are hard to count or require a lot of counting at easy-to-access and easy-to-comprehend shelf heights might reduce the probability of picking the wrong number of items. Based on this hypothesis, the paper discusses how to incorporate error-prevention measures into mathematical models for storage assignment optimization. Various approaches with respective benefits and shortcomings are presented and mathematically modeled. To investigate the newly developed models further, they are compared to conventional storage assignment strategies in a computational study. The study specifically investigates how the importance of error prevention increases with pickers being more prone to errors due to age, for example. The results suggest that considering error-prevention measures for storage assignment can reduce error probabilities with only minor decreases in picking efficiency. The results might be especially relevant for an aging workforce.
Keywords: an aging workforce, error prevention, order picking, storage assignment
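One way the error-prevention measures described above can enter a storage assignment model is as an error-cost term added to the usual travel-time objective. The sketch below solves such a combined objective as a linear assignment problem; the cost matrices and the error-cost weight are hypothetical placeholders, not the models developed in the paper.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n = 6  # toy problem: n items, n candidate storage locations

walk_time = rng.uniform(1.0, 5.0, size=(n, n))   # picking-time cost of item i at location j
error_prob = rng.uniform(0.0, 0.1, size=(n, n))  # assumed pick-error probability of item i at location j
error_cost = 50.0                                # assumed cost of correcting one picking error

# Combined objective: picking time plus expected error-correction cost.
cost = walk_time + error_cost * error_prob
items, locations = linear_sum_assignment(cost)
print(dict(zip(items.tolist(), locations.tolist())), round(cost[items, locations].sum(), 2))
```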
Procedia PDF Downloads 209
1890 Urban Transport System Resilience Guidelines
Authors: Evangelia Gaitanidou, Evangelos Bekiaris
Abstract:
Considering that resilience implies the ability of a system to adapt continuously in order to respond to its operational goals, a system is considered more or less resilient depending on the level and time of recovery from disruptive events and/or shocks to its initial state. Regarding transport systems, enhancing resilience is considered imperative for two main reasons: such systems provide critical support to every socio-economic activity, while being one of the most important economic sectors, and, secondly, the paths that convey people, goods and information are the same ones through which risks are propagated. The RESOLUTE (RESilience management guidelines and Operationalization appLied to Urban Transport Environment) Horizon 2020 research project answers those needs by proposing and testing a set of guidelines for resilience management of the urban transport system. The methods and steps towards this goal, following a step-wise methodology that takes into account established models like FRAM (Functional Resonance Analysis Model) and builds upon gathered existing practices, are described in this paper, together with an overview of the produced guidelines. The overall aim is to create a framework which public transport authorities could consult and apply to render their infrastructure resilient against natural disasters and other threats.
Keywords: guidelines, infrastructure, resilience, transport
Procedia PDF Downloads 252
1889 Human Rights Legislations and Evolution Effect on Attitudes
Authors: Sherin Kamal Zaki Kallini
Abstract:
The ratification of an international human rights legal instrument affords signatory States an opportunity to assume a set of obligations and rights for the benefit of their residents, providing expanded opportunities and means to access an improved quality of life – to be, to appear, and to become. Developed countries commonly experience cultural, political, social, monetary, legal, and regulatory alterations in response to this transition. In a methodologically proactive approach, mechanisms undergo a visible and understandable process of qualitative and quantitative change. Conversely, in countries undergoing development, the response to such ratification varies: some display positive policy changes, whilst others stay stagnant or regress. Cameroon falls into the second category, despite efforts, as it legally excludes 50% of its population with disabilities from obtaining the status of a person with a disability. The overarching goal of this communication is to highlight these deficiencies and their adverse consequences on various aspects of life, fostering awareness among beneficiaries and advocating for more inclusive change within the country. Our project employs a popular and participatory methodological approach by involving beneficiaries and their communities in its preparation. It is also inclusive, representing the diversity of disabilities and engaging natural and legal persons from various backgrounds. Active consultations occur at all stages of the activities. Anticipated outcomes include raising awareness globally among countries, international cooperation agencies, NGOs, and other inclusive development actors. We seek their support for local advocacy efforts to fully enforce the United Nations Convention on the Rights of Persons with Disabilities (CRPD). Concurrently, we hope they express solidarity with the victims in Cameroon who have been left behind and advocate legal reforms to align domestic and international policies with the promotion and protection of disability rights.
Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security
Procedia PDF Downloads 14
1888 Evaluation Methods for Question Decomposition Formalism
Authors: Aviv Yaniv, Ron Ben Arosh, Nadav Gasner, Michael Konviser, Arbel Yaniv
Abstract:
This paper introduces two methods for the evaluation of Question Decomposition Meaning Representation (QDMR) as predicted by a sequence-to-sequence model and the COPYNET parser for natural language question processing, motivated by the fact that previous evaluation metrics used for this task do not take into account some characteristics of the representation, such as its partial ordering structure. To this end, several heuristics to extract such partial dependencies are formulated, followed by the proposed evaluation methods, denoted Proportional Graph Matcher (PGM) and Conversion to Normal String Representation (Nor-Str), designed to better capture the accuracy level of QDMR predictions. Experiments are conducted to demonstrate the efficacy of the proposed evaluation methods and show the added value of one of them, Nor-Str, for better distinguishing between high- and low-quality QDMR when predicted by models such as COPYNET. This work represents an important step forward in the development of better evaluation methods for QDMR predictions, which will be critical for improving the accuracy and reliability of natural language question-answering systems.
Keywords: NLP, question answering, question decomposition meaning representation, QDMR evaluation metrics
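To make the conversion-to-normal-string idea concrete, the toy sketch below normalizes two QDMR step sequences and scores the overlap of their steps. It illustrates only the general principle, with assumed normalization rules; it is not the paper's Nor-Str algorithm.

```python
import re

def normalize(steps):
    """Lowercase, collapse whitespace, and canonicalize '#i' step references."""
    out = []
    for s in steps:
        s = re.sub(r"\s+", " ", s.strip().lower())
        s = re.sub(r"#\s*(\d+)", r"#\1", s)  # '# 1' -> '#1'
        out.append(s)
    return out

def step_overlap(pred, gold):
    """Fraction of gold steps reproduced exactly after normalization."""
    p, g = normalize(pred), normalize(gold)
    return sum(s in p for s in g) / len(g)

pred = ["return flights", "return #1 from NYC"]
gold = ["return flights", "return # 1  from nyc"]
print(step_overlap(pred, gold))  # 1.0
```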
Procedia PDF Downloads 81
1887 Role of Authorized Agencies to Combat Financial Crime in Bangladesh
Authors: Khan Sarfaraz, Mohammad Ali Mia
Abstract:
Money laundering and other financial crimes have become a global threat in recent years, impacting both developed and developing countries. In developing countries like Bangladesh, it is more difficult to combat financial crime than in developed countries because of the inadequate regulatory environment and vulnerable financial system. Bangladesh's central bank issues guidelines to facilitate the implementation of the money laundering prevention act. According to the guideline of Bangladesh Bank, every financial institution has to develop an anti-money laundering policy to ensure the safety and soundness of the institution. The paper aims to focus on the role of authorized agencies in combating financial crime. In this paper, the latest trends in financial crime are discussed from global and Asian perspectives. The preventive measures for money laundering and other financial crimes are discussed elaborately. Financial crime is a sophisticated and dynamic crime, and criminals continuously adopt innovative methods of using the financial system to launder money. The study takes a step towards pointing out new techniques, effects and challenges of financial crime in Bangladesh.
Keywords: financial crime, illegal money transfer, online gambling, money laundering, authorized agencies
Procedia PDF Downloads 81
1886 Effects of Caprine Arthritis-Encephalitis Virus (CAEV) Infection on the Expression of Cathelicidin Genes in Goat Blood Leukocytes
Authors: Daria Reczynska, Justyna Jarczak, Michal Czopowicz, Danuta Sloniewska, Karina Horbanczuk, Wieslaw Jarmuz, Jaroslaw Kaba, Emilia Bagnicka
Abstract:
Since people, animals and plants are constantly exposed to pathogens, they have developed very complex systems of defense. Among the ca. 1000 antimicrobial peptides from different families identified so far, approximately 30, belonging to the cathelicidin family, can be found in mammals. Cathelicidins probably constitute the first line of defense because they can act at the physiological salt concentration present in healthy tissues; moreover, the low salt concentration present in infected tissues inhibits their activity. In the goat, bactenecin 7.5 (BAC7.5), bactenecin 5 (BAC5), myeloid antimicrobial peptide 28 (MAP28), myeloid antimicrobial peptide 34 (MAP34 A and B) and goat bactenecin 3.4 (ChBac3.4) have been identified. Caprine arthritis-encephalitis (CAE), caused by small ruminant lentivirus (SRLV), is an economic problem. The main CAE symptoms are weight loss, arthritis, pneumonia and mastitis (a significant elevation of the somatic cell count and deterioration of some technological parameters). The study was conducted on 24 dairy goats. The animals were divided into two groups: experimental (SRLV-infected) and control (non-infected). The blood samples were collected five times: on the 1st, 7th, 30th, 90th and 150th day of lactation. The levels of transcripts of the BAC7.5, BAC5, MAP28 and MAP34 genes in blood leucocytes were measured using the qPCR method. There were no differences in the mRNA levels of the studied genes between stages of lactation. Differences were observed in the expression of the BAC5, MAP28 and MAP34 genes, with lower levels in the experimental group. There was no difference in BAC7.5 expression between groups. The decreased levels of transcripts of cathelicidin genes in the blood leucocytes of SRLV-infected goats may indicate disturbances of homeostasis in the organism. It can be concluded that SRLV infection seems to inhibit the expression of cathelicidin genes. The study was financed by a grant from the National Science Centre, No. UMO-2013/09/B/NZ/03514.
Keywords: goat, CAEV, cathelicidins, blood leukocytes, gene expression
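For readers unfamiliar with qPCR readouts, relative transcript levels of this kind are commonly derived with the Livak 2^-ΔΔCt method. The sketch below shows that calculation with invented Ct values; the abstract does not state which quantification model or reference gene the authors used, so both are assumptions here.

```python
def relative_expression(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Livak 2^-ddCt relative quantification of a target transcript.

    ct_target / ct_ref: Ct of the gene of interest and of a reference gene
    in an SRLV-infected sample; *_ctrl: the same Cts in the control group.
    """
    d_ct_sample = ct_target - ct_ref
    d_ct_control = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(d_ct_sample - d_ct_control)

# Hypothetical Ct values: the target transcript appears 4-fold lower
# in the infected group than in controls.
print(relative_expression(26.0, 18.0, 24.0, 18.0))  # 0.25
```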
Procedia PDF Downloads 288
1885 The Application of FSI Techniques in Modeling of Realist Pulmonary Systems
Authors: Abdurrahim Bolukbasi, Hassan Athari, Dogan Ciloglu
Abstract:
Modeling the lung respiratory system, which has complex anatomy and biophysics, presents several challenges, including tissue-driven flow patterns and wall motion. Moreover, because the lungs stretch and recoil with each breath, the pulmonary system does not have static walls and structures. The direct relationship between air flow and tissue motion in the lung structures naturally calls for an FSI simulation technique. Therefore, the development of a coupled FSI computational model is an important step toward the realistic simulation of pulmonary breathing mechanics. A simple but physiologically relevant three-dimensional deep lung geometry is designed, and the fluid-structure interaction (FSI) coupling technique is utilized to simulate the deformation of the lung parenchyma tissue that produces the airflow fields. A real understanding of the respiratory tissue system as a complex phenomenon has been pursued with respect to respiratory patterns, fluid dynamics, tissue visco-elasticity and the tidal breathing period.
Procedia PDF Downloads 329
1884 Environmental Forensic Analysis of the Shoreline Microplastics Debris on the Limbe Coastline, Cameroon
Authors: Ndumbe Eric Esongami, Manga Veronica Ebot, Foba Josepha Tendo, Yengong Fabrice Lamfu, Tiku David Tambe
Abstract:
The prevalence and unpleasant nature of plastics pollution, constantly observed on beach shores during stormy events, has prompted researchers worldwide to work on sustainable economic and environmental designs for plastics, especially in Cameroon, a major touristic destination in the Central Africa Region. The inconsistent protocols developed by researchers have added to this burden; thus the morphological characterization of microplastics for remediation is a call for concern. The prime aim of the study is to morphologically identify, quantify and forensically understand the distribution of each plastic polymer composition. Duplicate 2×2 m (4 m²) quadrats were sampled on each beach per month over an 8-month period across five purposively selected beaches along the Limbe – Idenau coastline, Cameroon. Collected plastic samples were thoroughly washed and separated using a 2 mm sieve. Only particles of size < 2 mm were considered and carried forward through the microplastics laboratory analytical process. Established step-by-step methodological procedures of particle filtration, organic matter digestion, density separation, particle extraction and polymer identification, including microscopy, were applied to the beach microplastics samples. Microplastics were observed in every sample/beach/month, with an overall abundance of 241 particles weighing 89.15 g in total, a mean abundance of 2 particles/m² (0.69 g/m²) and 6 particles/month (2.0 g/m²). The accumulation of beach shoreline MPs rose dramatically with decreasing size, with microbeads and fibers found only in the < 1 mm size fraction. Approximately 75% of beach MPs contamination (by average particle number) was found at the LDB 2, LDB 1 and IDN beaches, while the polymer types most frequently observed were PP, PE and PS across all morphological parameters analysed. Beach MPs accumulation varied significantly both temporally and spatially at p = 0.05. ANOVA and Spearman’s rank correlation showed linear relationships between the size categories considered in this study. In terms of polymer MPs analysis, the colour class showed that white-coloured MPs were dominant, 50 particles (22.25 g), distributed among PP (25), PE (15) and PS (5). The shape class revealed that irregularly shaped MPs were dominant, 98 particles (30.5 g), with higher abundance in PP (39), PE (33) and PS (11). Similarly, the type class showed that fragmented MPs were dominant, 80 particles (25.25 g), with higher abundance in PP (30), PE (28) and PS (15). The size class further revealed that MPs in the 1.5 – 1.99 mm size range had the highest abundance, 102 particles (51.77 g), concentrated in PP (47), PE (41) and PS (7), and finally, the weight class showed that 0.01 g MPs were dominant, 98 particles (56.57 g), with varied numeric abundance in PP (49), PE (29) and PS (13). The forensic investigation of the pollution indicated that the majority of the beach microplastics are sourced from the site or nearby area. The investigation could draw useful conclusions regarding the pathways of pollution. The fragmented microplastics, a significant component of the samples, were found to be sourced from recreational activities and partly from fishing boat installation and repair activities carried out close to the shore.
Keywords: forensic analysis, beach MPs, particle/number, polymer composition, Cameroon
Procedia PDF Downloads 89
1883 Utility of Thromboelastography Derived Maximum Amplitude and R-Time (MA-R) Ratio as a Predictor of Mortality in Trauma Patients
Authors: Arulselvi Subramanian, Albert Venencia, Sanjeev Bhoi
Abstract:
Coagulopathy of trauma is an early endogenous coagulation abnormality that occurs shortly after injury, resulting in high mortality. In emergency trauma situations, viscoelastic tests may be better at identifying the various phenotypes of coagulopathy and demonstrate the contribution of platelet function to coagulation. We aimed to determine thrombin generation and clot strength by estimating a ratio of maximum amplitude to R-time (MA-R ratio) for identifying trauma coagulopathy and predicting subsequent mortality. Methods: We conducted a prospective cohort analysis of acutely injured trauma patients in the adult age group (18-50 years), admitted within 24 hrs of injury, over one year at a Level I trauma center, with follow-up on the 3rd and 5th day of injury. Patients with a history of coagulation abnormalities, liver disease or renal impairment, or with a history of drug intake, were excluded. Thromboelastography was done, and a ratio was calculated by dividing the MA by the R-time (MA-R). Patients were further stratified into subgroups based on the calculated MA-R quartiles. The first sampling was done within 24 hours of injury, with follow-up on the 3rd and 5th day of injury. Mortality was the primary outcome. Results: 100 acutely injured patients [average, 36.6±14.3 years; 94% male; injury severity score 12.2 (9-32)] were included in the study. The median (min-max) on-admission MA-R ratio was 15.01 (0.4-88.4), which declined to 11.7 (2.2-61.8) on day three and rose slightly to 13.1 (0.06-68) on day 5. There were no significant differences between subgroups in regard to age or gender. In the lowest MA-R ratio subgroup, MA-R1 (<8.90; n = 27), the injury severity score was significantly elevated. MA-R2 (8.91-15.0; n = 23), MA-R3 (15.01-19.30; n = 24) and MA-R4 (>19.3; n = 26) showed no difference between their admission laboratory investigations; however, a slight decline was observed in hemoglobin, red blood cell count and platelet count compared to the other subgroups. A significantly prolonged R-time and a shortened alpha angle and MA were also seen in MA-R1. An elevated incidence of mortality also significantly correlated with low on-admission MA-R ratios (p = 0.003). Temporal changes in the MA-R ratio did not correlate with mortality. Conclusion: The MA-R ratio provides a snapshot of early clot function, focusing specifically on thrombin burst and clot strength. In our observation, patients with the lowest MA-R ratio (MA-R1) had significantly increased mortality compared with all other groups (45.5% in MA-R1 compared with <25% in MA-R2 to MA-R3, and 9.1% in MA-R4; p < 0.003). The MA-R ratio may prove highly useful to identify at-risk patients early, when other physiologic indicators are absent.
Keywords: coagulopathy, trauma, thromboelastography, mortality
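The MA-R ratio itself is a simple derived quantity, MA (mm) divided by R-time (min), followed by quartile stratification. A minimal sketch with invented TEG readings (the study's raw data are not public) is:

```python
import numpy as np

def ma_r_ratio(ma_mm, r_time_min):
    """MA-R ratio: TEG maximum amplitude (mm) divided by R time (min)."""
    return ma_mm / r_time_min

# Hypothetical (MA, R) pairs for five patients.
ratios = np.array([ma_r_ratio(ma, r) for ma, r in
                   [(60, 9.0), (55, 4.0), (65, 3.5), (62, 2.1), (58, 6.5)]])

# Stratify into quartile subgroups MA-R1 (lowest) .. MA-R4 (highest).
quartiles = np.quantile(ratios, [0.25, 0.5, 0.75])
subgroup = 1 + np.searchsorted(quartiles, ratios)
print(list(zip(np.round(ratios, 1), subgroup.tolist())))
```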
Procedia PDF Downloads 177
1882 QSAR and Anti-Depressant Studies of Some Novel Phenothiazine Derivatives
Authors: D. L. Tambe, S. Dighe Nachiket
Abstract:
Objective: Depression is a common but serious illness, and phenothiazine derivatives show a prominent effect against depression; hence this work was undertaken to validate this use scientifically. Material and Methods: Phenothiazine derivatives are synthesized by the substitution of various groups, but the basic scheme of synthesis starts with the synthesis of 4-(cyclohexylidene)benzoic acid using PABA. The final product, 3-(10H-phenothiazin-2-yl)-N,5-diphenyl-4H-1,2,4-triazol-4-amine, is then obtained in six further steps. The antidepressant activity of all the synthesized compounds was evaluated by the despair swim test using Sprague Dawley rats. The standard drug imipramine was used as the control. In the despair swim test, all the synthesized derivatives showed antidepressant activity. Results: Among all the phenothiazine derivatives, four compounds (6.6-7.2 (14H phenyl), 9.43 (1H OH), 8.50 (1H NH phenothiazine); 6.85-8.21 (14H phenyl), 8.50 (1H NH phenothiazine), 11.82 (1H OH); 6.6-7.2 (8H phenyl), 9.43 (1H OH), 8.50 (1H NH phenothiazine), 4.2 (1H NH); and 6.85-8.21 (8H phenyl), 8.50 (1H NH phenothiazine), 3.9 (1H NH), 11.82 (1H OH)) showed significant antidepressant activity compared with the control drug imipramine. Conclusion: Several novel phenothiazine derivatives show potent antidepressant activity and may play a beneficial role in human health for the treatment of depression.
Keywords: antidepressant activities, despair swim test, phenothiazine, Sprague Dawley rats
Procedia PDF Downloads 385
1881 A New Approach towards the Development of Next Generation CNC
Authors: Yusri Yusof, Kamran Latif
Abstract:
The Computer Numerical Control (CNC) machine has been widely used in industry since its inception. Currently, CNC technology is used for various operations like milling, drilling, packing, welding, etc. With the rapid growth in the manufacturing world, the demand for flexibility in CNC machines has rapidly increased. Previously, commercial CNCs failed to provide flexibility because of their closed structure, which does not provide access to the inner features of the CNC; the CNC's ISO data interface model was also found to be limited. Therefore, to overcome these problems, Open Architecture Control (OAC) technology and the STEP-NC data interface model were introduced. At present, the Personal Computer (PC) is the best platform for the development of open CNC systems. In this paper, ISO data interface model interpretation, verification and execution are highlighted with the introduction of new techniques. The proposed system is composed of ISO data interpretation, 3D simulation and machine motion control modules. The system was tested on an old 3-axis CNC milling machine. The results are found to be satisfactory in performance. This implementation has successfully enabled a sustainable manufacturing environment.
Keywords: CNC, ISO 6983, ISO 14649, LabVIEW, open architecture control, reconfigurable manufacturing systems, sustainable manufacturing, Soft-CNC
Procedia PDF Downloads 519
1880 Association between Noise Levels, Particulate Matter Concentrations and Traffic Intensities in a Near-Highway Urban Area
Authors: Mohammad Javad Afroughi, Vahid Hosseini, Jason S. Olfert
Abstract:
Both traffic-generated particles and noise have been associated with the development of cardiovascular diseases, especially in near-highway environments. Although noise and particulate matter (PM) have different mechanisms of dispersion, sharing the same emission source in urban areas (road traffic) can result in a similar degree of variability in their levels. This study investigated the temporal variation of, and correlation between, noise levels, PM concentrations and traffic intensities near a major highway in Tehran, Iran. Tehran particulate concentration is highly influenced by road traffic. Additionally, Tehran ultrafine particles (UFP, PM < 0.1 µm) are mostly emitted from combustion processes of motor vehicles. This gives a high possibility of a strong association between traffic-related noise and UFP in near-highway environments of this megacity. The hourly average of the equivalent continuous sound pressure level (Leq), the total number concentration of UFPs, the mass concentrations of PM2.5 and PM10, as well as traffic count and speed, were simultaneously measured over a period of three days in winter. Additionally, meteorological data including temperature, relative humidity, wind speed and direction were collected in a weather station located 3 km from the monitoring site. Noise levels showed relatively low temporal variability in the near-highway environment compared to PM concentrations. The hourly average of Leq ranged from 63.8 to 69.9 dB(A) (mean ~ 68 dB(A)), while hourly concentrations of particles varied from 30,800 to 108,800 cm-3 for UFP (mean ~ 64,500 cm-3), 41 to 75 µg m-3 for PM2.5 (mean ~ 53 µg m-3), and 62 to 112 µg m-3 for PM10 (mean ~ 88 µg m-3). The Pearson correlation coefficient revealed a strong relationship between noise and UFP (r ~ 0.61) overall. Under downwind conditions, UFP number concentration showed the strongest association with noise level (r ~ 0.63). The coefficient was markedly lower under upwind conditions (r ~ 0.24) due to the significant role of wind and humidity in UFP dynamics. Furthermore, PM2.5 and PM10 correlated moderately with noise (r ~ 0.52 and 0.44, respectively). In general, traffic counts were more strongly associated with noise and PM than traffic speeds. It was concluded that noise level combined with meteorological data can be used as a proxy to estimate PM concentrations (specifically UFP number concentration) in near-highway environments of Tehran. However, it is important to measure the joint variability of noise and particles to study their health effects in epidemiological studies.
Keywords: noise, particulate matter, PM10, PM2.5, ultrafine particle
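The noise-PM association reported above rests on Pearson correlations between hourly averaged series. A minimal sketch of that computation, using invented hourly pairs rather than the campaign's data, is:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical hourly pairs of Leq (dB(A)) and UFP number counts (cm^-3).
leq = np.array([64.1, 65.8, 67.2, 68.4, 69.0, 69.9, 68.7, 66.3])
ufp = np.array([32_000, 45_500, 58_000, 71_000, 90_500, 108_000, 83_000, 50_000])

r, p = pearsonr(leq, ufp)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```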
Procedia PDF Downloads 195
1879 Photocatalytic Packed‐Bed Flow Reactor for Continuous Room‐Temperature Hydrogen Release from Liquid Organic Carriers
Authors: Malek Y. S. Ibrahim, Jeffrey A. Bennett, Milad Abolhasani
Abstract:
Despite the potential of hydrogen (H2) storage in liquid organic carriers to achieve carbon neutrality, the energy required for H2 release and the cost of catalyst recycling have hindered its large-scale adoption. In response, a photo flow reactor packed with a rhodium (Rh)/titania (TiO2) photocatalyst is reported for the continuous and selective acceptorless dehydrogenation of 1,2,3,4-tetrahydroquinoline to H2 gas and quinoline under visible light irradiation at room temperature. The tradeoff between the reactor pressure drop and its photocatalytic surface area was resolved by selective in-situ photodeposition of Rh in the photo flow reactor, post-packing, on the outer surface of the TiO2 microparticles available to the photon flux, thereby reducing the optimal Rh loading by 10 times compared to a batch reactor while facilitating catalyst reuse and regeneration. An example of using quinoline as a hydrogen acceptor to lower the energy of the hydrogen production step is demonstrated via the water-gas shift reaction.
Keywords: hydrogen storage, flow chemistry, photocatalysis, solar hydrogen
Procedia PDF Downloads 102
1878 Surface Hole Defect Detection of Rolled Sheets Based on Pixel Classification Approach
Authors: Samira Taleb, Sakina Aoun, Slimane Ziani, Zoheir Mentouri, Adel Boudiaf
Abstract:
Rolling is a pressure treatment technique that modifies the shape of steel ingots or billets between rotating rollers. During this process, defects may form on the surface of the rolled sheets and are likely to affect the performance and quality of the finished product. In our study, we developed a method for detecting surface hole defects using a pixel classification approach. This work includes several steps. First, we performed image preprocessing to delimit areas with and without hole defects in the sheet image. Then, we built the histogram of each area to generate the gray-level membership intervals of the pixels that characterize it. As we noticed an intersection between the gray-level intervals of the two areas, we finally performed a learning step, based on a series of detection tests, to refine the membership intervals of each area and to choose the defect detection criterion that optimizes the recognition of surface holes.
Keywords: classification, defect, surface, detection, hole
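A minimal sketch of the interval-based pixel classification described above follows; the quantile trimming, the handling of the overlap region, and the training data are assumptions, since the paper's exact learning procedure is only summarized in the abstract.

```python
import numpy as np

def learn_interval(pixels, low_q=0.01, high_q=0.99):
    """Gray-level membership interval of a labelled area (quantile-trimmed)."""
    return np.quantile(pixels, [low_q, high_q])

def classify(gray, defect_iv, sound_iv):
    """Label pixels: 1 = hole defect, 0 = sound surface, -1 = ambiguous overlap."""
    in_defect = (gray >= defect_iv[0]) & (gray <= defect_iv[1])
    in_sound = (gray >= sound_iv[0]) & (gray <= sound_iv[1])
    labels = np.full(gray.shape, -1, dtype=np.int8)
    labels[in_defect & ~in_sound] = 1
    labels[in_sound & ~in_defect] = 0
    return labels

# Hypothetical training pixels sampled from the delimited defect / defect-free areas.
defect_iv = learn_interval(np.random.default_rng(1).normal(40, 8, 5000))
sound_iv = learn_interval(np.random.default_rng(2).normal(150, 20, 5000))
```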
Procedia PDF Downloads 30
1877 Study of the Best Algorithm to Estimate Sunshine Duration from Global Radiation on Horizontal Surface for Tropical Region
Authors: Tovondahiniriko Fanjirindratovo, Olga Ramiarinjanahary, Paulisimone Rasoavonjy
Abstract:
The sunshine duration, which is the sum of all the moments when the solar beam radiation is above a minimal value, is an important parameter for climatology, tourism, agriculture and solar energy. It is usually measured by a pyrheliometer installed on a two-axis solar tracker. Due to the high cost of this device, and given the availability of global radiation measurements on a horizontal surface, several studies have been done to establish a correlation between global radiation and sunshine duration. Most of these studies are fitted for the northern hemisphere using a pyrheliometric database. The aim of the present work is to list and assess all the existing methods and apply them to Reunion Island, a tropical region in the southern hemisphere. Using a database of ten years, global, diffuse and beam radiation on a horizontal surface are employed in order to evaluate the uncertainty of the existing algorithms for a tropical region. The methodology is based on indirect comparison, because the solar beam radiation is not measured directly but calculated from the beam radiation on a horizontal surface and the sun elevation angle.
Keywords: Carpentras method, data fitting, global radiation, sunshine duration, Slob and Monna algorithm, step algorithm
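As a reference point for the definition used above, sunshine duration can be computed directly when the beam component is known. The sketch below recovers direct normal irradiance from horizontal beam radiation and the sun elevation angle, then sums the time above a threshold; 120 W/m² is the WMO convention, and the paper's minimal value may differ.

```python
import numpy as np

def sunshine_duration(beam_horizontal, sun_elevation_deg,
                      step_hours=1 / 60, threshold=120.0):
    """Sum the time when direct normal irradiance (DNI) exceeds a threshold.

    DNI is recovered from beam radiation on a horizontal surface Bh via the
    sun elevation angle h: DNI = Bh / sin(h). Returns hours of sunshine.
    """
    h = np.radians(np.asarray(sun_elevation_deg, float))
    bh = np.asarray(beam_horizontal, float)
    dni = np.where(h > 0, bh / np.maximum(np.sin(h), 1e-6), 0.0)
    return np.count_nonzero(dni > threshold) * step_hours
```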
Procedia PDF Downloads 130
1876 Investigation on the Kinetic Mechanism of the Reduction of Fe₂O₃/CoO-Decorated Carbon Xerogel
Authors: Mohammad Reza Ghaani, Michele Catti
Abstract:
The reduction of CoO/Fe₂O₃ oxides supported on carbon xerogels was studied to elucidate the effect of the nano-size distribution of the catalyst in carbon matrices. Resorcinol-formaldehyde xerogels were synthesized, impregnated with iron and cobalt nitrates, and subsequently heated to obtain the oxides. The mechanism of oxide reduction to metal was investigated by in-situ synchrotron X-ray diffraction in dynamic, non-isothermal conditions. Kinetic profiles of the reactions were obtained by plotting the diffraction intensities of selected Bragg peaks vs. temperature. The extracted Temperature-Programmed-Reduction (TPR) diagrams were analyzed with appropriate kinetic models, the best results being obtained with the Avrami-Erofeev model for all reduction reactions considered. The activation energies for the two-step reduction of iron oxide were 65 and 37 kJ mol⁻¹, respectively. The average value for the reduction of CoO to Co was found to be around 21 kJ mol⁻¹. Such results may contribute to developing efficient and inexpensive non-noble-metal-based catalysts in element form, e.g., Fe, Co, via heterogenization of metal complexes on mesoporous supports.
Keywords: non-isothermal kinetics, carbon aerogel, in-situ synchrotron X-ray diffraction, reduction mechanisms
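For illustration, the Avrami-Erofeev conversion law can be fitted to a digitized kinetic profile as sketched below. The isothermal closed form α(t) = 1 − exp(−(kt)ⁿ) is used here for simplicity, and the data points are invented; the study itself analyzes non-isothermal TPR profiles.

```python
import numpy as np
from scipy.optimize import curve_fit

def avrami_erofeev(t, k, n):
    """Avrami-Erofeev (JMA) conversion: alpha(t) = 1 - exp(-(k t)^n)."""
    return 1.0 - np.exp(-((k * t) ** n))

# Hypothetical conversion-vs-time points digitized from a kinetic profile.
t = np.array([0.5, 1, 2, 3, 4, 6, 8, 10])
alpha = np.array([0.02, 0.08, 0.28, 0.50, 0.68, 0.88, 0.96, 0.99])

(k, n), _ = curve_fit(avrami_erofeev, t, alpha, p0=(0.3, 2.0))
print(f"k = {k:.3f}, n = {n:.2f}")
```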
Procedia PDF Downloads 249
1875 Environmentally Sustainable Transparent Wood: A Fully Green Approach from Bleaching to Impregnation for Energy-Efficient Engineered Wood Components
Authors: Francesca Gullo, Paola Palmero, Massimo Messori
Abstract:
Transparent wood is considered a promising structural material for the development of environmentally friendly, energy-efficient engineered components. To obtain transparent wood from natural wood materials, two approaches can be used: i) bottom-up and ii) top-down. In the second method, the color of natural wood samples is lightened through a chemical bleaching process that acts on the chromophore groups of lignin, such as the benzene ring, quinonoid, vinyl, phenolic, and carbonyl groups. These chromophoric units form complex conjugated systems responsible for the brown color of wood. There are two strategies to remove color and increase the whiteness of wood: i) lignin removal and ii) lignin bleaching. In the lignin removal strategy, strong chemicals containing chlorine (chlorine, hypochlorite, and chlorine dioxide) and oxidizers (oxygen, ozone, and peroxide) are used to completely destroy and dissolve the lignin. In lignin bleaching methods, a moderate reductive (hydrosulfite) or oxidative (hydrogen peroxide) agent is commonly used to alter or remove the chromophore groups and systems of lignin, selectively discoloring the lignin while keeping the macrostructure intact. It is, therefore, essential to manipulate nanostructured wood by precisely controlling the nanopores in the cell walls, monitoring both the chemical treatments and the process conditions, for instance, the treatment time, the concentration of the chemical solutions, the pH value, and the temperature. The elimination of wood light scattering is the second step in the fabrication of transparent wood materials, which can be achieved through two approaches: i) the polymer impregnation method and ii) the densification method. In the polymer impregnation method, the wood scaffold is treated under vacuum with polymers having a matching refractive index (e.g., PMMA and epoxy resins) to obtain the transparent composite material, which can finally be pressed to align the cellulose fibers and reduce interfacial defects in order to have a finished product with high transmittance (>90%) and excellent light-guiding. However, both the solution-based bleaching and the impregnation processes used to produce transparent wood generally consume large amounts of energy and chemicals, including some toxic or polluting agents, and are difficult to scale up industrially. Here, we report a method to produce optically transparent wood by modifying the lignin structure with a chemical reaction at room temperature using small amounts of hydrogen peroxide in an alkaline environment. This method preserves the lignin, which is only deconjugated and acts as a binder, providing both a strong wood scaffold and suitable porosity for the infiltration of biobased polymers, while reducing chemical consumption, the toxicity of the reagents used, polluting waste, petroleum by-products, energy and processing time. The resulting transparent wood demonstrates high transmittance and low thermal conductivity. Through the combination of process efficiency and scalability, the obtained materials are promising candidates for application in the field of construction for modern energy-efficient buildings.
Keywords: bleached wood, energy-efficient components, hydrogen peroxide, transparent wood, wood composites
Procedia PDF Downloads 59
1874 Models Comparison for Solar Radiation
Authors: Djelloul Benatiallah
Abstract:
Due to current high consumption and recent industrial growth, reserves of fossil and natural energy resources like oil, gas, and uranium are declining. Due to pollution and climate change, there needs to be a swift switch to renewable energy sources. Research on renewable energy is being done to meet energy needs. Solar energy is one of the renewable resources that can currently help meet the world's energy needs. In most parts of the world, solar energy is a free and unlimited resource that can be used in a variety of ways, including photovoltaic systems for the generation of electricity and thermal systems for the generation of heat, for example, the residential sector's production of hot water. In this article, we conduct a comparison. The first step entails identifying two empirical models that enable us to estimate the daily irradiation on a horizontal plane; we then compare them using data obtained from measurements made at the Adrar site over the four distinct seasons. A comparison of the results obtained by simulating the two models shows that model 2 provides a better estimate of the global solar components, with a mean absolute error of less than 7% and a correlation coefficient of more than 0.95, as well as a relative mean bias error of less than 6% in absolute value and a relative RMSE of less than 10%.
Keywords: solar radiation, renewable energy, fossil, photovoltaic systems
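The statistical indicators quoted above can be computed as sketched below; conventional definitions of the relative bias, RMSE, and mean absolute error are assumed, since the article's exact formulas are not given in the abstract.

```python
import numpy as np

def model_errors(measured, estimated):
    """Relative error statistics for comparing estimated vs. measured irradiation."""
    measured = np.asarray(measured, float)
    estimated = np.asarray(estimated, float)
    diff = estimated - measured
    mean = measured.mean()
    return {
        "rMBE %": 100 * diff.mean() / mean,                   # relative mean bias error
        "rRMSE %": 100 * np.sqrt((diff ** 2).mean()) / mean,  # relative RMSE
        "rMAE %": 100 * np.abs(diff).mean() / mean,           # relative mean absolute error
        "r": np.corrcoef(measured, estimated)[0, 1],          # correlation coefficient
    }
```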
Procedia PDF Downloads 83
1873 Nanotechnology-Based Treatment of Liver Cancer
Authors: Lucian Mocan
Abstract:
We present a method of nanoparticle-enhanced laser thermal ablation of HepG2 cells (a human hepatocellular carcinoma cell line) using gold nanoparticles combined with a specific growth factor, and demonstrate its selective therapeutic efficacy using ex vivo specimens. Ex vivo-perfused liver specimens were obtained from hepatocellular carcinoma patients similarly to the surgical technique of transplantation. Antibody (Ab) bound to GNPs was inoculated intra-arterially into the resulting specimen, and the specific delivery of the nano-bioconjugate into the malignant tissue by means of the capillary bed was determined. The extent of necrosis was considerable following laser therapy, while the surrounding parenchyma was not seriously affected. The selective photothermal ablation of the malignant liver tissue was obtained after the selective accumulation of Ab-bound GNPs in tumor cells following ex vivo intravascular perfusion. These unique results may represent a major step in liver cancer treatment using nano-localized thermal ablation by laser heating.
Keywords: HepG2 cells, gold nanoparticles, nanoparticle functionalization, laser irradiation
Procedia PDF Downloads 371
1872 A Tool for Assessing Performance and Structural Quality of Business Process
Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri
Abstract:
Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be efficient in such tasks, a business process model (BPM) must have high structural quality and high performance. Evaluating the performance of a business process model is a necessary step to reduce time and cost, while assessing the structural quality aims to improve the understandability and the modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of measures, we propose a framework that integrates both structural and performance aspects for classifying them. Our measure classification is based on business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and the elements (activity, event, actor, etc.) involved in computing the measures. We then implement this framework in a tool assessing the structural quality and the performance of a business process. The tool helps designers select an appropriate subset of measures associated with the corresponding perspective and calculate and interpret their values in order to improve the structural quality and the performance of the model.
Keywords: performance, structural quality, perspectives, tool, classification framework, measures
Procedia PDF Downloads 160
1871 High-Throughput, Purification-Free, Multiplexed Profiling of Circulating miRNA for Discovery, Validation, and Diagnostics
Authors: J. Hidalgo de Quintana, I. Stoner, M. Tackett, G. Doran, C. Rafferty, A. Windemuth, J. Tytell, D. Pregibon
Abstract:
We have developed the Multiplexed Circulating microRNA assay, which allows the detection of up to 68 microRNA targets per sample. The assay combines particle-based multiplexing, using patented Firefly hydrogel particles, with single-step RT-PCR signal amplification. Thus, the Circulating microRNA assay leverages PCR sensitivity while eliminating the need for separate reverse transcription reactions and mitigating amplification biases introduced by target-specific qPCR. Furthermore, the ability to multiplex targets in each well eliminates the need to split valuable samples into multiple reactions. Results from the Circulating microRNA assay are interpreted using the Firefly Analysis Workbench, which allows visualization, normalization, and export of experimental data. To aid discovery and validation of biomarkers, we have generated fixed panels for oncology, cardiology, neurology, immunology, and liver toxicology. Here we present data from several studies investigating circulating and tumor microRNA, showcasing the ability of the technology to sensitively and specifically detect microRNA biomarker signatures from fluid specimens.
Keywords: biomarkers, biofluids, miRNA, photolithography, flow cytometry
Procedia PDF Downloads 372
1870 Seismic Base Shear Force Depending on Building Fundamental Period and Site Conditions: Deterministic Formulation and Probabilistic Analysis
Authors: S. Dorbani, M. Badaoui, D. Benouar
Abstract:
The aim of this paper is to investigate the effect of the building fundamental period for reinforced concrete buildings of 6, 9, and 12 storeys, with different floor plans: symmetric, mono-symmetric, and unsymmetric. These structures are erected at different epicentral distances. Using the Boumerdes, Algeria (2003) earthquake data, we focused primarily on establishing a deterministic formulation linking the base shear force to two parameters: the first is the fundamental period, which represents the numerical fingerprint of the structure, and the second is the epicentral distance, used to represent the impact of the earthquake on this force. In a second step, with a view to highlighting the effect of uncertainty in these parameters on the analyzed response, these parameters are modeled as random variables with a log-normal distribution. Varying the coefficients of variation of the chosen uncertain parameters showed that the effect of the fundamental period uncertainty on the seismic base shear force statistics is low compared to the influence of the epicentral distance uncertainty.
Keywords: base shear force, fundamental period, epicentral distance, uncertainty, lognormal variables, statistics
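The propagation of log-normal uncertainty through a deterministic base-shear law is straightforward to reproduce by Monte Carlo simulation. In the sketch below, the power-law form and all numerical values are hypothetical stand-ins, since the paper's fitted formulation is not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

def lognormal(mean, cov, size):
    """Sample a log-normal variable from its mean and coefficient of variation."""
    sigma2 = np.log(1.0 + cov ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

# Hypothetical deterministic law V = a * T^b * R^c linking base shear V to
# fundamental period T and epicentral distance R.
a, b, c = 1.0e3, -0.8, -1.2
T = lognormal(mean=0.6, cov=0.10, size=100_000)   # fundamental period (s)
R = lognormal(mean=30.0, cov=0.30, size=100_000)  # epicentral distance (km)
V = a * T ** b * R ** c
print(f"mean V = {V.mean():.1f}, cov of V = {V.std() / V.mean():.2f}")
```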
Procedia PDF Downloads 322
1869 Switched System Diagnosis Based on Intelligent State Filtering with Unknown Models
Authors: Nada Slimane, Foued Theljani, Faouzi Bouani
Abstract:
The paper addresses the problem of fault diagnosis for systems operating in several modes (normal or faulty) based on state assessment. We use, for this purpose, a methodology consisting of three main processes: 1) sequential data clustering, 2) linear model regression and 3) state filtering. Typically, the Kalman Filter (KF) is an algorithm that provides estimation of unknown states using a sequence of I/O measurements. Although it is an efficient technique for state estimation, it presents two main weaknesses. First, it merely predicts states without being able to isolate/classify them according to their different operating modes, whether normal or faulty. To deal with this dilemma, the KF is endowed with an extra clustering step based on a sequential version of the k-means algorithm. Second, to provide state estimation, the KF requires state space models, which can be unknown. A linear regularized regression is used to identify the required models. To prove its effectiveness, the proposed approach is assessed on a simulated benchmark.
Keywords: clustering, diagnosis, Kalman filtering, k-means, regularized regression
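The sequential clustering step grafted onto the filter can be pictured with a generic online k-means update, sketched below; this is a textbook sequential k-means, not necessarily the authors' exact variant.

```python
import numpy as np

def sequential_kmeans(stream, k, rng=np.random.default_rng(0)):
    """Online k-means: centroids updated one sample at a time, so the
    clustering can run alongside the state filter on streaming data."""
    centroids, counts, labels = None, None, []
    for x in stream:
        x = np.asarray(x, float)
        if centroids is None:  # seed centroids near the first sample
            centroids = np.tile(x, (k, 1)) + 1e-3 * rng.standard_normal((k, x.size))
            counts = np.zeros(k)
        j = int(np.argmin(np.linalg.norm(centroids - x, axis=1)))
        counts[j] += 1
        centroids[j] += (x - centroids[j]) / counts[j]  # running-mean update
        labels.append(j)
    return np.asarray(labels), centroids
```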
Procedia PDF Downloads 185
1868 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement
Authors: Shibo Wei, Ting Jiang
Abstract:
Convolutional-recurrent neural networks (CRN) have achieved much success recently in the speech enhancement field. The common processing method is to use convolution layers to compress the feature space through multiple downsampling steps and then model the compressed features with an LSTM layer. At last, the enhanced speech is obtained by deconvolution operations that reintegrate the global information of the speech sequence. However, the feature space compression process may cause a loss of information, so we propose to model the downsampling result of each step with a residual LSTM layer, then join it with the output of the corresponding deconvolution layer and input them to the next deconvolution layer; in this way, we aim to integrate the global information of the speech sequence better. The experimental results show that the network model we introduce (RES-CRN) can achieve better performance than simply using LSTM without residual connections or stacked LSTM in the original CRN, in terms of scale-invariant signal-to-distortion ratio (SI-SNR), speech quality (PESQ), and intelligibility (STOI).
Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR
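A minimal PyTorch sketch of the proposed skip path follows: the downsampling (encoder) output at one level passes through a residual LSTM before being joined with the matching decoder feature. All tensor sizes are illustrative, as the abstract does not specify the architecture's dimensions.

```python
import torch
import torch.nn as nn

class ResidualLSTM(nn.Module):
    """LSTM whose output is added back to its input (residual connection)."""
    def __init__(self, feat):
        super().__init__()
        self.lstm = nn.LSTM(feat, feat, batch_first=True)

    def forward(self, x):      # x: (batch, time, feat)
        y, _ = self.lstm(x)
        return x + y           # residual sum

# One encoder/decoder level: model the downsampling output with a residual
# LSTM, then concatenate with the decoder feature before the next deconvolution.
enc_out = torch.randn(4, 100, 64)   # encoder (downsampling) output
dec_in = torch.randn(4, 100, 64)    # matching decoder feature
skip = ResidualLSTM(64)(enc_out)
next_dec_in = torch.cat([dec_in, skip], dim=-1)  # shape (4, 100, 128)
```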
Procedia PDF Downloads 206
1867 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. Describing the ‘what’ of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements. This process helps reduce software errors at the early stages of software development. The importance of each of the steps in requirements engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations from the new system. This paper elicits inadequate requirements engineering practice as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 411