Search results for: transit signal priority
1293 Resource Allocation and Task Scheduling with Skill Level and Time Bound Constraints
Authors: Salam Saudagar, Ankit Kamboj, Niraj Mohan, Satgounda Patil, Nilesh Powar
Abstract:
Task Assignment and Scheduling is a challenging Operations Research problem when there is a limited number of resources and a comparatively higher number of tasks. The Cost Management team at Cummins needs to assign tasks based on deadlines and must prioritize some of the tasks as per business requirements. Moreover, there is a constraint on the resources: tasks should be assigned based on an individual's skill level, which may vary for different tasks. Another constraint is that the scheduled tasks should be evenly distributed in terms of the number of working hours, which adds further complexity to this problem. The proposed greedy approach to the assignment and scheduling problem first assigns tasks based on management priority and then by the closest deadline. This is followed by an iterative selection of the available resource with the least allocated total working hours for each task, i.e., finding the local optimal choice for each task with the goal of determining the global optimum. The greedy task allocation is compared with a variant of the Hungarian algorithm, and it is observed that the proposed approach gives an equal allocation of working hours among the resources. A comparative study of the proposed approach is also done with manual task allocation, and it is noted that the visibility of the task timeline has increased from 2 months to 6 months. An interactive dashboard app is created for the greedy assignment and scheduling approach, and the tasks with more than a 2-month horizon that were initially waiting in a queue without a delivery date are now analyzed effectively by the business, with expected timelines for completion.
Keywords: assignment, deadline, greedy approach, Hungarian algorithm, operations research, scheduling
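A minimal sketch of the greedy heuristic described in the abstract, assuming plain dictionaries for tasks (priority, deadline, required skill) and resources (skill level, accumulated hours); the field names and the fixed hours-per-task figure are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: greedy assignment by priority, then deadline, then least-loaded capable resource.
# Task/resource field names and the fixed 8-hour task duration are illustrative assumptions.

def greedy_assign(tasks, resources, hours_per_task=8):
    """tasks: dicts with 'id', 'priority' (lower = more urgent), 'deadline' (ISO date), 'skill_needed'.
    resources: dicts with 'id', 'skill', 'hours' (total allocated working hours)."""
    schedule = []
    # Sort by management priority first, then by the closest deadline.
    for task in sorted(tasks, key=lambda t: (t["priority"], t["deadline"])):
        # Keep only resources whose skill level meets the task requirement.
        capable = [r for r in resources if r["skill"] >= task["skill_needed"]]
        if not capable:
            schedule.append((task["id"], None))   # no feasible resource
            continue
        # Local optimal choice: the capable resource with the least allocated hours.
        chosen = min(capable, key=lambda r: r["hours"])
        chosen["hours"] += hours_per_task
        schedule.append((task["id"], chosen["id"]))
    return schedule

tasks = [
    {"id": "T1", "priority": 1, "deadline": "2024-03-01", "skill_needed": 2},
    {"id": "T2", "priority": 2, "deadline": "2024-02-15", "skill_needed": 1},
    {"id": "T3", "priority": 1, "deadline": "2024-02-20", "skill_needed": 3},
]
resources = [
    {"id": "R1", "skill": 3, "hours": 0},
    {"id": "R2", "skill": 2, "hours": 0},
]
print(greedy_assign(tasks, resources))
```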
Procedia PDF Downloads 147
1292 Auto Classification of Multiple ECG Arrhythmic Detection via Machine Learning Techniques: A Review
Authors: Ng Liang Shen, Hau Yuan Wen
Abstract:
Arrhythmia analysis of the ECG signal plays a major role in diagnosing most cardiac diseases. A single electrocardiographic (ECG) record can contain multiple arrhythmia patterns, and supervised machine learning algorithms can be used to classify each ECG beat accordingly. Researchers have used different features and classification methods to classify different arrhythmia types. A major problem in these studies is the fact that the symptoms of the disease do not show all the time in the ECG record; hence, a successful diagnosis might require the manual investigation of several hours of ECG records. This paper reviews investigations of cardiovascular ailments in electrocardiogram (ECG) signals for cardiac arrhythmia, examining irregular waveforms beat by beat and matching them to the corresponding arrhythmia using machine learning pattern recognition.
Keywords: electrocardiogram, ECG, classification, machine learning, pattern recognition, detection, QRS
Procedia PDF Downloads 376
1291 Mathematical Based Forecasting of Heart Attack
Authors: Razieh Khalafi
Abstract:
Myocardial infarction (MI) or acute myocardial infarction (AMI), commonly known as a heart attack, occurs when blood flow stops to part of the heart, causing damage to the heart muscle. An ECG can often show evidence of a previous heart attack or one that is in progress. The patterns on the ECG may indicate which part of the heart has been damaged, as well as the extent of the damage. In chaos theory, the correlation dimension is a measure of the dimensionality of the space occupied by a set of random points, often referred to as a type of fractal dimension. In this research, by considering the ECG signal as a random walk, we work on forecasting the oncoming heart attack by analyzing the ECG signals using the correlation dimension. In order to test the model, a set of ECG signals for patients before and after heart attack was used, and the strength of the model for forecasting the behavior of these signals was checked. Results show that this methodology can forecast the ECG, and accordingly the heart attack, with high accuracy.
Keywords: heart attack, ECG, random walk, correlation dimension, forecasting
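A minimal sketch of the correlation-sum estimate underlying the correlation dimension (Grassberger-Procaccia style), assuming a delay-embedded one-dimensional signal; the embedding dimension, delay, radius grid, and the synthetic random-walk stand-in for an ECG trace are illustrative assumptions, not the settings used in the study.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(x, emb_dim=4, delay=5, n_radii=12):
    """Estimate the correlation dimension of a 1-D signal via the correlation sum C(r)."""
    n = len(x) - (emb_dim - 1) * delay
    # Delay embedding: each row is a reconstructed state vector.
    vectors = np.column_stack([x[i * delay : i * delay + n] for i in range(emb_dim)])
    dists = pdist(vectors)                         # all pairwise distances between embedded points
    radii = np.logspace(np.log10(dists[dists > 0].min()),
                        np.log10(dists.max()), n_radii)
    corr_sum = np.array([(dists <= r).mean() for r in radii])   # fraction of pairs within r
    # Correlation dimension ~ slope of log C(r) versus log r in the scaling region.
    slope, _ = np.polyfit(np.log(radii), np.log(corr_sum), 1)
    return slope

ecg = np.cumsum(np.random.randn(1200))   # synthetic random-walk stand-in, not real ECG data
print(correlation_dimension(ecg))
```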
Procedia PDF Downloads 541
1290 Innovation and Analysis of Vibrating Fork Level Switch
Authors: Kuen-Ming Shu, Cheng-Yu Chen
Abstract:
A vibrating-fork sensor can measure the level height of solids and liquids and operates according to the principle that vibrations created by piezoelectric ceramics are transmitted to the vibrating fork, which produces resonance. When the vibrating fork touches an object, its resonance frequency changes and produces a signal that returns to a controller for immediate adjustment, so as to effectively monitor raw material loading. The design of the vibrating fork in a vibrating-fork material sensor is crucial. In this paper, ANSYS finite element analysis software is used to perform modal analysis on the vibrations of the vibrating fork. In addition, to design and produce a superior vibrating fork, the dimensions and welding shape of the vibrating fork are compared in a simulation performed using the Taguchi method.
Keywords: vibrating fork, piezoelectric ceramics, sound wave, ANSYS, Taguchi method, modal analysis
Procedia PDF Downloads 249
1289 Photographic Documentation of Archaeological Collections in the Grand Egyptian Museum
Authors: Sameh El Mahdy
Abstract:
Recording and documenting archaeological collections, especially through photography, is one of the most important matters that museums care about and give great priority to. Photographs of the objects serve several purposes: they are evidence and an archival record that proves the condition of the objects at various stages; a photo of each object is placed on its paper registration record; the photos are used in inventorying archaeological collections; they are consulted by researchers and scholars interested in studying these collections; and they are used in advertising campaigns for museum displays of archaeological collections. The Grand Egyptian Museum is a unique model in terms of establishing a specific system for photographing archaeological collections, and it sets standards for the photos taken inside it. For example: pictures must be of high quality; a colour scale must be included in order to clarify the dimensions of the objects in the picture and to show their natural colours without any additions; and the object numbers, especially the Grand Egyptian Museum number, must appear in the pictures. Taking a good photo of an artifact in the Grand Egyptian Museum involves several steps: (1) create a good location, (2) handle the artifact correctly, (3) choose the best position for the artifact, (4) set the lighting to produce a photo without shadows that represents all the artifact's details, and (5) check the camera settings and their quality. These steps, among others, are the criteria for taking the best photo, which helps the database represent the details of the artifact in our interface.
Keywords: grand egyptian museum, photographing, museum collections, registration and documentation
Procedia PDF Downloads 40
1288 Enhanced Bit Error Rate in Visible Light Communication: A New LED Hexagonal Array Distribution
Authors: Karim Matter, Heba Fayed, Ahmed Abd-Elaziz, Moustafa Hussein
Abstract:
Due to the exponential growth of mobile devices and wireless services, demand for the radio-frequency spectrum has increased enormously. The presence of several frequencies causes interference between cells, which must be minimized to obtain a lower Bit Error Rate (BER). For this reason, it is of great interest to use visible light communication (VLC). This paper suggests a VLC system that decreases the BER by applying a new LED distribution with a hexagonal shape, using a Frequency Reuse (FR) concept to mitigate the interference between the reused frequencies inside the hexagonal shape. The BER is measured in two scenarios, Line of Sight (LoS) and Non-Line of Sight (Non-LoS), for each technique used. The recommended values of BER in the proposed model for Soft Frequency Reuse (SFR) in the LoS case, at 4, 8, and 10 dB signal-to-noise ratio (SNR), are 3.6×10⁻⁶, 6.03×10⁻¹³, and 2.66×10⁻¹⁸, respectively.
Keywords: visible light communication (VLC), field of view (FoV), hexagonal array, frequency reuse
Procedia PDF Downloads 160
1287 Load Characteristics of Improved Howland Current Pump for Bio-Impedance Measurement
Authors: Zhao Weijie, Lin Xinjian, Liu Xiaojuan, Li Lihua
Abstract:
The Howland current pump is widely used in bio-impedance measurement. Much attention has been focused on the output impedance of the Howland circuit. Here we focus on the maximum load of the Howland source and discuss the relationship between the circuit parameters at maximum load. We conclude that the feedback resistor at the signal input terminal should be as large as possible, while the current-limiting resistor should be smaller; the op-amp saturation voltage should also be high. The bandwidth of the circuit is proportional to the bandwidth of the op-amp. The Howland current pump was simulated using Multisim 12. When the AD8066AR was selected as the op-amp, the maximum load was 11.5 kΩ, and the Howland current pump had a stable output current of 2 mA peak-to-peak up to 200 kHz. With an OPA847 op-amp and a load of 6.3 kΩ, the output current was also stable, and the frequency reached as high as 3 MHz.
Keywords: bio-impedance, improved Howland current pump, load characteristics, bioengineering
Procedia PDF Downloads 513
1286 Servant Leadership for Elder Care in St. Camillus Health Systems, USA
Authors: Anthoni Jeorge
Abstract:
Throughout history, servant leadership has been researched, and favourable individual, team, and organizational outcomes have been linked to the construct. This paper designates St. Camillus de Lellis, a practitioner of servant leadership and founder of the Ministers of the Sick, as a servant leader in his approach to care for the sick. Service is the visible face of his servant leadership. First, despite many challenges, St. Camillus de Lellis practiced leadership by the example of compassionate service to the sick. Second, he made service to the sick the highest priority of his life. Third, Camillus displayed servant leadership such that his manner of leadership gave birth to a New School of Service to the Sick. The paper identifies the distinctive dimensions and essential elements which characterized his service-centered leadership and discusses six major characteristics of a servant leader as set forth by St. Camillus's life example. The research illustrates the transformational power of servant leadership in the field of healthcare in general and, in doing so, shows servant leadership seekers ways in which servant leadership can transform elder care in one's own field (St. Camillus Health Systems). Thus, it ascertains that servant leadership is the best fit for humanized elder care. Supported by the review of literature, the paper ascertains that Camillus, by identifying himself with the sick, gained deeper insights concerning the pain and suffering of the population. Uniquely drawn from his true grit, Camillus's service-centered leadership is value-based, people-oriented, and compassion-filled. His way of service to the sick is the prolongation of gestures of mercy and compassion. It is hoped that the results of this study will help healthcare workers and servant leadership practitioners to humanize elder care and cultivate a servant leadership attitude in their healthcare services to the sick. By incorporating such service-oriented elements into their leadership orientation, healthcare workers will be true servant leaders of the sick.
Keywords: leadership, service, healthcare, compassion
Procedia PDF Downloads 164
1285 Combined Odd Pair Autoregressive Coefficients for Epileptic EEG Signals Classification by Radial Basis Function Neural Network
Authors: Boukari Nassim
Abstract:
This paper describes the use of odd-pair autoregressive coefficients (Yule-Walker and Burg) for the feature extraction of electroencephalogram (EEG) signals. For classification, the radial basis function neural network (RBFNN) is employed. The RBFNN is described by its architecture and its characteristics: the RBF is defined by a spread parameter, which is modified to improve the classification results. Five sets of EEG signals are used in this work: Set A and Set B for normal signals, Set C and Set D for interictal signals, and Set E for ictal signals (available from the Bonn University database). At the output, two classes are distinguished for each pairing (A-C, A-D, A-E, B-C, B-D, B-E, C-E, D-E); the best accuracy, calculated at 99%, is obtained for the combined odd-pair autoregressive coefficients. Our method is very effective for the diagnosis of epileptic EEG signals.
Keywords: epilepsy, EEG signals classification, combined odd pair autoregressive coefficients, radial basis function neural network
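A minimal sketch of Yule-Walker autoregressive feature extraction of the kind the abstract combines with Burg coefficients, solved directly from the sample autocorrelation with NumPy; the model order and the toy signal are illustrative assumptions, not the Bonn-database processing used in the paper.

```python
import numpy as np

def yule_walker_ar(x, order=8):
    """Estimate AR coefficients of a 1-D signal by solving the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    # Biased sample autocorrelation at lags 0..order.
    r = np.array([np.dot(x[: len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])  # Toeplitz matrix
    a = np.linalg.solve(R, r[1 : order + 1])   # AR coefficients used as features
    return a

# Toy EEG-like segment (a real study would use the Bonn database segments).
rng = np.random.default_rng(0)
segment = np.sin(0.1 * np.arange(1024)) + 0.5 * rng.standard_normal(1024)
features = yule_walker_ar(segment, order=8)
print(features)   # 8-dimensional feature vector that would feed the RBFNN classifier
```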
Procedia PDF Downloads 346
1284 Enhanced Weighted Centroid Localization Algorithm for Indoor Environments
Authors: I. Nižetić Kosović, T. Jagušt
Abstract:
Lately, with the increasing number of location-based applications, the demand for highly accurate and reliable indoor localization has become urgent. This is a challenging problem due to measurement variance, which is the consequence of various factors such as obstacles, equipment properties and environmental changes in the complex nature of indoor environments. In this paper, we propose a low-cost, custom-setup infrastructure solution and a localization algorithm based on the Weighted Centroid Localization (WCL) method. Localization accuracy is increased by several enhancements: calibration of the RSSI values obtained from the wireless nodes, repeated RSSI measurements to exclude deviating values from the position estimation, and consideration of the orientation of the device relative to the wireless nodes. We conducted several experiments to evaluate the proposed algorithm, and a high accuracy of ~1 m was achieved.
Keywords: indoor environment, received signal strength indicator, weighted centroid localization, wireless localization
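A minimal sketch of plain weighted centroid localization from RSSI readings, assuming a log-distance path-loss conversion; the reference power, path-loss exponent, weighting exponent, and anchor layout are illustrative assumptions rather than the calibrated values from the proposed setup.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Log-distance path-loss model: distance in metres from an RSSI reading."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def weighted_centroid(node_positions, rssi_values, weight_exp=1.0):
    """Estimate position as the centroid of the anchor nodes, weighted by inverse distance."""
    distances = np.array([rssi_to_distance(r) for r in rssi_values])
    weights = 1.0 / (distances ** weight_exp + 1e-9)
    return (weights[:, None] * np.asarray(node_positions)).sum(axis=0) / weights.sum()

nodes = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]   # known anchor positions (m)
rssi = [-55.0, -62.0, -60.0, -70.0]                            # averaged, calibrated readings
print(weighted_centroid(nodes, rssi))
```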
Procedia PDF Downloads 232
1283 Comprehensive Analysis of Power Allocation Algorithms for OFDM Based Communication Systems
Authors: Rakesh Dubey, Vaishali Bahl, Dalveer Kaur
Abstract:
The spiralling demand for high-rate data transmission over wireless media calls for intelligent use of electromagnetic resources, considering restrictions such as power consumption, spectral efficiency, robustness against multipath propagation and implementation complexity. Orthogonal frequency division multiplexing (OFDM) is a capable technique for next-generation wireless communication systems. For such high-rate data transfers, resources such as power and capacity must be properly allocated amongst the sub-channels. This paper illustrates the various available methods of allocating power, and the resulting capacity, under the constraint of the Shannon limit.
Keywords: Additive White Gaussian Noise, Multi-Carrier Modulation, Orthogonal Frequency Division Multiplexing (OFDM), Signal to Noise Ratio (SNR), Water Filling
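A minimal sketch of the classical water-filling allocation over parallel sub-channels, solved by bisection on the water level; the channel gains, noise power, and power budget are illustrative assumptions.

```python
import numpy as np

def water_filling(channel_gains, noise_power, total_power, iters=100):
    """Allocate total_power across sub-channels to maximise sum capacity (water-filling)."""
    inv_snr = noise_power / np.asarray(channel_gains, dtype=float)  # per-channel floor N / |h|^2
    lo, hi = inv_snr.min(), inv_snr.max() + total_power
    for _ in range(iters):                     # bisection on the water level mu
        mu = 0.5 * (lo + hi)
        power = np.maximum(mu - inv_snr, 0.0)  # pour power above each floor
        if power.sum() > total_power:
            hi = mu
        else:
            lo = mu
    capacity = np.sum(np.log2(1.0 + power / inv_snr))   # Shannon capacity in bits/s/Hz
    return power, capacity

gains = np.array([1.0, 0.6, 0.3, 0.1])   # |h_k|^2 for four sub-channels (illustrative)
p, c = water_filling(gains, noise_power=0.1, total_power=1.0)
print(p, c)
```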
Procedia PDF Downloads 554
1282 Questioning the Sustainability in Development: The Resilience of Local Variety of Rice in the Changing Dayak Community of Central Kalimantan, Indonesia
Authors: Semiarto Aji Purwanto, Sutji Shinto
Abstract:
For over a quarter of a century, the idea of sustainable development has been a global discussion. In Indonesia, more than five decades after the development of the country took priority over any other matter, the discussion on the need for development is still intriguing. Far from the enthusiasm of the development programs run by the Indonesian government since 1967, the Dayak community in the interior of Kalimantan's tropical forest was largely left out of the changes; there were not many programs for the interior because the focus of development was mostly on Java island. Consequently, the Dayak live their lives as shifting cultivators, as they have for centuries. Our ethnographic observation, conducted in April-July 2016, found that today they still maintain the knowledge and preserve the existence of local varieties of rice. While in Java these varieties have been replaced by more productive and pest-resistant varieties, the Dayak still maintain more than 60 varieties. From the biodiversity perspective, this is delightful news; from the cultural perspective, the persistence of their customs regarding traditional cultivation is fascinating as well. The local agricultural knowledge is well conserved and practiced daily. It is revealed that the resilience of these rice varieties is related to the local social structure, since the distribution of each variety is usually limited to particular clans in the community. While experiencing the lack of programs for village development, the community has maintained local leadership and its government structure at the village level. The paper explores how a neglected area, disregarded by development programs, sustains its culture and biodiversity. We would like to discuss whether the concept of sustainability is needed for development programs, for the changes towards a modern civilisation, or for the sake of the locals' survival.
Keywords: sustainable development, local knowledge, rice, resilience, Kalimantan, Indonesia
Procedia PDF Downloads 283
1281 Phantom and Clinical Evaluation of Block Sequential Regularized Expectation Maximization Reconstruction Algorithm in Ga-PSMA PET/CT Studies Using Various Relative Difference Penalties and Acquisition Durations
Authors: Fatemeh Sadeghi, Peyman Sheikhzadeh
Abstract:
Introduction: The Block Sequential Regularized Expectation Maximization (BSREM) reconstruction algorithm was recently developed to suppress excessive noise by applying a relative difference penalty. The aim of this study was to investigate the effect of various strengths of the noise penalization factor in the BSREM algorithm under different acquisition durations and lesion sizes, in order to determine an optimum penalty factor by considering both quantitative and qualitative image evaluation parameters in clinical use. Materials and Methods: The NEMA IQ phantom and 15 clinical whole-body patients with prostate cancer were evaluated. The phantom and patients were injected with Gallium-68 Prostate-Specific Membrane Antigen (68Ga-PSMA) and scanned on a non-time-of-flight Discovery IQ Positron Emission Tomography/Computed Tomography (PET/CT) scanner with BGO crystals. The data were reconstructed using BSREM with β-values of 100-500 at intervals of 100. These reconstructions were compared to OSEM as a widely used reconstruction algorithm. Following the standard NEMA measurement procedure, background variability (BV), recovery coefficient (RC), contrast recovery (CR) and residual lung error (LE) were measured from the phantom data, and signal-to-noise ratio (SNR), signal-to-background ratio (SBR) and tumor SUV were measured from the clinical data. Qualitative features of the clinical images were visually ranked by one nuclear medicine expert. Results: The β-value acts as a noise suppression factor, so BSREM showed decreasing image noise with an increasing β-value. BSREM with a β-value of 400 at a decreased acquisition duration (2 min/bp) produced approximately the same noise level as OSEM at an increased acquisition duration (5 min/bp). For the β-value of 400 at a 2 min/bp duration, SNR increased by 43.7% and LE decreased by 62% compared with OSEM at a 5 min/bp duration. In both phantom and clinical data, an increase in the β-value translates into a decrease in SUV. The lowest levels of SUV and noise were reached with the highest β-value (β=500), resulting in the highest SNR and lowest SBR, due to the greater reduction in noise than in SUV at the highest β-value. In comparing BSREM with different β-values, the relative difference in the quantitative parameters was generally larger for smaller lesions. As the β-value decreased from 500 to 100, the increase in CR was 160.2% for the smallest sphere (10 mm) and 12.6% for the largest sphere (37 mm), and the trend was similar for SNR (-58.4% and -20.5%, respectively). BSREM was visually ranked higher than OSEM in all qualitative features. Conclusions: The BSREM algorithm using more iterations leads to better quantitative accuracy without excessive noise, which translates into higher overall image quality and lesion detectability. This improvement can be used to shorten the acquisition time.
Keywords: BSREM reconstruction, PET/CT imaging, noise penalization, quantification accuracy
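For reference, the relative difference penalty applied in BSREM-type reconstructions has the general form below, where the β-value scales the penalty strength, γ controls edge preservation, and N_j is the neighbourhood of voxel j; the exact parameterisation and weighting used in the scanner software may differ.

```latex
R(\mathbf{x}) \;=\; \beta \sum_{j} \sum_{k \in N_j} \omega_{jk}\,
\frac{(x_j - x_k)^2}{x_j + x_k + \gamma\,\lvert x_j - x_k \rvert}
```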
Procedia PDF Downloads 97
1280 Robust Control of a Single-Phase Inverter Using Linear Matrix Inequality Approach
Authors: Chivon Choeung, Heng Tang, Panha Soth, Vichet Huy
Abstract:
This paper presents a robust control strategy for a single-phase DC-AC inverter with an output LC-filter. An all-pass filter is utilized to create an artificial β-signal so that the proposed controller can be used straightforwardly in the dq-synchronous frame. The proposed robust controller utilizes state feedback control with integral action in the dq-synchronous frame. A linear matrix inequality-based optimization scheme is used to determine stabilizing gains of the controllers that maximize the convergence rate to steady state in the presence of uncertainties. The uncertainties of the system are described as the potential variation range of the inductance and resistance in the LC-filter.
Keywords: single-phase inverter, linear matrix inequality, robust control, all-pass filter
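A minimal sketch of how a stabilizing state-feedback gain can be obtained from a Lyapunov-type LMI with CVXPY; the second-order plant and the decay-rate constant are illustrative placeholders, not the inverter model with the LC-filter, integral action, and parameter uncertainty treated in the paper.

```python
import numpy as np
import cvxpy as cp

# Illustrative continuous-time plant x' = A x + B u (not the actual inverter model).
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
n, m = A.shape[0], B.shape[1]
alpha = 1.0   # desired minimum decay rate (convergence speed to steady state)

# LMI in (Q, Y): A Q + Q A' + B Y + Y' B' + 2*alpha*Q < 0, Q > 0, with K = Y Q^{-1}.
Q = cp.Variable((n, n), symmetric=True)
Y = cp.Variable((m, n))
lmi = A @ Q + Q @ A.T + B @ Y + Y.T @ B.T + 2 * alpha * Q
constraints = [Q >> 1e-6 * np.eye(n), lmi << -1e-6 * np.eye(n)]
cp.Problem(cp.Minimize(0), constraints).solve()

K = Y.value @ np.linalg.inv(Q.value)   # state-feedback gain u = K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))
```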
Procedia PDF Downloads 141
1279 Application of New Sprouted Wheat Brine for Delicatessen Products From Horse Meat, Beef and Pork
Authors: Gulmira Kenenbay, Urishbay Chomanov, Aruzhan Shoman, Rabiga Kassimbek
Abstract:
The main task of the meat-processing industry is the production of meat products, as the main source of animal protein ensuring the vital activity of the human body, in the required volumes, of high quality and in a diverse assortment. Providing the population with high-quality food products that are biologically complete, balanced in the composition of basic nutrients and enriched with targeted physiologically active components is one of the highest-priority scientific and technical problems to be solved. In this regard, a formulation of a new brine from sprouted wheat has been developed for meat delicacies from horse meat, beef and pork. The new brine contains flavour-aromatic ingredients, juice of germinated wheat and vegetable juice. The viscosity of horse meat, beef and pork was studied during massaging. Thermodynamic indices, water activity and the moisture-binding energy of horse meat, beef and pork with the application of the new brine were investigated. A recipe for meat products with vegetable additives has been developed, organoleptic evaluation of the meat products was carried out, and their physicochemical parameters were measured. Analysis of the obtained data shows that the values of the water activity index (aw) and the moisture-binding energy in the experimental samples of meat products are higher than in the control samples. Investigations established that, with increasing water activity and moisture-binding energy, the tenderness of the ready meat delicacies increases with the use of the new brine.
Keywords: compounding, functional products, delicatessen products, brine, vegetable additives
Procedia PDF Downloads 178
1278 Assessing Walkability in New Cities around Cairo
Authors: Lobna Ahmed Galal
Abstract:
Modal integration has been given minimal consideration in the cities of developing countries, alongside the declining dominance of public transport and the predominance of informal transport: the modal share of informal taxis in Greater Cairo increased from 6% in 1987 to 37% in 2001 and has since risen even higher. Informal and non-motorized modes of transport act as gap fillers by feeding other modes of transport, not by design or choice, but often for lack of accessible and affordable public transport. Yet non-motorized transport remains peripheral, with minimal priority in urban planning and investment, and strong policies to support it are lacking. For the authorities, development is associated with technology and motorized transport, so promoting non-motorized transport may be seen as running counter to development; there is also a social stigma against non-motorized transport, as it is seen as a travel mode for the poor. As a city of a developing country, Cairo has poor-quality infrastructure for non-motorized transport: dedicated corridors are absent or, where they exist, are often encroached upon for commercial purposes; traffic lanes are widened at the expense of sidewalks; footpaths are absent or overcrowded; and lighting is poor, making walking unsafe. Financial support for such facilities is also lacking, as it is often considered beyond the city's capabilities. This paper deals with the objective measurement of the built environment relating to walking in some neighborhoods of the new cities around Cairo, and with comparing the results of the objective measures of the built environment with a self-reported survey. The paper's first objective is to show how the 'walkability of community neighborhoods' index works in the context of neighborhoods of the new cities around Cairo. The objective measurement procedure has high potential to be carried out using GIS.
Keywords: assessing, built environment, Cairo, walkability
Procedia PDF Downloads 383
1277 Filter for the Measurement of Supraharmonics in Distribution Networks
Authors: Sivaraman Karthikeyan
Abstract:
Due to rapidly developing power electronics devices and technologies, such as power-line communication and self-commutating converters, voltage and current distortion, as well as interference, has increased in the frequency range of 2 kHz to 150 kHz; there is an urgent need for regulation through electromagnetic compatibility (EMC) standards in this frequency range. Measuring or testing compliance with emission and immunity limits necessitates the use of precise, repeatable measuring methods. Appropriate filters that minimize the fundamental component and its harmonics below 2 kHz in the measured signal would improve the measurement accuracy in this frequency range, leading to better analysis. This paper discusses the filter suggestions in the current measurement standard and proposes an infinite impulse response (IIR) filter design that is optimized for a low number of poles, strong fundamental damping, and high accuracy above 2 kHz. The new filter's transfer function is presented as a result, and an analog implementation is derived from the overall design.
Keywords: supraharmonics, 2 kHz, 150 kHz, filter, analog filter
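A minimal sketch of an IIR high-pass design in the same spirit (strong damping of the fundamental and its harmonics below 2 kHz, accurate passband from 2 kHz upward) using SciPy; the filter type, ripple/attenuation specifications, and sampling rate are illustrative assumptions, not the optimized transfer function proposed in the paper.

```python
import numpy as np
from scipy import signal

fs = 1_000_000          # sampling rate in Hz (illustrative)
f_pass = 2_000          # passband edge: supraharmonic range starts at 2 kHz
f_stop = 1_000          # stopband edge: fundamental and low-order harmonics lie below this

# Elliptic high-pass: low order, strong damping below 2 kHz, near-flat response above it.
sos = signal.iirdesign(wp=f_pass, ws=f_stop, gpass=0.5, gstop=80,
                       ftype="ellip", output="sos", fs=fs)

# Check the damping at 50 Hz (fundamental) and the gain at 10 kHz.
w, h = signal.sosfreqz(sos, worN=np.array([50.0, 10_000.0]), fs=fs)
print(20 * np.log10(np.abs(h)))   # gain in dB at 50 Hz and at 10 kHz
```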
Procedia PDF Downloads 146
1276 Stable Tending Control of Complex Power Systems: An Example of Localized Design of Power System Stabilizers
Authors: Wenjuan Du
Abstract:
The phase compensation method was proposed based on the concept of damping torque analysis (DTA). It is a method for the design of a power system stabilizer (PSS) to suppress local-mode power oscillations in a single-machine infinite-bus power system. This paper presents the application of the phase compensation method to the design of a PSS in a multi-machine power system. The application is achieved by examining the direct damping contribution of the stabilizer to the power oscillations. Using the linearized equal-area criterion, a theoretical proof of the application to PSS design is presented. Hence, the PSS design in this paper is an example of stable tending control by a localized method.
Keywords: phase compensation method, power system small-signal stability, power system stabilizer
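For context, a conventional lead-lag stabilizer realises the phase compensation with a transfer function of the form below, where the washout stage removes the steady-state component and the lead-lag stages compensate the phase lag between the stabilizer input and the electrical (damping) torque; the specific structure and parameters used in the paper may differ.

```latex
G_{\mathrm{PSS}}(s) \;=\; K_{s}\,\frac{sT_{w}}{1 + sT_{w}}
\left(\frac{1 + sT_{1}}{1 + sT_{2}}\right)
\left(\frac{1 + sT_{3}}{1 + sT_{4}}\right)
```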
Procedia PDF Downloads 641
1275 A New Floating Point Implementation of Base 2 Logarithm
Authors: Ahmed M. Mansour, Ali M. El-Sawy, Ahmed T. Sayed
Abstract:
Logarithms reduce products to sums and powers to products; they play an important role in signal processing, communication and information theory. They are primarily used for hardware calculations, handling multiplications, divisions, powers, and roots effectively. There are three commonly used bases for logarithms: the logarithm with base 10 is called the common logarithm, the natural logarithm has base e, and the binary logarithm has base 2. This paper demonstrates different methods of calculating log2, showing the complexity of each, and identifies the most accurate and efficient, besides giving insights into their hardware design. We present a new method called Floor Shift for fast calculation of log2, and then combine this algorithm with a Taylor series to improve the accuracy of the output, which we illustrate using two examples. We finally compare the algorithms and conclude with our remarks.
Keywords: logarithms, log2, floor, iterative, CORDIC, Taylor series
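A minimal sketch of the general idea of splitting log2 into an integer part (obtainable by shifting, i.e. the floor of the exponent) plus a series-based correction on the remaining mantissa; the particular series and term count are illustrative assumptions, not the authors' Floor Shift algorithm.

```python
import math

def log2_floor_plus_series(x, terms=8):
    """Approximate log2(x) for x > 0: integer exponent by repeated shifts,
    then a truncated ln(1+z) series on the remaining mantissa m in [1, 2)."""
    if x <= 0:
        raise ValueError("x must be positive")
    exponent = 0
    while x >= 2.0:          # equivalent to a right shift in fixed-point hardware
        x /= 2.0
        exponent += 1
    while x < 1.0:           # left shift for inputs below 1
        x *= 2.0
        exponent -= 1
    # log2(m) = ln(m) / ln(2), with ln(m) from ln(1+z) = z - z^2/2 + z^3/3 - ...
    # (convergence is slow for mantissas close to 2; more terms reduce the error)
    z = x - 1.0
    ln_m = sum((-1) ** (k + 1) * z ** k / k for k in range(1, terms + 1))
    return exponent + ln_m / math.log(2.0)

for value in (0.3, 1.0, 5.0, 777.0):
    print(value, log2_floor_plus_series(value), math.log2(value))
```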
Procedia PDF Downloads 532
1274 Targeted Delivery of Docetaxel Drug Using Cetuximab Conjugated Vitamin E TPGS Micelles Increases the Anti-Tumor Efficacy and Inhibit Migration of MDA-MB-231 Triple Negative Breast Cancer
Authors: V. K. Rajaletchumy, S. L. Chia, M. I. Setyawati, M. S. Muthu, S. S. Feng, D. T. Leong
Abstract:
Triple-negative breast cancers (TNBC) can be classified as among the most aggressive, with a high rate of local recurrence and systemic metastasis. TNBCs are insensitive to existing hormonal therapy or targeted therapies such as the use of monoclonal antibodies, due to the lack of the oestrogen receptor (ER) and progesterone receptor (PR) and the absence of overexpression of human epidermal growth factor receptor 2 (HER2), compared with other types of breast cancer. The absence of targeted therapies for selective delivery of therapeutic agents into tumours led to the search for druggable targets in TNBC. In this study, we developed a targeted micellar system of cetuximab-conjugated micelles of D-α-tocopheryl polyethylene glycol succinate (vitamin E TPGS) for targeted delivery of docetaxel as a model anticancer drug for the treatment of TNBCs. We examined the efficacy of our micellar system in xenograft models of triple-negative breast cancer and explored the effect of the micelles on post-treatment tumours in order to elucidate the mechanism underlying nanomedicine treatment in oncology. The targeting micelles were found to accumulate preferentially in tumours immediately after administration, compared to normal tissue. The fluorescence signal gradually increased up to 12 h at the tumour site and was sustained for up to 24 h, reflecting the increase of targeted micelles (TPFC) in MDA-MB-231/Luc cells. In comparison, for the non-targeting micelles (TPF), the fluorescence signal was evenly distributed over the body of the mice; only a slight increase in fluorescence in the chest area was observed 24 h post-injection, reflecting moderate uptake of the micelles by the tumour. The successful delivery of docetaxel into the tumour by the targeted micelles (TPDC) produced a greater degree of tumour growth inhibition than Taxotere® after 15 days of treatment. The ex vivo study demonstrated that tumours treated with the targeting micelles exhibit enhanced cell-cycle arrest and attenuated proliferation compared with the control and with those treated with non-targeting micelles. Furthermore, the ex vivo investigation revealed that both the targeting and non-targeting micellar formulations show significant inhibition of cell migration, with migration indices reduced by 0.098- and 0.28-fold, respectively, relative to the control. Overall, both the in vivo and ex vivo data increase confidence that our micellar formulations effectively targeted and inhibited EGF-overexpressing MDA-MB-231 tumours.
Keywords: biodegradable polymers, cancer nanotechnology, drug targeting, molecular biomaterials, nanomedicine
Procedia PDF Downloads 281
1273 Investigating Teachers’ Approaches in Teaching English and Students’ Communicative Ability in a Tertiary College
Authors: Adel Ben Mohamed
Abstract:
The widespread use of the English language around the world has pushed many countries to consider it a top priority in their educational systems. One of these countries is the Sultanate of Oman. In this frame, the Omani government has allocated huge budgets and resources to implement the English language in its education system. The importance of English is prevalent in Oman, and this is clearly noticeable through remarkable signs. For instance, most official documents in Oman are in both Arabic (the mother tongue) and English. In addition, English language institutes have mushroomed all over the country: in 2020, there were over fourteen English language institutes and centres in Oman (esl base, 2020). Moreover, these days most Omani parents send their children for tuition to learn the English language. Hence, it is apparent that the Sultanate of Oman places great value on the importance of English in attaining various goals. However, in the world of work, what is more important today is fluency rather than accuracy; therefore, many people go for communicative English rather than technical English. For example, the Oman Daily Observer newspaper published a job advertisement for a sales assistant on 23 November 2020 stipulating that speaking English very well is a must to be hired for the position (Oman Observer, 2020). In line with this, and because of the great importance of the English language in Oman, the Ministry of Higher Education has placed much emphasis on this official foreign language. Therefore, in the Omani educational system, all post-secondary students must spend one year in one of the higher education institutions in a General Foundation Programme (GFP) prior to moving to their respective majors at the diploma level. Accordingly, the implementation of any teaching approach is determined by different factors: some are directly linked to teachers, while others are related to organizational variables.
Keywords: teaching approaches, communicative, ability, investigating
Procedia PDF Downloads 93
1272 The Perspectives of Preparing Psychology Practitioners in Armenian Universities
Authors: L. Petrosyan
Abstract:
The problem of psychologist training remains a key priority in Armenia. During the Soviet period, the notion of a psychologist was obscure not only in Armenia but also in the other Soviet republics. The breakup of the Soviet Union triggered a gradual change in this area, activating cooperation with specialists from other countries. The need for recovery from the psychological trauma caused by the 1988 earthquake pushed forward the development of practical psychology in Armenia. This phenomenon led to positive changes in the perception of, and interest in, the psychologist profession. Armenian universities started designing special programs for the preparation of psychologists, and Armenian psychologists combined their efforts in the field of training relevant specialists. In recent years, the Bologna educational system was introduced in Armenia, which led to the implementation of education quality improvement programs. Nevertheless, even today the issue of psychologists' training is not yet settled in Armenian universities: so far, graduate psychologists do not have a clear idea of the personal and professional qualities of a psychologist. Recently, as a result of educational reforms, the psychology curricula underwent changes, but so far these have not led to the desired outcome. Almost all curricula in certain specialties aim to form professional competencies and strengthen practical skills. A survey conducted in Armenia aimed to identify young psychology specialists' ideas of the image of a psychologist. The respondents were 45 specialists holding a bachelor's degree as well as 30 master's degree graduates who had not yet started working. The research reveals that we need to change the approach to preparing psychology practitioners in the universities of Armenia. Such an approach to psychologist training will make it possible to train qualified specialists for the enhancement of modern psychology theory and practice.
Keywords: practitioners, psychology degree, study, professional competencies
Procedia PDF Downloads 452
1271 Vision Zero for the Caribbean Using the Systemic Approach for Road Safety: A Case Study Analyzing Jamaican Road Crash Data (Ongoing)
Authors: Rachelle McFarlane
Abstract:
The Second Decade of Action for Road Safety has begun with increased focus on countries that are disproportionately affected by road fatalities. Researchers highlight the low effectiveness of road safety campaigns in Latin America and the Caribbean (LAC), which still report approximately 130,000 deaths and six million injuries annually. The regional fatality rate is 19.2 per 100,000, with heightened concern for persons aged 15 to 44 years. In 2021, 483 Jamaicans died in 435 crashes, with 33% of these fatalities occurring during Covid-19 curfew hours. The study objective is to conduct a systemic safety review of Jamaican road crashes and provide a framework for its use in complementing traditional methods. The methodology involves the use of the FHWA Systemic Safety Project Selection Tool. This tool reviews system-wide data in order to identify risk factors associated with severe and fatal crashes across the network, rather than only at hotspots. A total of 10,379 crashes with 745 fatalities and serious injuries were reviewed. Of the focus crash types listed, 50% of pedestrian crashes resulted in fatalities and serious injuries, followed by 32% of bicycle, 24% of single-vehicle and 12% of head-on crashes. This study seeks to understand the risk factors associated with these priority crash types across the network and to recommend cost-effective countermeasures for common sites. As we press towards Vision Zero, the inclusion of the systemic safety review method, complementing traditional methods, may create a wider impact in reducing road fatalities and serious injuries by targeting issues across the network that share focus crash types and contributing factors.
Keywords: systemic safety review, risk factors, road crashes, crash types
Procedia PDF Downloads 91
1270 A Network Optimization Study of Logistics for Enhancing Emergency Preparedness in Asia-Pacific
Authors: Giuseppe Timperio, Robert De Souza
Abstract:
The combination of factors such as erratic climate change, rampant urbanization of risk-exposed areas, and political and social instabilities is creating an alarming basis for further growth in the number and magnitude of humanitarian crises worldwide. Given the unique features of the humanitarian supply chain, such as the unpredictability of demand in space, time and geography, the spike in the number of requests for relief items in the first days after the calamity, the uncertain state of logistics infrastructure, and large volumes of unsolicited low-priority items, a proactive approach to the design of disaster response operations is needed to achieve high agility in the mobilization of emergency supplies in the immediate aftermath of the event. This paper is an attempt in that direction, and it provides decision-makers with crucial strategic insights for a more effective network design for disaster response. Decision sciences and ICT are integrated to analyse the robustness and resilience of a prepositioned network of emergency strategic stockpiles for a real-life case in Indonesia, one of the most vulnerable countries in the Asia-Pacific, with the model being built upon a rich set of quantitative data. To this aim, a network optimization approach was implemented, with several what-if scenarios accurately developed and tested. The findings of this study can support decision-makers facing challenges related to disaster relief chain resilience, particularly regarding the optimal configuration of supply chain facilities and the optimal flows across the nodes, while considering the network structure from an end-to-end, in-country distribution perspective.
Keywords: disaster preparedness, humanitarian logistics, network optimization, resilience
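A minimal sketch of the kind of flow optimization involved, posed as a small transportation linear program (two prepositioned stockpiles serving three affected districts) and solved with SciPy; all costs, capacities, and demands are invented illustrative numbers, not data from the Indonesian case.

```python
import numpy as np
from scipy.optimize import linprog

# Unit shipping cost from stockpile i to district j (illustrative values).
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 7.0]])
supply = np.array([120.0, 80.0])       # stockpile capacities
demand = np.array([60.0, 70.0, 50.0])  # district requirements

n_s, n_d = cost.shape
c = cost.ravel()                       # decision variables x[i, j], flattened row-wise

# Supply constraints: sum_j x[i, j] <= supply[i]
A_ub = np.zeros((n_s, n_s * n_d))
for i in range(n_s):
    A_ub[i, i * n_d:(i + 1) * n_d] = 1.0

# Demand constraints: sum_i x[i, j] == demand[j]
A_eq = np.zeros((n_d, n_s * n_d))
for j in range(n_d):
    A_eq[j, j::n_d] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(res.x.reshape(n_s, n_d), res.fun)   # optimal flows and total cost
```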
Procedia PDF Downloads 176
1269 Mutation Profiling of Paediatric Solid Tumours in a Cohort of South African Patients
Authors: L. Lamola, E. Manolas, A. Krause
Abstract:
Background: The incidence of childhood cancer is increasing gradually in low- and middle-income countries such as South Africa. Globally, there is an extensive range of familial and hereditary cancer syndromes in which underlying germline variants increase the likelihood of developing cancer in childhood. Next-Generation Sequencing (NGS) technologies have been key in determining the occurrence and genetic contribution of germline variants to paediatric cancer development. We aimed to design and evaluate a candidate gene panel specific to inherited cancer-predisposing genes to provide comprehensive insight into the contribution of germline variants to childhood cancer. Methods: 32 paediatric patients (aged 0-18 years) diagnosed with a malignant tumour were recruited, and biological samples were obtained. After quality control, DNA was sequenced using an Ion AmpliSeq 50-candidate-gene panel design and Ion Torrent S5 technologies. Sequence variants were called using Ion Torrent Suite software and subsequently annotated using Ion Reporter and Ensembl's VEP. High-priority variants were manually analysed using tools such as MutationTaster, SIFT-INDEL and VarSome. Putative candidates were validated via Sanger sequencing. Results: The patients studied had a variety of cancers, the most common being nephroblastoma (13), followed by osteosarcoma (4) and astrocytoma (3). We identified 10 pathogenic/likely pathogenic variants in 10 patients, most of which were novel. Conclusions: According to the literature, we expected ~10% of our patient population to harbour pathogenic or likely pathogenic germline variants; however, we report about three times that proportion (~30%). The majority of the identified variants are novel; this may be because this is the first study of its kind in an understudied South African population.
Keywords: Africa, genetics, germline-variants, paediatric-cancer
Procedia PDF Downloads 139
1268 Study and Analysis of Optical Intersatellite Links
Authors: Boudene Maamar, Xu Mai
Abstract:
Optical Intersatellite Links (OISLs) are wireless communication links that use optical signals to interconnect satellites. They are expected to form the next generation of wireless communication technology owing to their inherent characteristics, such as increased bandwidth, high data rate, data transmission security, immunity to interference, and an unregulated spectrum. Optical space links are the best choice compared to classical communication schemes due to their distinctive properties: high frequency, small antenna diameter and low transmitted power, which are critical factors in defining a space communication system. This paper discusses the development of free-space technology and analyses the parameters and factors needed to establish reliable intersatellite links that use an optical signal to exchange data between satellites.
Keywords: optical intersatellite links, optical wireless communications, free space optical communications, next generation wireless communication
Procedia PDF Downloads 447
1267 Investigation of Cavitation in a Centrifugal Pump Using Synchronized Pump Head Measurements, Vibration Measurements and High-Speed Image Recording
Authors: Simon Caba, Raja Abou Ackl, Svend Rasmussen, Nicholas E. Pedersen
Abstract:
It is a challenge to monitor cavitation directly in a pump application during operation because of the lack of visual access needed to validate the presence of cavitation and its form of appearance. In this work, experimental investigations are carried out in an inline single-stage centrifugal pump with optical access, which provides an opportunity to enhance the value of CFD tools and standard cavitation measurements. Experiments are conducted using two impellers running in the same volute at 3000 rpm and the same flow rate. One of the impellers is optimized for a lower NPSH₃% by its blade design, whereas the other is manufactured using a standard casting method. Cavitation is detected by pump performance measurements, vibration measurements and high-speed image recordings. The head drop and the pump casing vibration caused by cavitation are correlated with the visual appearance of the cavitation. The vibration data are recorded in the axial direction of the impeller using accelerometers sampling at 131 kHz. The frequency-domain data (up to 20 kHz), the time-domain data and the root-mean-square values are analyzed. The high-speed recordings, focusing on the impeller suction side, are taken at 10,240 fps to provide insight into the flow patterns and the cavitation behavior in the rotating impeller. The videos are synchronized with the vibration time signals by a trigger signal. A clear correlation between cloud collapses and abrupt peaks in the vibration signal can be observed. The vibration peaks clearly indicate cavitation, especially at higher NPSHA values where the hydraulic performance is not affected. It is also observed that below a certain NPSHA value, cavitation starts in the inlet bend of the pump; above this value, cavitation occurs exclusively on the impeller blades. The impeller optimized for NPSH₃% does show a lower NPSH₃% than the standard impeller, but the head drop starts at a higher NPSHA value and is more gradual. Instabilities in the head-drop curve of the optimized impeller were observed, in addition to a higher vibration level, and the cavitation clouds on the suction side appear more unsteady when using the optimized impeller. The shape and location of the cavitation are compared to 3D fluid flow simulations, and the simulation results are in good agreement with the experimental investigations. In conclusion, these investigations attempt to give a more holistic view of the appearance of cavitation by comparing the head drop, vibration spectral data, vibration time signals, image recordings and simulation results. The data indicate that a criterion for cavitation detection could be derived from the vibration time-domain measurements, which requires further investigation. Spectral data are usually used to analyze cavitation, but these investigations indicate that the time domain could be more appropriate for some applications.
Keywords: cavitation, centrifugal pump, head drop, high-speed image recordings, pump vibration
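A minimal sketch of the time- and frequency-domain processing described (RMS level and spectrum of an accelerometer channel sampled at 131 kHz) using NumPy; the synthetic signal with impulsive bursts is only a stand-in for real cavitation data.

```python
import numpy as np

fs = 131_000                       # accelerometer sample rate in Hz
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic stand-in: broadband noise plus a few impulsive "cloud collapse" bursts.
vib = 0.1 * rng.standard_normal(t.size)
for burst_start in rng.integers(0, t.size - 200, size=5):
    vib[burst_start:burst_start + 200] += np.hanning(200) * 2.0

rms = np.sqrt(np.mean(vib ** 2))            # overall vibration level (time domain)

# One-sided amplitude spectrum, inspected up to 20 kHz (frequency domain).
spectrum = np.abs(np.fft.rfft(vib)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
band = freqs <= 20_000
print("RMS:", rms)
print("peak spectral bin below 20 kHz:", freqs[band][np.argmax(spectrum[band])], "Hz")
```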
Procedia PDF Downloads 180
1266 Fluorescence-Based Biosensor for Dopamine Detection Using Quantum Dots
Authors: Sylwia Krawiec, Joanna Cabaj, Karol Malecha
Abstract:
Nowadays, progress in the field of analytical methods is of great interest for reliable biological research and medical diagnostics. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or the automation of measurements. Chemical sensors have displaced the conventional analytical methods; sensors combine precision, sensitivity, fast response and the possibility of continuous monitoring. A biosensor is a chemical sensor which, in addition to a converter, also possesses a biologically active material, which is the basis for the detection of specific chemicals in the sample. Each biosensor device mainly consists of two elements: a sensitive element, where the receptor-analyte recognition takes place, and a transducer element, which receives the signal and converts it into a measurable signal. Through these two elements, biosensors can be divided into two categories: by the recognition element (e.g. immunosensors) and by the transducer (e.g. optical sensors). The operation of an optical sensor is based on measuring quantitative changes in the parameters characterizing light radiation; the most often analyzed parameters include amplitude (intensity), frequency and polarization. Changes in the optical properties of a compound that reacts with the biological material coated on the sensor are analyzed by a direct method; in an indirect method, indicators are used which change their optical properties due to the transformation of the tested species. The dyes most commonly used in this method are small molecules with an aromatic ring, such as rhodamine; fluorescent proteins, for example green fluorescent protein (GFP); or nanoparticles such as quantum dots (QDs). Quantum dots have, in comparison with organic dyes, much better photoluminescent properties, better bioavailability and chemical inertness. They are semiconductor nanocrystals 2-10 nm in size; this very limited number of atoms and the 'nano' size give QDs their highly fluorescent properties. Rapid and sensitive detection of dopamine is extremely important in modern medicine. Dopamine is a very important neurotransmitter, which occurs mainly in the brain and central nervous system of mammals. Dopamine is responsible for the transmission of information through the nervous system and plays an important role in processes of learning and memory. Detection of dopamine is significant for diseases associated with the central nervous system, such as Parkinson's disease or schizophrenia. In the developed optical biosensor for the detection of dopamine, graphene quantum dots (GQDs) are used. In such a sensor, dopamine molecules coat the GQD surface; as a result, quenching of fluorescence occurs due to Förster resonance energy transfer (FRET). Changes in fluorescence correspond to specific concentrations of the neurotransmitter in the tested sample, so it is possible to accurately determine the concentration of dopamine in the sample.
Keywords: biosensor, dopamine, fluorescence, quantum dots
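As background on how quenching can be related to concentration, fluorescence quenching is often described by the Stern-Volmer relation below, with F₀ and F the intensities without and with the quencher (here, dopamine), K_SV the Stern-Volmer constant and [Q] the quencher concentration; whether the sensor in this work is calibrated with this linear form is an assumption.

```latex
\frac{F_{0}}{F} \;=\; 1 + K_{\mathrm{SV}}\,[Q]
```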
Procedia PDF Downloads 364
1265 A Novel Multi-Block Selective Mapping Scheme for PAPR Reduction in FBMC/OQAM Systems
Authors: Laabidi Mounira, Zayani Rafk, Bouallegue Ridha
Abstract:
Filter Bank Multicarrier with Offset Quadrature Amplitude Modulation (FBMC/OQAM) is presently regarded as a sustainable alternative to conventional Orthogonal Frequency Division Multiplexing (OFDM) for signal transmission over multipath fading channels. Like all multicarrier systems, FBMC/OQAM suffers from a high Peak-to-Average Power Ratio (PAPR). Due to the symbol overlap inherent in the FBMC/OQAM system, the direct application of conventional OFDM PAPR reduction schemes is far from effective. This paper suggests a novel scheme termed Multi-Block Selective Mapping (MB-SLM), whose simulation results show that its performance in terms of PAPR reduction is almost identical to that of the OFDM system.
Keywords: FBMC/OQAM, multi-blocks, OFDM, PAPR, SLM
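A minimal sketch of conventional selective mapping applied to a single OFDM symbol, the baseline that the proposed multi-block scheme adapts to the overlapping FBMC/OQAM structure; the subcarrier count, number of candidate phase sequences, and QPSK mapping are illustrative assumptions, and the multi-block partitioning itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sc, n_candidates = 64, 8                     # subcarriers and SLM candidate sequences

def papr_db(x):
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

# Random QPSK symbols on the subcarriers.
bits = rng.integers(0, 2, size=(2, n_sc))
symbols = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)

# SLM: multiply by candidate phase sequences, take the IFFT, keep the lowest-PAPR candidate.
phases = np.exp(1j * 2 * np.pi * rng.integers(0, 4, size=(n_candidates, n_sc)) / 4)
candidates = np.fft.ifft(symbols * phases, axis=1)
paprs = np.array([papr_db(c) for c in candidates])

best = np.argmin(paprs)
print("original PAPR:", papr_db(np.fft.ifft(symbols)), "dB")
print("best SLM candidate PAPR:", paprs[best], "dB (index", best, ")")
```

In a real transmitter, the index of the selected phase sequence is sent as side information so the receiver can undo the phase rotation.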
Procedia PDF Downloads 463
1264 A New Mathematical Method for Heart Attack Forecasting
Authors: Razi Khalafi
Abstract:
Myocardial infarction (MI) or acute myocardial infarction (AMI), commonly known as a heart attack, occurs when blood flow stops to part of the heart, causing damage to the heart muscle. An ECG can often show evidence of a previous heart attack or one that is in progress. The patterns on the ECG may indicate which part of the heart has been damaged, as well as the extent of the damage. In chaos theory, the correlation dimension is a measure of the dimensionality of the space occupied by a set of random points, often referred to as a type of fractal dimension. In this research, by considering the ECG signal as a random walk, we work on forecasting the oncoming heart attack by analysing the ECG signals using the correlation dimension. In order to test the model, a set of ECG signals for patients before and after heart attack was used, and the strength of the model for forecasting the behaviour of these signals was checked. Results show that this methodology can forecast the ECG, and accordingly the heart attack, with high accuracy.
Keywords: heart attack, ECG, random walk, correlation dimension, forecasting
Procedia PDF Downloads 506