Search results for: computer and information
8816 Monitoring Prolonged Use of Intravenous Antibiotics: Antimicrobial Stewardship
Authors: Komal Fizza
Abstract:
Irrational and non-judicious use of antibiotics paves the way for an upsurge in antibiotic resistance, diminished effectiveness of different therapeutic regimens, and a compounding effect on disease management leading to further morbidities. Against this backdrop, the current research aimed to assess whether antimicrobial prescribing was in accordance with the Infectious Diseases Society of America guidelines in hospitalized patients at Shifa International Hospital, Islamabad, Pakistan. Shifa International Hospital, Islamabad, is a 500-bed hospital. With the help of the MIS team, a form was developed that captured the medical record number, the name of the patient, the day the antibiotic was started, the day the antibiotic was supposed to be stopped, and the diagnosis of the patient. A ward pharmacist was employed to generate this report on a daily basis. The therapeutic regimen was reviewed by the pharmacist by monitoring the clinical progress, laboratory reports and diagnosis. On the basis of this information, the pharmacist made suggestions and forwarded them to the hospital doctors responsible for prescribing antibiotics. If desired, changes were made accordingly. In the current research, our main focus was to implement this intervention, and we therefore started monitoring patients who were on antibiotic regimens for more than 10-15 days. We took this initiative in November 2013. At the start of the program, a maximum of 19 patients/day were reported to be on an antibiotic regimen for more than 10-15 days. After the implementation of the initiative, the number of patients decreased to fifteen per day in December, and further decreased to 7 in January and to 9 and 6 in February and March, respectively. The average patient census was 350. This pilot study highlighted the role of the pharmacist in initiating antibiotic stewardship programs in hospital settings.
Keywords: stewardship, antibiotics, resistance, clinical process
Procedia PDF Downloads 353
8815 Role of IT Systems in Corporate Recruitment: Challenges and Constraints
Authors: Brahim Bellali, Fatima Bellali
Abstract:
The integration of information technology systems (ITS) into a company's human resources processes seems to be the appropriate solution to the problem of evolving and adapting its human resources management practices in order to be both more strategic and more efficient in terms of costs and service quality. In this context, the aim of this work is to study the impact of information technology systems (ITS) on the recruitment process. In this study, we targeted candidates who had been recruited using IT tools. The target population consists of 34 candidates based in Casablanca, Morocco. In order to collect the data, a questionnaire was drawn up. The survey is based on a data sheet and a questionnaire divided into several sections to make it more structured and comprehensible. The results show that the majority of respondents say that companies are making greater use of online CV libraries and social networks as digital solutions during the recruitment process. The results also show that 50% of candidates say that the use of digital tools by companies would not slow them down when applying for a job and that these IT tools improve manual recruitment processes, while 44.1% think that they facilitate recruitment without any human intervention. The majority of respondents (52.9%) think that social networks are the digital solutions most often used by recruiters in the sourcing phase. The constraints of digital recruitment encountered are the dehumanization of human resources (44.1%) and the limited interaction during remote interviews (44.1%), which leaves no room for informal exchanges. Digital recruitment can be a highly effective strategy for finding qualified candidates in a variety of fields. Here are a few recommendations for optimizing the digital recruitment process: (1) use online recruitment platforms such as LinkedIn, Twitter, and Facebook; (2) use applicant tracking systems (ATS); (3) develop a content marketing strategy.
Keywords: IT systems, recruitment, challenges, constraints
Procedia PDF Downloads 33
8814 Classifying Students for E-Learning in Information Technology Course Using ANN
Authors: Sirilak Areerachakul, Nat Ployong, Supayothin Na Songkla
Abstract:
The objective of this research is to select the most accurate model, using a neural network technique, to screen potential students who enroll in the IT course offered through electronic learning at Suan Sunandha Rajabhat University. It is designed to help students select the appropriate courses by themselves. The results showed that the most accurate model was 100-fold cross-validation, with an accuracy of 73.58%.
Keywords: artificial neural network, classification, students, e-learning
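A minimal sketch of the workflow described above, assuming scikit-learn as the modelling library; the dataset, features and 10-fold setting are placeholders, since the study's student data and 100-fold configuration are not available here:

```python
# Sketch: score a small feed-forward neural network with k-fold cross-validation.
# The dataset is synthetic; in the study the inputs would be student attributes
# and the label the recommended course track (placeholders here).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=8, n_informative=5,
                           n_classes=2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)

# 10-fold CV shown for speed; the paper reports its best model at 100 folds.
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean CV accuracy: {scores.mean():.3f}")
```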
Procedia PDF Downloads 426
8813 Northern Nigeria Vaccine Direct Delivery System
Authors: Evelyn Castle, Adam Thompson
Abstract:
Background: In 2013, the Kano State Primary Health Care Management Board redesigned its routine immunization supply chain from diffused pull to direct delivery push. It addressed issues around stock-outs and reduced the time spent by health facility staff collecting, and reporting on, vaccine usage. The health care board sought the help of a 3PL for twice-monthly deliveries from its cold store to 484 facilities across 44 local governments. eHA’s Health Delivery Systems group formed a 3PL to serve 326 of these new facilities in partnership with the State. We focused on designing and implementing a technology system throughout. Basic methodologies: GIS Mapping: - Planning the delivery of vaccines to hundreds of health facilities requires detailed route planning for delivery vehicles. Mapping the road networks across Kano and Bauchi with a custom routing tool provided information for the optimization of deliveries, reducing the number of kilometers driven each round by 20% and thereby reducing cost and delivery time. Direct Delivery Information System: - Vaccine direct deliveries are facilitated through pre-round planning (driven by a health facility database, extensive GIS, and inventory workflow rules), manager and driver control panels customizing delivery routines and reporting, a progress dashboard, schedules/routes, packing lists, delivery reports, and driver data collection applications. Move: Last Mile Logistics Management System: - MOVE has improved vaccine supply information management to be timely, accurate and actionable. It provides stock management workflow support, alerts management for cold chain exceptions/stock-outs, and on-device analytics for health and supply chain staff. The software was built to be offline-first with a user-validated interface and experience. Deployed to hundreds of vaccine storage sites, the improved information tools help facilitate the process of system redesign and change management. Findings: - Stock-outs reduced from 90% to 33%. - Redesigned current health systems, now managing vaccine supply for 68% of Kano’s wards. - Near real-time reporting and data availability to track stock. - Paperwork burdens of health staff have been dramatically reduced. - Medicine available when the community needs it. - Consistent vaccination dates for children under one to prevent polio, yellow fever and tetanus. - Higher immunization rates = lower infection rates. - Hundreds of millions of Naira worth of vaccines successfully transported. - Fortnightly service to 326 facilities in 326 wards across 30 Local Government Areas. - 6,031 cumulative deliveries. - Over 3.44 million doses transported. - Minimum travel distance covered in a round of delivery of 2,000 km and a maximum of 6,297 km. - 153,409 km travelled by 6 drivers. - 500 facilities in 326 wards. - Data captured and synchronized for the first time. - Data-driven decision making now possible. Conclusion: eHA’s Vaccine Direct Delivery has met challenges in Kano and Bauchi States and provided a reliable delivery service that ensures that health facilities can run vaccination clinics for children under one. eHA uses innovative technology that delivers vaccines from Northern Nigerian zonal stores straight to healthcare facilities. It has helped healthcare workers spend less time managing supplies and more time delivering care, and it will be rolled out nationally across Nigeria.
Keywords: direct delivery information system, health delivery system, GIS mapping, Northern Nigeria, vaccines
Procedia PDF Downloads 373
8812 Impact of Farm Settlements' Facilities on Farm Patronage in Oyo State
Authors: Simon Ayorinde Okanlawon
Abstract:
The youths’ prevalent negative attitude to farming is partly due to the amenities and facilities found in the urban centers at the expense of the rural areas. Hence, there is a need to create a befitting and conducive farm environment to retain farm employees and attract the youth to farming. This can be achieved through the provision of services and amenities that ensure a standard of living higher than that obtained by a person of equal status in other forms of employment in urban centers, thereby eliminating the psychological feeling of lowered self-esteem associated with farming. This study assessed farm settlements’ facilities and patronage in Oyo State with a view to using the information to encourage sustainable agriculture in Nigeria. The study is necessary because of the dearth of information on the state of facilities in farm settlements as it affects the patronage of farm settlements for sustainable agriculture in developing countries like Nigeria. The study utilized three purposively selected farm settlements - Ogbomoso, Fasola and Ilora - out of the seven existing ones in Oyo State. One hundred percent (100%) of the 262 residential buildings in the three settlements were sampled, and a household head from each of the buildings was randomly chosen. This translates to 262 household heads served with questionnaires, of which 47.7% were recovered. Information obtained included respondents’ residency categories, residents’ status, residency years, housing types, types of holding and number of acres per holding. Others include socio-economic attributes such as age, gender, income and educational status of respondents, assessment of existing facilities in the selected sites, and the level of patronage of the farm settlements, including perceived pull factors that can enhance farm settlement patronage. The study revealed that the residents were not satisfied with the adequacy and quality of the facilities available in their settlements. Residents’ satisfaction with infrastructural facilities cannot be statistically linked with location across the study area. Findings suggested that residents of the Ogbomoso farm settlement were not enjoying as adequate a provision of water supply and roads as those from Ilora and Fasola. Patronage of the farm settlements was largely driven by farming activities and the sale of farm produce. The respondents agreed that the provision of farm resort centers, standard recreational and tourism facilities, vacation employment opportunities for youths, and functional internet and communication networks, among others, is likely to boost the level of patronage of the farm settlements. The study concluded that improvement of the facilities, both in quality and quantity, will encourage the youths to go back to farming. It then recommends that maintenance of existing facilities and provision of more facilities, such as resort centers, be ensured.
Keywords: encourage, farm settlements' facilities, Oyo state, patronage
Procedia PDF Downloads 229
8811 Urban Sprawl Analysis in the City of Thiruvananthapuram and a Framework Formulation to Combat it
Authors: Sandeep J. Kumar
Abstract:
Urbanisation is considered the primary driver of land use and land cover change and has a direct link to population and economic growth. In India, as well as in other developing countries, cities are urbanizing at an alarming rate. This unprecedented and uncontrolled urbanisation can result in urban sprawl. Urban sprawl is recognised to be a result of a number of factors, including poor planning, inadequate policies, and poor governance. Urban sprawl may be seen as posing a threat to the development of sustainable cities. Hence, it is essential to manage it. Planning for predicted future growth is critical to avoid the negative effects of urban growth at the local and regional levels. Thiruvananthapuram, the capital city of Kerala, is a city of economic success, challenges, and opportunities. Urbanization trends in the city have paved the way for urban sprawl. This thesis aims to formulate a framework to combat the emerging urban sprawl in the city of Thiruvananthapuram. For that, the first step was to quantify trends of urban growth in Thiruvananthapuram city using Geographical Information System (GIS) and remote sensing techniques. The technique and results obtained in the study are extremely valuable in analysing land use changes. Secondly, these changes in trends were analysed through some of the critical factors that helped the study understand the underlying issues of the existing city structure that have resulted in urban sprawl. Anticipating development trends can modify the current order, and this can be productively resolved using regional and municipal planning and management strategies. Hence, efficient strategies to curb the sprawl in Thiruvananthapuram city have been formulated in this study, which can be considered as recommendations for future planning.
Keywords: urbanisation, urban sprawl, geographical information system (GIS), Thiruvananthapuram
Procedia PDF Downloads 107
8810 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language
Authors: Wenjun Hou, Marek Perkowski
Abstract:
The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of the least total weight in a given graph with N nodes. All variations of this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed, evaluating the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time allows the decision problem to be transformed into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobable superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including its special forms, the Feynman (CNOT) and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language
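As a rough companion to the abstract, the sketch below simulates the Grover amplification and the threshold-lowering minimum search classically in NumPy; the costs, search-space size and loop bound are made-up stand-ins for the paper's Q# oracle circuits:

```python
# Toy statevector simulation of a Grover-based minimum search. Basis states stand
# in for candidate edge encodings; costs[i] is a made-up tour cost, with np.inf
# marking encodings that are not valid Hamiltonian cycles.
import numpy as np

rng = np.random.default_rng(1)
N = 64                                    # search space size (2^6 basis states)
costs = rng.integers(10, 100, size=N).astype(float)
costs[rng.choice(N, size=N // 2, replace=False)] = np.inf   # invalid cycles

def grover_sample(marked):
    """One Grover run: amplify 'marked' states, then sample a basis state."""
    psi = np.full(N, 1 / np.sqrt(N))
    iters = max(1, int(np.pi / 4 * np.sqrt(N / marked.size)))
    for _ in range(iters):
        psi[marked] *= -1.0               # oracle: phase-flip marked states
        psi = 2 * psi.mean() - psi        # diffusion (inversion about the mean)
    p = psi ** 2
    return rng.choice(N, p=p / p.sum())

# Threshold-lowering loop: the oracle marks cycles cheaper than the best so far.
threshold = np.inf
for _ in range(20):                       # bounded number of Grover rounds
    marked = np.flatnonzero(costs < threshold)
    if marked.size == 0:
        break
    found = grover_sample(marked)
    threshold = min(threshold, costs[found])

print("minimum found:", threshold, "| true minimum:", costs.min())
```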
Procedia PDF Downloads 190
8809 Prediction of B-Cell Epitope for 24 Mite Allergens: An in Silico Approach towards Epitope-Based Immune Therapeutics
Authors: Narjes Ebrahimi, Soheila Alyasin, Navid Nezafat, Hossein Esmailzadeh, Younes Ghasemi, Seyed Hesamodin Nabavizadeh
Abstract:
Immunotherapy with allergy vaccines is of great importance in allergen-specific immunotherapy. In recent years, B-cell epitope-based vaccines have attracted considerable attention, and the prediction of epitopes is crucial to design these types of allergy vaccines. B-cell epitopes might be linear or conformational. The prerequisite for the identification of conformational epitopes is information about allergens' tertiary structures. Bioinformatics approaches have paved the way towards the design of epitope-based allergy vaccines through the prediction of tertiary structures and epitopes. Mite allergens are one of the major allergy contributors. Several mite allergens can elicit allergic reactions; however, their structures and epitopes are not well established. So, B-cell epitopes of various groups of mite allergens (24 allergens in 6 allergen groups) were predicted in the present work. Tertiary structures of 17 allergens with unknown structure were predicted and refined with the RaptorX and GalaxyRefine servers, respectively. The predicted structures were further evaluated by the Rampage, ProSA-web, ERRAT and Verify 3D servers. Linear and conformational B-cell epitopes were identified with the Ellipro, Bcepred, and DiscoTope 2 servers. To improve the accuracy level, consensus epitopes were selected. Fifty-four conformational and 133 linear consensus epitopes were predicted. Furthermore, overlapping epitopes in each allergen group were defined, following the sequence alignment of the allergens in each group. The predicted epitopes were also compared with the experimentally identified epitopes. The presented results provide valuable information for further studies on allergy vaccine design.
Keywords: B-cell epitope, immunotherapy, in silico prediction, mite allergens, tertiary structure
Procedia PDF Downloads 160
8808 Challenges to Change and Innovation in Educational System
Authors: Felicia Kikelomo Oluwalola
Abstract:
The study was designed to identify the challenges to change and innovation in the educational system in Nigeria. Educational institutions, like all other organizations, require constant monitoring to identify areas for potential improvement. However, educational reforms are often not well implemented. This results in massive wastage of finances and human resources, and in lost potential. Educational institutions are organised on many levels, from the individual classroom under the management of a single teacher, to groups of classrooms supervised by a Head Teacher or Executive Teacher, to a whole-school structure under the guidance of the principal. Therefore, there is a need for change and innovation in our educational system, since we are in the computer age. In doing so, this paper examined the psychology of change and the concepts of change and innovation, with suggested viewpoints. Educational administrators and individuals should be ready to take on the challenge of monitoring changes in technologies. Educational planners and policy makers should be encouraged to be involved in the change process.
Keywords: challenges, change, education, innovation
Procedia PDF Downloads 612
8807 Manufacturing Process and Cost Estimation through Process Detection by Applying Image Processing Technique
Authors: Chalakorn Chitsaart, Suchada Rianmora, Noppawat Vongpiyasatit
Abstract:
In order to reduce the transportation time and cost of the direct interface between customer and manufacturer, an image processing technique has been introduced in this research, with which designing a part and defining its manufacturing process can be performed quickly. A 3D virtual model is directly generated from a series of multi-view images of an object, and it can be modified, analyzed, and improved in structure or function for further implementations, such as computer-aided manufacturing (CAM). To estimate and quote the production cost, a user-friendly platform has been developed in this research, where the appropriate manufacturing parameters and process detections have been identified and planned by CAM simulation.
Keywords: image processing technique, feature detection, surface registration, capturing multi-view images, production costs, manufacturing processes
Procedia PDF Downloads 251
8806 Positive-Negative Asymmetry in the Evaluations of Political Candidates: The Mediating Role of Affect in the Relationship between Cognitive Evaluation and Voting Intention
Authors: Magdalena Jablonska, Andrzej Falkowski
Abstract:
The negativity effect is one of the most intriguing and well-studied psychological phenomena that can be observed in many areas of human life. The aim of the following study is to investigate how valence framing and positive and negative information about political candidates affect judgments about similarity to an ideal and a bad politician. Based on the theoretical framework of features of similarity, it is hypothesized that negative features have a stronger effect on similarity judgments than positive features of comparable value. Furthermore, the mediating role of affect is tested. Method: One hundred sixty-one people took part in an experimental study. Participants were divided into 6 research conditions that differed in the reference point (positive vs negative framing) and the number of favourable and unfavourable information items about political candidates (a positive, neutral and negative candidate profile). In the positive framing condition, the concept of an ideal politician was primed; in the negative condition, participants were to think about a bad politician. The effect of the independent variables on similarity judgments, affective evaluation, and voting intention was tested. Results: In the positive condition, the analysis showed that the negative effect of additional unfavourable features was greater than the positive effect of additional favourable features in judgements about similarity to the ideal candidate. In the negative framing condition, the ANOVA was non-significant, showing that neither the addition of positive features nor additional negative information had a significant impact on similarity to a bad political candidate. To explain this asymmetry, two mediational analyses were conducted that tested the mediating role of affect in the relationship between similarity judgments and voting intention. In both situations the mediating effect was significant, but the comparison of the two models showed that the mediation was stronger for negative framing. Discussion: The research supports the negativity effect and attempts to explain the psychological mechanism behind the positive-negative asymmetry. The results of the mediation analyses point to a stronger mediating role of affect in the relationship between cognitive evaluation and voting intention. Such a result suggests that negative comparisons, leading to the activation of negative features, give rise to stronger emotions than positive features of comparable strength. The findings are in line with the positive-negative asymmetry; however, by adopting Tversky’s framework of features of similarity, the study integrates the cognitive mechanism of the negativity effect delineated in the contrast model of similarity with its emotional component resulting from the asymmetrical effect of positive and negative emotions on decision-making.
Keywords: affect, framing, negativity effect, positive-negative asymmetry, similarity judgements
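For readers who want to see the shape of the analysis, here is a minimal sketch of a simple mediation model with a bootstrapped indirect effect; the data are simulated, and the variable names and effect sizes are illustrative, not the study's:

```python
# Sketch of a similarity -> affect -> voting-intention mediation with a
# percentile-bootstrap estimate of the indirect effect a*b (simulated data).
import numpy as np

rng = np.random.default_rng(0)
n = 200
similarity = rng.normal(size=n)                                   # predictor X
affect = 0.6 * similarity + rng.normal(scale=0.8, size=n)         # mediator M
voting = 0.5 * affect + 0.2 * similarity + rng.normal(scale=0.8, size=n)  # Y

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def slope_partial(x, m, y):
    """Coefficient of m when regressing y on both x and m."""
    X = np.column_stack([np.ones_like(x), x, m])
    return np.linalg.lstsq(X, y, rcond=None)[0][2]

a = slope(similarity, affect)                     # path a: X -> M
b = slope_partial(similarity, affect, voting)     # path b: M -> Y given X
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(slope(similarity[idx], affect[idx]) *
                slope_partial(similarity[idx], affect[idx], voting[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {a*b:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```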
Procedia PDF Downloads 198
8805 Study on the Geometric Similarity in Computational Fluid Dynamics Calculation and the Requirement of Surface Mesh Quality
Authors: Qian Yi Ooi
Abstract:
At present, airfoil parameters are still designed and optimized according to the scale of conventional aircraft, and there are still some slight deviations in terms of scale differences. However, insufficient parameters or poor surface mesh quality is likely to occur if these small deviations are carried over to a future civil aircraft with a size quite different from conventional aircraft, such as a blended-wing-body (BWB) aircraft with future potential, resulting in large deviations in geometric similarity in computational fluid dynamics (CFD) simulations. To avoid this situation, a study of the geometric similarity of airfoil parameters and the quality of the surface mesh in CFD calculation is conducted to assess how different parameterization methods perform at different airfoil scales. The research objects are three airfoil scales, including the wing root and wingtip of a conventional civil aircraft and the wing root of the giant hybrid wing, parameterized with three methods to compare the calculation differences between airfoils of different sizes. In this study, the constants are NACA 0012, a Reynolds number of 10 million, an angle of attack of zero, a C-grid for meshing, and the k-epsilon (k-ε) turbulence model. The experimental variables are three airfoil parameterization methods: the point cloud method, the B-spline curve method, and the class function/shape function transformation (CST) method. The airfoil dimensions are set to 3.98 meters, 17.67 meters, and 48 meters, respectively. In addition, this study also uses different numbers of edge meshing divisions and the same bias factor in the CFD simulation. The results show that, as the airfoil scale changes, different parameterization methods, numbers of control points, and meshing numbers of divisions should be used to improve the accuracy of the aerodynamic performance of the wing. When the airfoil scale increases, the most basic point cloud parameterization method requires more and larger data to support the accuracy of the airfoil’s aerodynamic performance, which faces the severe test of insufficient computer capacity. On the other hand, when using the B-spline curve method, the number of control points and the meshing number of divisions should be set appropriately to obtain higher accuracy; however, the quantitative balance cannot be directly defined, and the decisions must be made iteratively by adding and subtracting. Lastly, when using the CST method, it is found that a limited number of control points is enough to accurately parameterize the larger-sized wing; a higher degree of accuracy and stability can be obtained even with a lower-performance computer.
Keywords: airfoil, computational fluid dynamics, geometric similarity, surface mesh quality
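A minimal sketch of the class/shape function transformation (CST) parameterization mentioned above; the Bernstein weights below are illustrative and are not a fitted NACA 0012:

```python
# Sketch of the CST method: y(x) = C(x) * S(x) + x * y_te, with class function
# C(x) = x^0.5 * (1 - x) for a round-nose, sharp-trailing-edge airfoil and S(x)
# a Bernstein-polynomial sum over a few control weights.
import numpy as np
from math import comb

def cst_surface(weights, x, n1=0.5, n2=1.0, y_te=0.0):
    n = len(weights) - 1
    class_fn = x**n1 * (1.0 - x)**n2
    shape_fn = sum(w * comb(n, i) * x**i * (1.0 - x)**(n - i)
                   for i, w in enumerate(weights))
    return class_fn * shape_fn + x * y_te

x = np.linspace(0.0, 1.0, 101)                     # chordwise stations
upper = cst_surface([0.17, 0.15, 0.16, 0.15], x)   # a few weights suffice
lower = -cst_surface([0.17, 0.15, 0.16, 0.15], x)  # symmetric section
print("max thickness-to-chord ~", round((upper - lower).max(), 3))
```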
Procedia PDF Downloads 222
8804 A Comparison of qCON/qNOX to the Bispectral Index as Indices of Antinociception in Surgical Patients Undergoing General Anesthesia with Laryngeal Mask Airway
Authors: Roya Yumul, Ofelia Loani Elvir-Lazo, Sevan Komshian, Ruby Wang, Jun Tang
Abstract:
BACKGROUND: An objective means for monitoring the anti-nociceptive effects of perioperative medications has long been desired as a way to provide anesthesiologists information regarding a patient’s level of antinociception and preclude any untoward autonomic responses and reflexive muscular movements from painful stimuli intraoperatively. To this end, electroencephalogram (EEG) based tools including BIS and qCON were designed to provide information about the depth of sedation while qNOX was produced to inform on the degree of antinociception. The goal of this study was to compare the reliability of qCON/qNOX to BIS as specific indicators of response to nociceptive stimulation. METHODS: Sixty-two patients undergoing general anesthesia with LMA were included in this study. Institutional Review Board (IRB) approval was obtained, and informed consent was acquired prior to patient enrollment. Inclusion criteria included American Society of Anesthesiologists (ASA) class I-III, 18 to 80 years of age, and either gender. Exclusion criteria included the inability to consent. Withdrawal criteria included conversion to the endotracheal tube and EEG malfunction. BIS and qCON/qNOX electrodes were simultaneously placed on all patients prior to induction of anesthesia and were monitored throughout the case, along with other perioperative data, including patient response to noxious stimuli. All intraoperative decisions were made by the primary anesthesiologist without influence from qCON/qNOX. Student’s t-distribution, prediction probability (PK), and ANOVA were used to statistically compare the relative ability to detect nociceptive stimuli for each index. Twenty patients were included for the preliminary analysis. RESULTS: A comparison of overall intraoperative BIS, qCON and qNOX indices demonstrated no significant difference between the three measures (N=62, p> 0.05). Meanwhile, index values for qNOX (62±18) were significantly higher than those for BIS (46±14) and qCON (54±19) immediately preceding patient responses to nociceptive stimulation in a preliminary analysis (N=20, * p= 0.0408). Notably, certain hemodynamic measurements demonstrated a significant increase in response to painful stimuli (MAP increased from 74 ±13 mm Hg at baseline to 84 ± 18 mm Hg during noxious stimuli [p= 0.032] and HR from 76 ± 12 BPM at baseline to 80 ± 13 BPM during noxious stimuli [p=0.078] respectively). CONCLUSION: In this observational study, BIS and qCON/qNOX provided comparable information on patients’ level of sedation throughout the course of an anesthetic. Meanwhile, increases in qNOX values demonstrated a superior correlation to an imminent response to stimulation relative to all other indices.
Keywords: antinociception, BIS, general anesthesia, LMA, qCON/qNOX
Procedia PDF Downloads 137
8803 A Challenge to Acquire Serious Victims’ Locations during Acute Period of Giant Disasters
Authors: Keiko Shimazu, Yasuhiro Maida, Tetsuya Sugata, Daisuke Tamakoshi, Kenji Makabe, Haruki Suzuki
Abstract:
In this paper, we report how to acquire the locations of serious victims in the acute stage of large-scale disasters, using an emergency information network system designed by us. The background of our concept is the Great East Japan Earthquake that occurred on March 11th, 2011. Through many experiences of national crises caused by earthquakes and tsunamis, we have established advanced communication systems and advanced disaster medical response systems. However, Japan was devastated by huge tsunamis that swept a vast area of Tohoku, causing a complete breakdown of all infrastructure, including telecommunications. Therefore, we noticed that we need interdisciplinary collaboration between the science of disaster medicine, regional administrative sociology, satellite communication technology and systems engineering experts. Communication of emergency information was limited, causing a serious delay in the initial rescue and medical operations. For emergency rescue and medical operations, the most important thing is to identify the number of casualties, their locations and status, and to dispatch doctors and rescue workers from multiple organizations. In the case of the Tohoku earthquake, no dispatching mechanism and/or decision support system existed to allocate the appropriate number of doctors and locate disaster victims. Even though the doctors and rescue workers from multiple government organizations have their own dedicated communication systems, the systems are not interoperable.
Keywords: crisis management, disaster mitigation, messing, MGRS, military grid reference system, satellite communication system
Procedia PDF Downloads 236
8802 Rare-Earth Ions Doped Lithium Niobate Crystals: Luminescence and Raman Spectroscopy
Authors: Ninel Kokanyan, Edvard Kokanyan, Anush Movsesyan, Marc D. Fontana
Abstract:
Lithium niobate (LN) is one of the most widely used ferroelectrics, with a wide range of applications such as phase conjugation, holographic storage, frequency doubling, and SAW sensors. Furthermore, the possibility of doping with rare-earth ions leads to new laser applications. Ho and Tm dopants seem interesting due to the laser emission obtained at around 2 µm. Raman spectroscopy is a powerful spectroscopic technique providing the possibility to obtain a wealth of information about the physicochemical and also optical properties of a given material. Polarized Raman measurements were carried out on Ho- and Tm-doped LN crystals with excitation wavelengths of 532 nm and 785 nm. In the obtained Raman anti-Stokes spectra, we detect the expected modes according to the Raman selection rules. In contrast, the Raman Stokes spectra are significantly different from what is expected by the selection rules: additional forbidden lines are detected. These lines have quite high intensity and are well defined. Moreover, the intensity of the mentioned additional lines increases with an increase of the Ho or Tm concentration in the crystal. These additional lines are attributed to emission lines reflecting the photoluminescence spectra of these crystals. It means that, in our case, we were able to detect, with very good resolution, in the same Stokes spectrum, the transitions between the electronic states as well as the vibrational states. The analysis of these data is reported as a function of Ho and Tm content, for different polarizations and wavelengths of the incident laser beam. The results also provide additional information about the π and σ polarizations of the crystals under study.
Keywords: lithium niobate, Raman spectroscopy, luminescence, rare-earth ions doped lithium niobate
Procedia PDF Downloads 221
8801 Lamb Waves Wireless Communication in Healthy Plates Using Coherent Demodulation
Authors: Rudy Bahouth, Farouk Benmeddour, Emmanuel Moulin, Jamal Assaad
Abstract:
Guided ultrasonic waves are used in non-destructive testing (NDT) and structural health monitoring (SHM) for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in some industrial applications such as nuclear, aerospace and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since they are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use ultrasonic guided waves such as Lamb waves as an information carrier, due to their capability of propagating over long distances. In addition to this, valuable information about the health of the structure can be extracted simultaneously. In this work, the reliable frequency bandwidth for communication is first extracted experimentally from dispersion curves. Then, an experimental platform for wireless communication using Lamb waves is described and built. After this, the coherent demodulation algorithm used in telecommunications is tested for amplitude shift keying, on-off keying and binary phase shift keying modulation techniques. Signal processing parameters such as threshold choice, number of cycles per bit and bit rate are optimized. Experimental results are compared based on the average bit error rate. The results show high sensitivity to threshold selection for the amplitude shift keying and on-off keying techniques, resulting in a bit rate decrease. The binary phase shift keying technique shows the highest stability and data rate among all tested modulation techniques.
Keywords: lamb waves communication, wireless communication, coherent demodulation, bit error rate
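A minimal sketch of coherent BPSK demodulation on a simulated carrier (not real Lamb-wave signals); the carrier frequency, bit length and noise level are illustrative assumptions:

```python
# Sketch: each bit flips the carrier phase (BPSK); the receiver correlates the
# received signal against a synchronized reference carrier over one bit period
# and thresholds the result at zero, then the bit error rate is measured.
import numpy as np

rng = np.random.default_rng(0)
fs, fc, cycles_per_bit = 1_000_000, 50_000, 10    # sample rate, carrier, bit length
samples_per_bit = int(fs / fc * cycles_per_bit)
bits = rng.integers(0, 2, 200)

t = np.arange(samples_per_bit * len(bits)) / fs
carrier = np.sin(2 * np.pi * fc * t)
tx = np.repeat(2 * bits - 1, samples_per_bit) * carrier   # BPSK: phase 0 or pi
rx = tx + rng.normal(scale=1.0, size=tx.size)             # additive channel noise

# Coherent demodulation: multiply by the reference carrier, integrate per bit.
correl = (rx * carrier).reshape(len(bits), samples_per_bit).sum(axis=1)
decoded = (correl > 0).astype(int)
print("bit error rate:", np.mean(decoded != bits))
```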
Procedia PDF Downloads 260
8800 Data Refinement Enhances The Accuracy of Short-Term Traffic Latency Prediction
Authors: Man Fung Ho, Lap So, Jiaqi Zhang, Yuheng Zhao, Huiyang Lu, Tat Shing Choi, K. Y. Michael Wong
Abstract:
Nowadays, a tremendous amount of data is available in the transportation system, enabling the development of various machine learning approaches to make short-term latency predictions. A natural question is then the choice of relevant information to enable accurate predictions. Using traffic data collected from the Taiwan Freeway System, we consider the prediction of short-term latency of a freeway segment with a length of 17 km covering 5 measurement points, each collecting vehicle-by-vehicle data through the electronic toll collection system. The processed data include the past latencies of the freeway segment with different time lags, the traffic conditions of the individual segments (the accumulations, the traffic fluxes, the entrance and exit rates), the total accumulations, and the weekday latency profiles obtained by Gaussian process regression of past data. We arrive at several important conclusions about how data should be refined to obtain accurate predictions, which have implications for future system-wide latency predictions. (1) We find that the prediction of median latency is much more accurate and meaningful than the prediction of average latency, as the latter is plagued by outliers. This is verified by machine-learning prediction using XGBoost that yields a 35% improvement in the mean square error of the 5-minute averaged latencies. (2) We find that the median latency of the segment 15 minutes ago is a very good baseline for performance comparison, and we have evidence that further improvement is achieved by machine learning approaches such as XGBoost and Long Short-Term Memory (LSTM). (3) By analyzing the feature importance score in XGBoost and calculating the mutual information between the inputs and the latencies to be predicted, we identify a sequence of inputs ranked in importance. It confirms that the past latencies are most informative of the predicted latencies, followed by the total accumulation, whereas inputs such as the entrance and exit rates are uninformative. It also confirms that the inputs are much less informative of the average latencies than the median latencies. (4) For predicting the latencies of segments composed of two or three sub-segments, summing up the predicted latencies of each sub-segment is more accurate than the one-step prediction of the whole segment, especially with the latency prediction of the downstream sub-segments trained to anticipate latencies several minutes ahead. The duration of the anticipation time is an increasing function of the traveling time of the upstream segment. The above findings have important implications for predicting the full set of latencies among the various locations in the freeway system.
Keywords: data refinement, machine learning, mutual information, short-term latency prediction
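A minimal sketch of the lag-feature prediction set-up described above; the latency series is simulated, and scikit-learn's GradientBoostingRegressor stands in for the paper's XGBoost model:

```python
# Sketch: lagged latencies as features, a gradient-boosting regressor versus the
# "latency 15 minutes ago" baseline, both scored with mean absolute error.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
t = np.arange(2000)                                   # 5-minute bins
latency = 12 + 6 * np.sin(2 * np.pi * t / 288) + rng.gamma(2.0, 1.0, t.size)

lags = [3, 6, 12]                                     # 15, 30, 60 minutes ago
X = np.column_stack([np.roll(latency, l) for l in lags])[max(lags):]
y = latency[max(lags):]
split = int(0.8 * len(y))

model = GradientBoostingRegressor(loss="absolute_error", random_state=0)
model.fit(X[:split], y[:split])

baseline = X[split:, 0]                               # latency 15 minutes earlier
print("baseline MAE :", mean_absolute_error(y[split:], baseline))
print("boosted MAE  :", mean_absolute_error(y[split:], model.predict(X[split:])))
```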
Procedia PDF Downloads 169
8799 A Hill Cipher Based on the Kish-Sethuraman Protocol
Authors: Kondwani Magamba
Abstract:
In the idealized Kish-Sethuraman (KS) protocol, messages are sent between Alice and Bob, each using a secret personal key. This protocol is said to be perfectly secure because both Bob and Alice keep their keys undisclosed, so that at all times the message is encrypted by at least one key and thus no information is leaked or shared. In this paper, we propose a realization of the KS protocol through the use of the Hill cipher.
Keywords: Kish-Sethuraman protocol, Hill cipher, MDS matrices, encryption
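A minimal sketch of a 2x2 Hill cipher over the 26-letter alphabet, shown as a concrete stand-in for the matrix keys discussed above; the key is a textbook example, and the KS key-layering itself is not modelled here:

```python
# Sketch: encryption is C = K*P mod 26 on letter pairs; decryption applies the
# modular inverse of the key matrix. The key below has det 9, invertible mod 26.
import numpy as np

A = ord('A')
KEY = np.array([[3, 3], [2, 5]])

def inv_key_mod26(key):
    det = int(round(np.linalg.det(key))) % 26
    det_inv = pow(det, -1, 26)                         # modular inverse of det
    adj = np.array([[key[1, 1], -key[0, 1]],
                    [-key[1, 0], key[0, 0]]])
    return (det_inv * adj) % 26

def crypt(text, key):
    nums = [ord(c) - A for c in text]
    if len(nums) % 2:                                  # pad to an even length
        nums.append(ord('X') - A)
    blocks = np.array(nums).reshape(-1, 2).T
    out = (key @ blocks) % 26
    return ''.join(chr(int(v) + A) for v in out.T.flatten())

cipher = crypt("HELPME", KEY)
plain = crypt(cipher, inv_key_mod26(KEY))
print(cipher, "->", plain)                             # round-trips to HELPME
```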
Procedia PDF Downloads 357
8798 A Corporate Social Responsibility Project to Improve the Democratization of Scientific Education in Brazil
Authors: Denise Levy
Abstract:
Nuclear technology is part of our everyday life, and its beneficial applications help to improve the quality of our lives. Nevertheless, in Brazil, the media and social networks most often tend to associate radiation with nuclear weapons and major accidents, and there is still great misunderstanding about the peaceful applications of nuclear science. The educational portal Radioatividades (Radioactivities) is a corporate social responsibility initiative that takes advantage of the growing impact of the Internet to offer high-quality scientific information to teachers and students throughout Brazil. This web-based initiative focuses on the positive applications of nuclear technology, presenting the several contributions of ionizing radiation in different contexts, such as nuclear medicine, agricultural techniques, food safety and electric power generation, showing nuclear technology as part of modern life and a means to improve the quality of our lifestyle. This educational project aims to contribute to the democratization of scientific education and social inclusion, bringing society closer to scientific knowledge, promoting critical thinking and inspiring further reflection. The website offers a wide variety of ludic activities such as curiosities, interactive exercises and short courses. Moreover, teachers are offered free web-based material with full instructions to be used in class. Since 2013, the project has been developed and improved according to a comprehensive study of the realistic scenario of ICT infrastructure in Brazilian schools and in full compliance with the best national and international e-learning recommendations.
Keywords: information and communication technologies, nuclear technology, science communication, society and education
Procedia PDF Downloads 326
8797 Robotic Arm Control with Neural Networks Using Genetic Algorithm Optimization Approach
Authors: Arbnor Pajaziti, Hasan Cana
Abstract:
In this paper, a structural genetic algorithm is used to optimize the neural network that controls the joint movements of a robotic arm. The robotic arm has also been modeled in 3D and simulated in real time in MATLAB. It is found that neural networks provide a simple and effective way to control the robot's tasks. Computer simulation examples are given to illustrate the significance of this method. By combining the genetic algorithm optimization method and neural networks for the given robotic arm with 5 D.O.F., the obtained results show that the base joint movement overshoot time without a controller was about 0.5 seconds, while with the neural network controller (optimized with the genetic algorithm) it was about 0.2 seconds, and a population size of 150 gave the best results.
Keywords: robotic arm, neural network, genetic algorithm, optimization
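A minimal sketch of the general idea on a reduced problem: a genetic algorithm evolves the weights of a small neural network for a planar 2-link arm (a stand-in for the paper's 5-DOF arm and MATLAB simulation); link lengths, GA settings and the fitness function are illustrative:

```python
# Sketch: GA-evolved neural network mapping a target (x, y) to two joint angles;
# fitness is the negative mean end-effector error under forward kinematics.
import numpy as np

rng = np.random.default_rng(0)
L1, L2 = 1.0, 1.0                          # link lengths
H = 8                                      # hidden units
n_w = 2 * H + H + H * 2 + 2                # weights + biases of the small net

def unpack(w):
    i = 0
    W1 = w[i:i + 2 * H].reshape(2, H); i += 2 * H
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H * 2].reshape(H, 2); i += H * 2
    return W1, b1, W2, w[i:i + 2]

def forward(w, xy):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(xy @ W1 + b1)
    return np.pi * np.tanh(h @ W2 + b2)    # joint angles in (-pi, pi)

def fk(theta):
    t1, t2 = theta[:, 0], theta[:, 1]
    return np.column_stack([L1 * np.cos(t1) + L2 * np.cos(t1 + t2),
                            L1 * np.sin(t1) + L2 * np.sin(t1 + t2)])

targets = fk(rng.uniform(-np.pi, np.pi, (200, 2)))     # reachable targets
def fitness(w):
    return -np.mean(np.linalg.norm(fk(forward(w, targets)) - targets, axis=1))

pop = rng.normal(scale=0.5, size=(150, n_w))           # population size 150
for gen in range(200):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-30:]]              # keep the best 20%
    parents = elite[rng.integers(0, 30, (len(pop), 2))]
    mask = rng.random((len(pop), n_w)) < 0.5           # uniform crossover
    pop = np.where(mask, parents[:, 0], parents[:, 1])
    pop += rng.normal(scale=0.05, size=pop.shape)      # mutation
    pop[0] = elite[-1]                                 # elitism
print("best mean end-effector error:", -fitness(pop[0]))
```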
Procedia PDF Downloads 523
8796 Text Analysis to Support Structuring and Modelling a Public Policy Problem-Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics, therefore, is key to finding holistic solutions. Analysis of text-based information on the policy problem, using natural language processing (NLP) and text analysis techniques, can support modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective behind this study is to support the modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for the analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which has so far been done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
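A minimal sketch of the kind of rule-based inference extraction outlined above; the causal connectives and the sample sentences are illustrative, and the paper's full pipeline is richer than this:

```python
# Sketch: match causal connectives in sentences and collect (cause, effect)
# edges that could seed a causal diagram of the policy problem.
import re
from collections import defaultdict

CONNECTIVES = [r"leads to", r"results in", r"causes", r"contributes to"]
PATTERN = re.compile(r"(.+?)\s+(?:" + "|".join(CONNECTIVES) + r")\s+(.+)",
                     re.IGNORECASE)

text = ("Unemployment leads to poverty. Poverty contributes to poor health. "
        "Poor housing causes poor health.")

edges = defaultdict(set)
for sentence in re.split(r"\.\s*", text):
    match = PATTERN.match(sentence.strip())
    if match:
        cause, effect = (s.strip().lower() for s in match.groups())
        edges[cause].add(effect)          # one directed edge per extracted pair

for cause, effects in edges.items():
    print(f"{cause} -> {', '.join(sorted(effects))}")
```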
Procedia PDF Downloads 589
8795 Determining the Collaboration and Challenges of Public Employment Service with Stakeholders, Employers and Job Seekers: In Case of Amhara National Regional State, Ethiopia
Authors: Redie Bezabih Hailu
Abstract:
Unemployment is a problem of nations that needs continuous research. This study aimed to determine the collaborations and challenges of the public employment service (PES), with special emphasis on stakeholders, employers and job seekers. The researcher used a pragmatic philosophy, an exploratory design and an inductive approach to collect data from the respondents using interview and focus group discussion techniques. The PES provides job market information, vocational counseling, and training. As the PES is not fully furnished with manpower, budget and modern technologies, it is providing less than adequate services to employers and job seekers. Matching job seekers with job vacancies is the major challenge for the center, which also relies on a paper-based data management system. There is also a large number of job seekers in spite of a very limited number of vacancies, so service provision is poor due to the fact that there is a low level of vacancies and a high level of job seekers. The center has collaborations with AFE, AYA, BoTVED, BoWCY, and CETU. The major challenges with these collaborations were the absence of operational guidelines to evaluate effectiveness and performance, the lottery method of selecting candidates for vacancies, and nepotism or favoritism, which were challenges for job seekers. On the other hand, the COVID-19 pandemic, the inability to get skilled labor, the absence of standardized payment, job seekers' expectations, and low educational quality and mass graduation were further challenges for employment services. The study recommended quality education and training, operational guidelines for collaboration, and a technology-based labor market information system, and suggested further studies on the quality of the PES.
Keywords: public employment service, collaborations, stakeholders, employers, job seekers
Procedia PDF Downloads 49
8794 The Role of Creative Entrepreneurship in the Development of Croatian Economy
Authors: Marko Kolakovic
Abstract:
Creative industries are an important sector for the growth and development of knowledge economies. They have a positive impact on employment, economic growth, export and the quality of life in the areas where they are developed. Creative sectors include architecture, design, advertising, publishing, music, film, television and radio, video games, visual and performing arts, and heritage. Following the positive development trends of creative industries at the global and European level, this paper analyzes creative industries in general and the specific characteristics of creative entrepreneurship. A special focus in this paper is put on the influence of information and communication technology on the development of new creative business models and the protection of intellectual property rights. One part of the paper is oriented towards the analysis of the status of creative industries and creative entrepreneurship in Croatia. The main objective of the paper is, by using statistical analysis of creative industries in Croatia and information gained during interviews with entrepreneurs, to draw conclusions about the potential and development of creative industries in Croatia. Creative industries in Croatia are at the beginning of their development, and a growth strategy still does not exist at the national level. The statistical analysis showed that in 2015 creative enterprises made up 9% of all enterprises in Croatia, employed 5.5% of employed people, and accounted for 4.01% of GDP. Croatian creative entrepreneurs build competitive advantage using their creative resources and creating specific business models. The main obstacles they meet are a lack of business experience and the impossibility of focusing on creative activities only. In their business, they use digital technologies and are focused on export. The conclusion is that creative industries in Croatia have development potential, but it is necessary to take adequate measures to use this potential in the right way.
Keywords: creative entrepreneurship, knowledge economy, business models, intellectual property
Procedia PDF Downloads 208
8793 Investigating the Feasibility of Promoting Safety in Civil Projects by BIM System Using Fuzzy Logic
Authors: Mohammad Reza Zamanian
Abstract:
The construction industry has always been recognized as one of the most dangerous industries, and the statistics of accidents and injuries resulting from it show that the safety category needs more attention and the arrival of up-to-date technologies in this field. Building information modeling (BIM) is one of the relatively new and applicable technologies in Iran, and the necessity of using it is increasingly evident. The main purposes of this research are to evaluate the feasibility of using this technology in the safety sector of construction projects and to evaluate the effectiveness and operationality of its various applications in this sector. These applications were collected and categorized after reviewing past studies and research; then a questionnaire based on Delphi method criteria was presented to 30 experts who were thoroughly familiar with modeling software and safety guidelines. After receiving the answers and exporting them to SPSS software, the validity and reliability of the questionnaire were assessed to evaluate the measuring tool. Fuzzy logic is a good way to analyze data because of its flexibility in dealing with ambiguity and uncertainty issues, and the implementation of the Delphi method in a fuzzy environment overcomes the uncertainties in decision making. Therefore, this method was used for data analysis, and the results indicate the usefulness and effectiveness of BIM in projects and the improvement of safety status at different stages of construction. Finally, the applications and the sections discussed were ranked in order of priority for efficiency and effectiveness. Safety planning is considered the most influential part of the safety applications of BIM among the four sectors discussed, and planning for the installation of protective fences and barriers to prevent falls, together with site layout planning with a safety approach based on a 3D model, are the most important of the 18 applications identified to improve the safety of construction projects.
Keywords: building information modeling, safety of construction projects, Delphi method, fuzzy logic
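A minimal sketch of a fuzzy-Delphi style aggregation consistent with the method described above; the ratings, the fuzzy scale and the 0.7 acceptance threshold are assumptions, not the study's values:

```python
# Sketch: expert Likert ratings -> triangular fuzzy numbers -> aggregation
# (min, geometric mean, max) -> defuzzified score -> ranking per application.
import numpy as np

# 5-point scale mapped to triangular fuzzy numbers (l, m, u) on [0, 1]
FUZZY_SCALE = {1: (0.0, 0.0, 0.25), 2: (0.0, 0.25, 0.5), 3: (0.25, 0.5, 0.75),
               4: (0.5, 0.75, 1.0), 5: (0.75, 1.0, 1.0)}

rng = np.random.default_rng(0)
applications = ["safety planning", "fall-protection layout", "site layout (3D)"]
ratings = rng.integers(3, 6, size=(30, len(applications)))   # 30 experts

def aggregate(column):
    tfns = np.array([FUZZY_SCALE[int(r)] for r in column])
    l = tfns[:, 0].min()                                  # pessimistic bound
    m = float(np.exp(np.log(tfns[:, 1]).mean()))          # geometric mean of modes
    u = tfns[:, 2].max()                                  # optimistic bound
    return (l + m + u) / 3.0                              # simple defuzzification

scores = {app: aggregate(ratings[:, j]) for j, app in enumerate(applications)}
for app, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{app:25s} score={s:.3f} accepted={s >= 0.7}")
```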
Procedia PDF Downloads 167
8792 The French Ekang Ethnographic Dictionary. The Quantum Approach
Authors: Henda Gnakate Biba, Ndassa Mouafon Issa
Abstract:
Dictionaries modeled on the Western model (designed for tonic-accent languages) are not suitable for tonal languages and do not account for them phonologically, which is why this prosodic and phonological ethnographic dictionary was designed. It is a glossary that expresses the tones and the rhythm of words. It recreates exactly the speaking or singing of a tonal language and allows a non-speaker of this language to pronounce the words as if they were a native. It is a dictionary adapted to tonal languages. It was built from ethnomusicological theorems and phonological processes, following Jean-Jacques Rousseau's 1776 hypothesis that "to say and to sing were once the same thing". Each word in the French dictionary finds its corresponding word in ekaη, and each ekaη word is written on a musical staff. This ethnographic dictionary is also an inventive, original and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, translation automation and artificial intelligence. When you apply this theory to any text of a folk song in a tonal language of the world, you not only piece together the exact melody, rhythm, and harmonies of that song as if you knew it in advance, but also the exact speech of this language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal and random music. The experimentation confirming the theorization produced a semi-digital, semi-analog application which translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, the author uses music reading and writing software that allows the collection of data extracted from his mother tongue, which is already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: you type a structured song (chorus-verse) on your computer and ask the machine for a melody of blues, jazz, world music, variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.
Keywords: music, language, entanglement, science, research
Procedia PDF Downloads 69
8791 Use of Analytic Hierarchy Process for Plant Site Selection
Authors: Muzaffar Shaikh, Shoaib Shaikh, Mark Moyou, Gaby Hawat
Abstract:
This paper presents the use of the Analytic Hierarchy Process (AHP) in evaluating the site selection of a new plant by a corporation. Due to intense competition at a global level, multinational corporations are continuously striving to minimize the production and shipping costs of their products. One key factor that plays a significant role in cost minimization is where the production plant is located. In the U.S., for example, labor and land costs continue to be very high, while they are much lower in countries such as India, China, and Indonesia. This is why many multinational U.S. corporations (e.g., General Electric, Caterpillar Inc., Ford, General Motors) have shifted their manufacturing plants abroad. The continued expansion of the Internet and its availability, along with technological advances in computer hardware and software all around the globe, have made it easier for U.S. corporations to expand abroad as they seek to reduce production costs. In particular, the management of multinational corporations is constantly engaged in evaluating countries at a broad level, or cities within specific countries, where certain or all parts of their end products, or the end products themselves, can be manufactured more cheaply than in the U.S. AHP is based on the preference ratings of a specific decision maker, who can be the Chief Operating Officer of a company or his/her designated data analytics engineer. It serves as a tool to first evaluate the plant site selection criteria and, second, the alternate plant sites themselves against these criteria in a systematic manner. Examples of site selection criteria are: transportation modes, taxes, energy modes, labor force availability, labor rates, raw material availability, political stability, land costs, etc. As a necessary first step under AHP, evaluation criteria and alternate plant site countries are identified. Depending upon the fidelity of the analysis, specific cities within a country can also be chosen as alternative facility locations. AHP experience in this type of analysis indicates that the initial analysis can be performed at the country level. Once a specific country is chosen via AHP, secondary analyses can be performed by selecting specific cities or counties within that country. AHP analysis is usually based on the preference ratings of a decision-maker (e.g., 1 to 5, 1 to 7, or 1 to 9, where 1 means least preferred and the highest value means most preferred). The decision-maker first assigns preference ratings criterion versus criterion, creating a criteria matrix. Next, he/she assigns preference ratings alternative versus alternative against each criterion. Once this data is collected, AHP is applied to first obtain the rank ordering of the criteria. Next, the rank ordering of alternatives is done against each criterion, resulting in an alternative matrix. Finally, the overall rank ordering of alternative facility locations is obtained by matrix multiplication of the alternative matrix and the criteria matrix. The most practical aspect of AHP is the 'what if' analysis that the decision-maker can conduct after the initial results to obtain valuable information on the sensitivity of specific criteria to other criteria and alternatives.
Keywords: analytic hierarchy process, multinational corporations, plant site selection, preference ratings
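A minimal sketch of the AHP computation described above (priority vector from the principal eigenvector, a consistency check, and the final matrix multiplication); all judgement values below are made-up examples:

```python
# Sketch: pairwise criteria judgements -> criteria weights; per-criterion
# alternative priorities -> overall ranking via matrix multiplication.
import numpy as np

criteria = ["labor cost", "transportation", "political stability"]
# Saaty-scale pairwise judgements (row preferred over column)
C = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

def priority_vector(M):
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

w_criteria, lam = priority_vector(C)
ci = (lam - len(C)) / (len(C) - 1)                 # consistency index
print("criteria weights:", np.round(w_criteria, 3), "| CI:", round(ci, 3))

# Alternative priority vectors for each criterion (columns), given directly here
# instead of being derived from their own pairwise matrices.
alternatives = ["Country A", "Country B", "Country C"]
A = np.array([[0.6, 0.3, 0.2],
              [0.3, 0.5, 0.3],
              [0.1, 0.2, 0.5]])

overall = A @ w_criteria                           # overall ranking
for name, score in sorted(zip(alternatives, overall), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")
```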
Procedia PDF Downloads 2888790 Radiation Dose and Associated Exposure Parameters in Selected MDCT Scanners in Multiphase Scan of Abdomen-Pelvic Region: A Clinical Study
Authors: P. Sathyathas, H. M. I. S. W. Herath, T. Amalraj, U. J. M. A. L. Jayasinghe
Abstract:
Over two thirds of medical radiation can now be attributed to Computed Tomography (CT). There is little information on the amount of radiation received from a multiphase CT scan of the abdomen-pelvic region in clinical practice. We sought to estimate the radiation dose and associated exposure parameters in multiphase abdomen-pelvic Multidetector Computed Tomography (MDCT) studies in clinical practice. This was a retrospective cross-sectional study describing the radiation dose associated with the main exposure parameters in diagnostic multiphase abdomen-pelvic scans performed on 152 consecutive patients by two different sixteen-slice CT scanners. Patient information and the exposure parameters CTDI (volume), DLP, kVp, mAs and pitch were recorded for every phase of the abdomen-pelvic study from the dose reports of the MDCT scanners (MDCTs). The age of patients ranged from 14 years to 87 years on both MDCT scanners. The overall CTDI (volume) median was 63.8 (±10.4) mGy for a multiphase abdominal-pelvic scan with scanner A, while it was 35.4 (±15.6) mGy for scanner B. Patients' effective dose for the multiphase abdomen-pelvic CT scan ranged from 8.2 mSv to 58 mSv. The median effective doses for patients who underwent the multiphase abdomen-pelvis scan with scanners A and B were 38.5 (±8.2) mSv and 21.3 (±8.6) mSv respectively. The median values of the exposure parameters mAs, kVp and pitch were 150 (±29.7), 130 (±15.3) and 1.3 (±0.1) respectively on scanner A; on scanner B they were 60 (±14.5), 120 and 1. A significant difference (P<0.05) was observed in the median effective dose between the multiphase abdomen-pelvic scans of the two MDCT scanners. The multiphase abdomen-pelvic scans of this clinical study also show a significant difference in effective dose from the reference level of phantom studies (8-14 mSv), and the dose depends on the vendor.Keywords: abdomen-pelvic region, computed tomography, exposure parameters, radiation dose
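For readers unfamiliar with how effective dose figures of this kind are typically derived, a common approach (not necessarily the authors' exact method) is to multiply the total DLP of all phases by a region-specific conversion coefficient; the coefficient of about 0.015 mSv/(mGy·cm) for an adult abdomen-pelvis examination and the per-phase DLP values below are illustrative assumptions:

```python
# A minimal sketch, assuming the usual E = k * DLP conversion for CT.
K_ABDOMEN_PELVIS = 0.015  # mSv per mGy*cm, typical published adult coefficient

def effective_dose(dlp_per_phase_mgy_cm):
    """Sum the DLP of every phase of a multiphase scan, then convert to mSv."""
    return K_ABDOMEN_PELVIS * sum(dlp_per_phase_mgy_cm)

# Hypothetical three-phase abdomen-pelvic study (DLP in mGy*cm per phase).
print(f"{effective_dose([850, 900, 820]):.1f} mSv")
```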
Procedia PDF Downloads 3278789 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation
Authors: A. Bensaid, T. Mostephaoui, R. Nedjai
Abstract:
A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of increases in the irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, as well as sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes, 197/37, 198/36 and 198/37, for the year 2020. As a second step, we explore the use of geospatial techniques to monitor the progression of sand dunes onto developed (urban) lands as well as the formation of sandy accumulations (dunes, dune fields, nebkhas, barkhanes, etc.). For this purpose, the study made use of a semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth); a simplified sketch of this kind of segmentation step is given below. This study was able to demonstrate that, under current conditions, urban lands are located in sand transit zones mobilized by winds from the northwest and southwest directions.Keywords: land development, GIS, segmentation, remote sensing
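The sketch below is a deliberately simplified stand-in for the semi-automatic segmentation step (Otsu thresholding plus connected-component labelling rather than the authors' dynamic segmentation workflow); the input file name is hypothetical:

```python
# Read one SENTINEL-2 band, separate bright sandy surfaces by Otsu
# thresholding, and label connected sandy patches. Simplified stand-in,
# not the authors' exact workflow.
import numpy as np
import rasterio
from skimage.filters import threshold_otsu
from skimage.measure import label

with rasterio.open("sentinel2_band4_red.tif") as src:  # assumed input file
    band = src.read(1).astype(np.float32)

mask = band > threshold_otsu(band)  # sand tends to appear bright in the red band
patches = label(mask)               # connected sandy accumulations
print("candidate sandy patches:", patches.max())
```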
Procedia PDF Downloads 1558788 Development of a Social Assistive Robot for Elderly Care
Authors: Edwin Foo, Woei Wen, Lui, Meijun Zhao, Shigeru Kuchii, Chin Sai Wong, Chung Sern Goh, Yi Hao He
Abstract:
This paper presents the development of an elderly care and assistive social robot. We named this robot JOS, and he is restricted to table-top operation. JOS is designed to have a maximum volume of 3600 cm3 with its base restricted to 250 mm, and his mission is to provide companionship to, assist and help the elderly. In order for JOS to accomplish his mission, he will be equipped with perception, reaction and cognition capabilities. His appearance will not be human-like but more of a cute and approachable type, and JOS will be designed to be gender-neutral. However, the robot will still have eyes, eyelids and a mouth. His eyes and eyelids will be built entirely with Robotis Dynamixel AX18 motors. To realize this complex task, JOS will also be equipped with a microphone array, a vision camera and an Intel i5 NUC computer, and powered by a 12 V lithium battery that will be self-charging. His face is constructed using 1 motor for each eyelid, 2 motors for the eyeballs, 3 motors for the neck mechanism and 1 motor for the lip movement. The vision sensor is housed on JOS's forehead and the microphone array somewhere below the mouth. For the vision system, Omron's latest OKAO vision sensor is used. It is a compact and versatile sensor that is only 60 mm by 40 mm in size and operates on only a 5 V supply. In addition, the OKAO vision sensor is capable of identifying the user and recognizing the user's expression. With these functions, JOS is able to track and identify the user. If he cannot recognize the user, JOS will ask the user whether he should remember the user. If yes, JOS will store the user information together with the captured face image in a database. This will allow JOS to recognize the user the next time the user is with JOS; a sketch of this identify-or-enroll logic is given below. In addition, JOS is able to interpret the mood of the user through the user's facial expression. This allows the robot to understand the user's mood and behavior and react accordingly. Machine learning will later be incorporated to learn the behavior of the user so as to better understand the user's mood and requirements. For the speech system, the Microsoft speech and grammar engine is used for speech recognition. In order to use the speech engine, we need to build up a speech grammar database that captures the words commonly used by the elderly. This database is built from research journals and literature on elderly speech and also from interviews with the elderly about what they want the robot to assist them with. Using the results from the interviews and the journal research, we are able to derive a set of common words the elderly frequently use to request help. It is from this set that we build up our grammar database. In situations where there is more than one person near JOS, he is able to identify the person who is talking to him through an in-house developed microphone array structure. In order to make the robot more interactive, we have also included the capability for the robot to express his emotions to the user through facial expressions, by changing the position and movement of the eyelids and mouth. All robot emotions will be in response to the user's mood and requests. Lastly, we expect to complete this phase of the project and test it with elderly users and also delirium patients by Feb 2015.Keywords: social robot, vision, elderly care, machine learning
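The identify-or-enroll behaviour can be sketched in a few lines of Python; the face-recognition call is a placeholder stub, and all names and structures are assumptions, not the OKAO API or JOS's actual code:

```python
# Minimal logic sketch of the identify-or-enroll behaviour described above.
users = {}  # face signature -> user name (stands in for the face database)

def recognize_face(face_signature):
    """Placeholder for the vision sensor's face identification result."""
    return users.get(face_signature)

def greet_or_enroll(face_signature, ask_name):
    name = recognize_face(face_signature)
    if name is not None:
        return f"Hello {name}, nice to see you again."
    # Unknown user: ask whether JOS should remember them, then store the face.
    name = ask_name()
    users[face_signature] = name
    return f"Nice to meet you, {name}. I will remember you."

print(greet_or_enroll("sig-001", lambda: "Alice"))  # first meeting: enroll
print(greet_or_enroll("sig-001", lambda: "Alice"))  # second meeting: recognized
```

In the robot, the placeholder would be replaced by the OKAO sensor's identification output and the dictionary by the persistent face database mentioned above.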
Procedia PDF Downloads 4418787 Crossing Multi-Source Climate Data to Estimate the Effects of Climate Change on Evapotranspiration Data: Application to the French Central Region
Authors: Bensaid A., Mostephaoui T., Nedjai R.
Abstract:
Climatic factors are the subject of considerable research, both methodologically and instrumentally. Under the effect of climate change, estimating climate parameters with precision remains one of the main objectives of the scientific community, with a view to assessing climate change and its repercussions on humans and the environment. However, many regions of the world suffer from a severe lack of reliable instruments that could make up for this deficit. Alternatively, the use of empirical methods becomes the only way to assess certain parameters that can act as climate indicators. Several scientific methods are used to evaluate evapotranspiration, either directly at the climatic stations or by empirical methods. All these methods provide point estimates and in no case capture the spatial variation of this parameter. We therefore propose in this paper the use of three sources of information (the Meteo France network of weather stations, world databases, and MODIS satellite images) to evaluate spatial evapotranspiration (ETP) using the Turc method; a sketch of this formula is given below. This first step will reflect the degree of relevance of the indirect (satellite) methods and their generalization to sites without stations. Representing the spatial variation of this parameter in a geographical information system (GIS) accounts for the heterogeneity of its behaviour. This heterogeneity is due to the influence of site morphological factors and will make it possible to appreciate the role of certain topographic and hydrological parameters. A phase of predicting the medium- and long-term evolution of evapotranspiration under the effect of climate change, by applying the Intergovernmental Panel on Climate Change (IPCC) scenarios, gives a realistic overview of the contribution of aquatic systems at the scale of the region.Keywords: climate change, ETP, MODIS, IPCC scenarios
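A minimal sketch of the Turc formula, in the monthly form commonly cited in the hydrological literature (assumed here to be the variant used; the coefficients are the usual published ones, not values from the study):

```python
# Monthly Turc potential evapotranspiration as commonly cited:
# ETP = 0.40 * t/(t+15) * (Rg + 50), in mm/month, with t the mean air
# temperature (deg C) and Rg the global radiation (cal/cm2/day); a
# correction is applied when relative humidity falls below 50%.
def turc_etp_monthly(t_mean_c, rg_cal_cm2_day, rh_percent, coeff=0.40):
    etp = coeff * (t_mean_c / (t_mean_c + 15.0)) * (rg_cal_cm2_day + 50.0)
    if rh_percent < 50.0:
        etp *= 1.0 + (50.0 - rh_percent) / 70.0  # arid-climate humidity correction
    return etp  # mm per month (a coefficient of 0.37 is often used for February)

# Hypothetical monthly inputs for an arid site.
print(f"{turc_etp_monthly(25.0, 500.0, 40.0):.1f} mm/month")
```

Applying such a formula cell by cell to gridded temperature, radiation and humidity layers is what allows the point estimates of the stations to be extended to a spatial ETP map in the GIS.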
Procedia PDF Downloads 100