Search results for: user requirement
620 The Significance of Islamic Concept of Good Faith to Cure Flaws in Public International Law
Authors: M. A. H. Barry
Abstract:
The concept of Good faith (husn al-niyyah) and fair-dealing (Nadl) are the fundamental guiding elements in all contracts and other agreements under Islamic law. The preaching of Al-Quran and Prophet Muhammad’s (Peace Be upon Him) firmly command people to act in good faith in all dealings. There are several Quran verses and the Prophet’s saying which stressed the significance of dealing honestly and fairly in all transactions. Under the English law, the good faith is not considered a fundamental requirement for the formation of a legal contract. However, the concept of Good Faith in private contracts is recognized by the civil law system and in Article 7(1) of the Convention on International Sale of Goods (CISG-Vienna Convention-1980). It took several centuries for the international trading community to recognize the significance of the concept of good faith for the international sale of goods transactions. Nevertheless, the recognition of good faith in Civil law is only confined for the commercial contracts. Subsequently to the CISG, this concept has made inroads into the private international law. There are submissions in favour of applying the good faith concept to public international law based on tacit recognition by the international conventions and International Tribunals. However, under public international law the concept of good faith is not recognized as a source of rights or obligations. This weakens the spirit of the good faith concept, particularly when determining the international disputes. This also creates a fundamental flaw because the absence of good faith application means the breaches tainted by bad faith are tolerated. The objective of this research is to evaluate, examine and analyze the application of the concept of good faith in the modern laws and identify its limitation, in comparison with Islamic concept of good faith. This paper also identifies the problems and issues connected with the non-application of this concept to public international law. This research consists of three key components (1) the preliminary inquiry (2) subject analysis and discovery of research results, and (3) examining the challenging problems, and concluding with proposals. The preliminary inquiry is based on both the primary and secondary sources. The same sources are used for the subject analysis. This research also has both inductive and deductive features. The Islamic concept of good faith covers all situations and circumstances where the bad faith causes unfairness to the affected parties, especially the weak parties. Under the Islamic law, the concept of good faith is a source of rights and obligations as Islam prohibits any person committing wrongful or delinquent acts in any dealing whether in a private or public life. This rule is applicable not only for individuals but also for institutions, states, and international organizations. This paper explains how the unfairness is caused by non-recognition of the good faith concept as a source of rights or obligations under public international law and provides legal and non-legal reasons to show why the Islamic formulation is important.Keywords: good faith, the civil law system, the Islamic concept, public international law
Procedia PDF Downloads 147
619 A Protein-Wave Alignment Tool for Frequency Related Homologies Identification in Polypeptide Sequences
Authors: Victor Prevost, Solene Landerneau, Michel Duhamel, Joel Sternheimer, Olivier Gallet, Pedro Ferrandiz, Marwa Mokni
Abstract:
The search for homologous proteins is one of the ongoing challenges in biology and bioinformatics. Traditionally, a pair of proteins is considered homologous when they originate from the same ancestral protein. In such a case, their sequences share similarities, and considerable scientific research effort is spent investigating this question. On this basis, we propose the Protein-Wave Alignment Tool ("P-WAT") developed within the framework of the France Relance 2030 plan. Our work takes into consideration the mass-related wave aspect of protein biosynthesis by associating a specific frequency with each amino acid according to its mass. Amino acids are then regrouped within their mass category. This way, our algorithm produces specific alignments in addition to those obtained with a common amino acid coding system. For this purpose, we developed the original "P-WAT" algorithm, able to address large protein databases with different attributes such as species, protein names, etc., allowing us to align user requests with a set of specific protein sequences. The primary intent of this algorithm is to achieve efficient alignments, in this specific conceptual frame, by minimizing execution costs and information loss. Our algorithm identifies sequence similarities by searching for matches of sub-sequences of different sizes, referred to as primers. It relies on Boolean operations on a dot plot matrix to identify primer amino acids common to both proteins which are likely to be part of a significant alignment of peptides. From those primers, dynamic programming-like traceback operations generate alignments and alignment scores based on an adjusted PAM250 matrix.
Keywords: protein, alignment, homologous, Genodic
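As an illustration of the primer-finding step described in this abstract, the following is a minimal Python sketch of a Boolean dot-plot comparison in which amino acids are first binned into coarse mass categories and diagonal runs of matches are reported as candidate primers. The mass bins, the minimum primer length, and the helper names are illustrative assumptions, not the actual P-WAT implementation.

```python
# Minimal sketch of dot-plot primer detection on mass-binned sequences.
# The mass bins and minimum primer length below are illustrative assumptions,
# not the parameters of the actual P-WAT algorithm.

# Approximate monoisotopic residue masses (Da).
RESIDUE_MASS = {
    "G": 57.02, "A": 71.04, "S": 87.03, "P": 97.05, "V": 99.07,
    "T": 101.05, "C": 103.01, "L": 113.08, "I": 113.08, "N": 114.04,
    "D": 115.03, "Q": 128.06, "K": 128.09, "E": 129.04, "M": 131.04,
    "H": 137.06, "F": 147.07, "R": 156.10, "Y": 163.06, "W": 186.08,
}

def mass_bin(residue: str, bin_width: float = 15.0) -> int:
    """Group residues into coarse mass categories (assumed bin width)."""
    return int(RESIDUE_MASS[residue] // bin_width)

def find_primers(seq_a: str, seq_b: str, min_len: int = 4):
    """Return (start_a, start_b, length) of diagonal runs where the
    mass categories of both sequences match (candidate primers)."""
    a = [mass_bin(r) for r in seq_a]
    b = [mass_bin(r) for r in seq_b]
    # Boolean dot-plot matrix: True where mass categories coincide.
    dot = [[a_i == b_j for b_j in b] for a_i in a]
    primers = []
    for i in range(len(a)):
        for j in range(len(b)):
            # Only start a run at the beginning of a diagonal.
            if dot[i][j] and (i == 0 or j == 0 or not dot[i - 1][j - 1]):
                length = 0
                while (i + length < len(a) and j + length < len(b)
                       and dot[i + length][j + length]):
                    length += 1
                if length >= min_len:
                    primers.append((i, j, length))
    return primers

if __name__ == "__main__":
    print(find_primers("GAVLKDEG", "TAVLKDES"))
```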
Procedia PDF Downloads 113
618 Assessment of Water Reuse Potential in a Metal Finishing Factory
Authors: Efe Gumuslu, Guclu Insel, Gülten Yuksek, Nilay Sayi Ucar, Emine Ubay Cokgor, Tuğba Olmez Hanci, Didem Okutman Tas, Fatoş Germirli Babuna, Derya Firat Ertem, Ökmen Yildirim, Özge Erturan, Betül Kirci
Abstract:
Although water reclamation and reuse are inseparable parts of sustainable production concept all around the world, current levels of reuse constitute only a small fraction of the total volume of industrial effluents. Nowadays, within the perspective of serious climate change, wastewater reclamation and reuse practices should be considered as a requirement. Industrial sector is one of the largest users of water sources. The OECD Environmental Outlook to 2050 predicts that global water demand for manufacturing will increase by 400% from 2000 to 2050 which is much larger than any other sector. Metal finishing industry is one of the industries that requires high amount of water during the manufacturing. Therefore, actions regarding the improvement of wastewater treatment and reuse should be undertaken on both economic and environmental sustainability grounds. Process wastewater can be reused for more purposes if the appropriate treatment systems are installed to treat the wastewater to the required quality level. Recent studies showed that membrane separation techniques may help in solving the problem of attaining a suitable quality of water that allows being recycled back to the process. The metal finishing factory where this study is conducted is one of the biggest white-goods manufacturers in Turkey. The sheet metal parts used in the cookers production have to be exposed to surface pre-treatment processes composed of degreasing, rinsing, nanoceramics coating and deionization rinsing processes, consecutively. The wastewater generating processes in the factory are enamel coating, painting and styrofoam processes. In the factory, the main source of water is the well water. While some part of the well water is directly used in the processes after passing through resin treatment, some portion of it is directed to the reverse osmosis treatment to obtain required water quality for enamel coating and painting processes. In addition to these processes another important source of water that can be considered as a potential water source is rainwater (3660 tons/year). In this study, process profiles as well as pollution profiles were assessed by a detailed quantitative and qualitative characterization of the wastewater sources generated in the factory. Based on the preliminary results the main water sources that can be considered for reuse in the processes were determined as painting and styrofoam processes.Keywords: enamel coating, painting, reuse, wastewater
Procedia PDF Downloads 379
617 Unlocking the Future of Grocery Shopping: Graph Neural Network-Based Cold Start Item Recommendations with Reverse Next Item Period Recommendation (RNPR)
Authors: Tesfaye Fenta Boka, Niu Zhendong
Abstract:
Recommender systems play a crucial role in connecting individuals with the items they require, as is particularly evident in the rapid growth of online grocery shopping platforms. These systems predominantly rely on user-centered recommendations, where items are suggested based on individual preferences, garnering considerable attention and adoption. However, our focus lies on the item-centered recommendation task within the grocery shopping context. In the reverse next item period recommendation (RNPR) task, we are presented with a specific item and challenged to identify potential users who are likely to consume it in the upcoming period. Despite the ever-expanding inventory of products on online grocery platforms, the cold start item problem persists, posing a substantial hurdle in delivering personalized and accurate recommendations for new or niche grocery items. To address this challenge, we propose a Graph Neural Network (GNN)-based approach. By capitalizing on the inherent relationships among grocery items and leveraging users' historical interactions, our model aims to provide reliable and context-aware recommendations for cold-start items. This integration of GNN technology holds the promise of enhancing recommendation accuracy and catering to users' individual preferences. This research contributes to the advancement of personalized recommendations in the online grocery shopping domain. By harnessing the potential of GNNs and exploring item-centered recommendation strategies, we aim to improve the overall shopping experience and satisfaction of users on these platforms.Keywords: recommender systems, cold start item recommendations, online grocery shopping platforms, graph neural networks
Procedia PDF Downloads 88
616 Development of a Novel Clinical Screening Tool, Using the BSGE Pain Questionnaire, Clinical Examination and Ultrasound to Predict the Severity of Endometriosis Prior to Laparoscopic Surgery
Authors: Marlin Mubarak
Abstract:
Background: Endometriosis is a complex disabling disease affecting young females in the reproductive period mainly. The aim of this project is to generate a diagnostic model to predict severity and stage of endometriosis prior to Laparoscopic surgery. This will help to improve the pre-operative diagnostic accuracy of stage 3 & 4 endometriosis and as a result, refer relevant women to a specialist centre for complex Laparoscopic surgery. The model is based on the British Society of Gynaecological Endoscopy (BSGE) pain questionnaire, clinical examination and ultrasound scan. Design: This is a prospective, observational, study, in which women completed the BSGE pain questionnaire, a BSGE requirement. Also, as part of the routine preoperative assessment patient had a routine ultrasound scan and when recto-vaginal and deep infiltrating endometriosis was suspected an MRI was performed. Setting: Luton & Dunstable University Hospital. Patients: Symptomatic women (n = 56) scheduled for laparoscopy due to pelvic pain. The age ranged between 17 – 52 years of age (mean 33.8 years, SD 8.7 years). Interventions: None outside the recognised and established endometriosis centre protocol set up by BSGE. Main Outcome Measure(s): Sensitivity and specificity of endometriosis diagnosis predicted by symptoms based on BSGE pain questionnaire, clinical examinations and imaging. Findings: The prevalence of diagnosed endometriosis was calculated to be 76.8% and the prevalence of advanced stage was 55.4%. Deep infiltrating endometriosis in various locations was diagnosed in 32/56 women (57.1%) and some had DIE involving several locations. Logistic regression analysis was performed on 36 clinical variables to create a simple clinical prediction model. After creating the scoring system using variables with P < 0.05, the model was applied to the whole dataset. The sensitivity was 83.87% and specificity 96%. The positive likelihood ratio was 20.97 and the negative likelihood ratio was 0.17, indicating that the model has a good predictive value and could be useful in predicting advanced stage endometriosis. Conclusions: This is a hypothesis-generating project with one operator, but future proposed research would provide validation of the model and establish its usefulness in the general setting. Predictive tools based on such model could help organise the appropriate investigation in clinical practice, reduce risks associated with surgery and improve outcome. It could be of value for future research to standardise the assessment of women presenting with pelvic pain. The model needs further testing in a general setting to assess if the initial results are reproducible.Keywords: deep endometriosis, endometriosis, minimally invasive, MRI, ultrasound.
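The sensitivity, specificity, and likelihood ratios quoted above are related by standard formulas; the short sketch below reproduces the reported figures from them. It is a worked check of the arithmetic only, not the study's scoring model, and the function name is illustrative.

```python
# Worked check of the diagnostic metrics reported for the prediction model.
# Only the arithmetic is reproduced here; the scoring model itself is not.

def likelihood_ratios(sensitivity: float, specificity: float):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    lr_positive = sensitivity / (1.0 - specificity)
    lr_negative = (1.0 - sensitivity) / specificity
    return lr_positive, lr_negative

sens, spec = 0.8387, 0.96          # reported sensitivity and specificity
lr_pos, lr_neg = likelihood_ratios(sens, spec)

print(f"LR+ = {lr_pos:.2f}")       # ~20.97, matching the reported value
print(f"LR- = {lr_neg:.2f}")       # ~0.17, matching the reported value
```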
Procedia PDF Downloads 353
615 Flow-Induced Vibration Marine Current Energy Harvesting Using a Symmetrical Balanced Pair of Pivoted Cylinders
Authors: Brad Stappenbelt
Abstract:
The phenomenon of vortex-induced vibration (VIV) for elastically restrained cylindrical structures in cross-flows is relatively well investigated. The utility of this mechanism in harvesting energy from marine current and tidal flows is however arguably still in its infancy. With relatively few moving components, a flow-induced vibration-based energy conversion device augers low complexity compared to the commonly employed turbine design. Despite the interest in this concept, a practical device has yet to emerge. It is desirable for optimal system performance to design for a very low mass or mass moment of inertia ratio. The device operating range, in particular, is maximized below the vortex-induced vibration critical point where an infinite resonant response region is realized. An unfortunate consequence of this requirement is large buoyancy forces that need to be mitigated by gravity-based, suction-caisson or anchor mooring systems. The focus of this paper is the testing of a novel VIV marine current energy harvesting configuration that utilizes a symmetrical and balanced pair of horizontal pivoted cylinders. The results of several years of experimental investigation, utilizing the University of Wollongong fluid mechanics laboratory towing tank, are analyzed and presented. A reduced velocity test range of 0 to 60 was covered across a large array of device configurations. In particular, power take-off damping ratios spanning from 0.044 to critical damping were examined in order to determine the optimal conditions and hence the maximum device energy conversion efficiency. The experiments conducted revealed acceptable energy conversion efficiencies of around 16% and desirable low flow-speed operating ranges when compared to traditional turbine technology. The potentially out-of-phase spanwise VIV cells on each arm of the device synchronized naturally as no decrease in amplitude response and comparable energy conversion efficiencies to the single cylinder arrangement were observed. In addition to the spatial design benefits related to the horizontal device orientation, the main advantage demonstrated by the current symmetrical horizontal configuration is to allow large velocity range resonant response conditions without the excessive buoyancy. The novel configuration proposed shows clear promise in overcoming many of the practical implementation issues related to flow-induced vibration marine current energy harvesting.Keywords: flow-induced vibration, vortex-induced vibration, energy harvesting, tidal energy
Procedia PDF Downloads 146
614 A Cloud-Based Mobile Auditing Tools for Muslim-Friendly Hospitality Services
Authors: Mohd Iskandar Illyas Tan, Zuhra Junaida Mohamad Husny, Farawahida Mohd Yusof
Abstract:
The potential of Muslim-friendly hospitality services brings huge opportunities to operators (hoteliers, tourist guides, and travel agents), especially in Muslim countries. To provide guidelines that facilitate operations among these operators, standards and manuals have been developed by the authorities. Among the challenges are the applicability and complexity of the standard when adopted in the real world. Mobile digital technology can be implemented to overcome those challenges. A prototype has been developed to help operators and authorities assess their readiness in complying with MS2610:2015. This study analyzes the mobile digital technology characteristics that are suitable for users conducting a shariah-compliant hospitality audit. A focus group study was conducted in the state of Penang, Malaysia, involving operators (hoteliers, tourist guides, and travel agents) as well as agencies (Islamic Tourism Center, Penang Islamic Affairs Department, Malaysian Standard) directly involved in implementing the certification. Both groups were given three weeks to test the mobile application and provide feedback on its usability in auditing their readiness for the Muslim-friendly hospitality services standard developed by the Malaysian Standard. The feedback was analyzed, and the overall results show that three criteria (ease of use, completeness, and speed of completion) received the highest responses from both groups for the mobile application. This study provides evidence that mobile application development has huge potential for adoption by Muslim-friendly hospitality service operators and agencies.
Keywords: hospitality, innovation, audit, compliance, mobile application
Procedia PDF Downloads 132
613 Production of Bricks Using Mill Waste and Tyre Crumbs at a Low Temperature by Alkali-Activation
Authors: Zipeng Zhang, Yat C. Wong, Arul Arulrajah
Abstract:
Since automobiles became widely popular around the early 20th century, end-of-life tyres have been one of the major types of waste humans encounter. Every minute, there are considerable quantities of tyres being disposed of around the world. Most end-of-life tyres are simply landfilled or simply stockpiled, other than recycling. To address the potential issues caused by tyre waste, incorporating it into construction materials can be a possibility. This research investigated the viability of manufacturing bricks using mill waste and tyre crumb by alkali-activation at a relatively low temperature. The mill waste was extracted from a brick factory located in Melbourne, Australia, and the tyre crumbs were supplied by a local recycling company. As the main precursor, the mill waste was activated by the alkaline solution, which was comprised of sodium hydroxide (8m) and sodium silicate (liquid). The introduction ratio of alkaline solution (relative to the solid weight) and the weight ratio between sodium hydroxide and sodium silicate was fixed at 20 wt.% and 1:1, respectively. The tyre crumb was introduced to substitute part of the mill waste at four ratios by weight, namely 0, 5, 10 and 15%. The mixture of mill waste and tyre crumbs were firstly dry-mixed for 2 min to ensure the homogeneity, followed by a 2.5-min wet mixing after adding the solution. The ready mixture subsequently was press-moulded into blocks with the size of 109 mm in length, 112.5 mm in width and 76 mm in height. The blocks were cured at 50°C with 95% relative humidity for 2 days, followed by a 110°C oven-curing for 1 day. All the samples were then placed under the ambient environment until the age of 7 and 28 days for testing. A series of tests were conducted to evaluate the linear shrinkage, compressive strength and water absorption of the samples. In addition, the microstructure of the samples was examined via the scanning electron microscope (SEM) test. The results showed the highest compressive strength was 17.6 MPa, found in the 28-day-old group using 5 wt.% tyre crumbs. Such strength has been able to satisfy the requirement of ASTM C67. However, the increasing addition of tyre crumb weakened the compressive strength of samples. Apart from the strength, the linear shrinkage and water absorption of all the groups can meet the requirements of the standard. It is worth noting that the use of tyre crumbs tended to decrease the shrinkage and even caused expansion when the tyre content was over 15 wt.%. The research also found that there was a significant reduction in compressive strength for the samples after water absorption tests. In conclusion, the tyre crumbs have the potential to be used as a filler material in brick manufacturing, but more research needs to be done to tackle the durability problem in the future.Keywords: bricks, mill waste, tyre crumbs, waste recycling
Procedia PDF Downloads 122
612 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks
Authors: Zeyad Abdelmageid, Xianbin Wang
Abstract:
Choosing the operational channel for a WLAN access point (AP) in WLAN networks has been a static channel assignment process initiated by the user during the deployment process of the AP, which fails to cope with the dynamic conditions of the assigned channel at the station side afterward. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed, and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms which consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation due to the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind the cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel while causing high overhead, the AP divides STAs into clusters then assigns each STA in each cluster one channel to report feedback on. With the proper design of the cluster based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal performance and, at times, better performance with a fraction of the overhead. We believe that this algorithm has great potential in designing future dynamic channel selection algorithms with low overhead.Keywords: channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead
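To make the cluster-based reporting idea concrete, the sketch below clusters station coordinates with DBSCAN (which also appears in the keywords) and assigns each station in a cluster a different candidate channel to report on, so that each cluster collectively covers the whole channel list. The coordinates, DBSCAN parameters, and channel list are illustrative assumptions rather than values from the paper.

```python
# Illustrative sketch of cluster-based station reporting:
# nearby stations are grouped with DBSCAN, and the candidate channels are
# spread across the members of each cluster instead of being reported by all.
# Coordinates, eps/min_samples and the channel list are assumed values.
import numpy as np
from sklearn.cluster import DBSCAN

station_xy = np.array([
    [0.0, 0.0], [1.5, 0.5], [0.5, 1.0],        # one group of nearby stations
    [20.0, 21.0], [21.0, 20.5], [19.5, 22.0],  # a second group
    [50.0, 5.0],                               # an isolated station
])
candidate_channels = [1, 6, 11]

labels = DBSCAN(eps=3.0, min_samples=2).fit_predict(station_xy)

# Within each cluster, hand out channels round-robin; noise points (-1)
# fall back to reporting on every candidate channel.
assignments = {}
counters = {}
for idx, label in enumerate(labels):
    if label == -1:
        assignments[idx] = list(candidate_channels)
    else:
        k = counters.get(label, 0)
        assignments[idx] = [candidate_channels[k % len(candidate_channels)]]
        counters[label] = k + 1

for idx, chans in assignments.items():
    print(f"station {idx} (cluster {labels[idx]}): report on channels {chans}")
```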
Procedia PDF Downloads 118
611 Five Years Analysis and Mitigation Plans on Adjustment Orders Impacts on Projects in Kuwait's Oil and Gas Sector
Authors: Rawan K. Al-Duaij, Salem A. Al-Salem
Abstract:
Projects, the unique and temporary process of achieving a set of requirements have always been challenging; Planning the schedule and budget, managing the resources and risks are mostly driven by a similar past experience or the technical consultations of experts in the matter. With that complexity of Projects in Scope, Time, and execution environment, Adjustment Orders are tools to reflect changes to the original project parameters after Contract signature. Adjustment Orders are the official/legal amendments to the terms and conditions of a live Contract. Reasons for issuing Adjustment Orders arise from changes in Contract scope, technical requirement and specification resulting in scope addition, deletion, or alteration. It can be as well a combination of most of these parameters resulting in an increase or decrease in time and/or cost. Most business leaders (handling projects in the interest of the owner) refrain from using Adjustment Orders considering their main objectives of staying within budget and on schedule. Success in managing the changes results in uninterrupted execution and agreed project costs as well as schedule. Nevertheless, this is not always practically achievable. In this paper, a detailed study through utilizing Industrial Engineering & Systems Management tools such as Six Sigma, Data Analysis, and Quality Control were implemented on the organization’s five years records of the issued Adjustment Orders in order to investigate their prevalence, and time and cost impact. The analysis outcome revealed and helped to identify and categorize the predominant causations with the highest impacts, which were considered most in recommending the corrective measures to reach the objective of minimizing the Adjustment Orders impacts. Data analysis demonstrated no specific trend in the AO frequency in past five years; however, time impact is more than the cost impact. Although Adjustment Orders might never be avoidable; this analysis offers’ some insight to the procedural gaps, and where it is highly impacting the organization. Possible solutions are concluded such as improving project handling team’s coordination and communication, utilizing a blanket service contract, and modifying the projects gate system procedures to minimize the possibility of having similar struggles in future. Projects in the Oil and Gas sector are always evolving and demand a certain amount of flexibility to sustain the goals of the field. As it will be demonstrated, the uncertainty of project parameters, in adequate project definition, operational constraints and stringent procedures are main factors resulting in the need for Adjustment Orders and accordingly the recommendation will be to address that challenge.Keywords: adjustment orders, data analysis, oil and gas sector, systems management
Procedia PDF Downloads 163
610 An Investigation on Opportunities and Obstacles on Implementation of Building Information Modelling for Pre-fabrication in Small and Medium Sized Construction Companies in Germany: A Practical Approach
Authors: Nijanthan Mohan, Rolf Gross, Fabian Theis
Abstract:
The conventional method used in the construction industries often resulted in significant rework since most of the decisions were taken onsite under the pressure of project deadlines and also due to the improper information flow, which results in ineffective coordination. However, today’s architecture, engineering, and construction (AEC) stakeholders demand faster and accurate deliverables, efficient buildings, and smart processes, which turns out to be a tall order. Hence, the building information modelling (BIM) concept was developed as a solution to fulfill the above-mentioned necessities. Even though BIM is successfully implemented in most of the world, it is still in the early stages in Germany, since the stakeholders are sceptical of its reliability and efficiency. Due to the huge capital requirement, the small and medium-sized construction companies are still reluctant to implement BIM workflow in their projects. The purpose of this paper is to analyse the opportunities and obstacles to implementing BIM for prefabrication. Among all other advantages of BIM, pre-fabrication is chosen for this paper because it plays a vital role in creating an impact on time as well as cost factors of a construction project. The positive impact of prefabrication can be explicitly observed by the project stakeholders and participants, which enables the breakthrough of the skepticism factor among the small scale construction companies. The analysis consists of the development of a process workflow for implementing prefabrication in building construction, followed by a practical approach, which was executed with two case studies. The first case study represents on-site prefabrication, and the second was done for off-site prefabrication. It was planned in such a way that the first case study gives a first-hand experience for the workers at the site on the BIM model so that they can make much use of the created BIM model, which is a better representation compared to the traditional 2D plan. The main aim of the first case study is to create a belief in the implementation of BIM models, which was succeeded by the execution of offshore prefabrication in the second case study. Based on the case studies, the cost and time analysis was made, and it is inferred that the implementation of BIM for prefabrication can reduce construction time, ensures minimal or no wastes, better accuracy, less problem-solving at the construction site. It is also observed that this process requires more planning time, better communication, and coordination between different disciplines such as mechanical, electrical, plumbing, architecture, etc., which was the major obstacle for successful implementation. This paper was carried out in the perspective of small and medium-sized mechanical contracting companies for the private building sector in Germany.Keywords: building information modelling, construction wastes, pre-fabrication, small and medium sized company
Procedia PDF Downloads 113
609 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform
Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee
Abstract:
This research presents a multi-modal simulation in the reconstruction of the past and the construction of present in digital cultural heritage on mobile platform. In bringing the present life, the virtual environment is generated through a presented scheme for rapid and efficient construction of 360° panoramic view. Then, acoustical heritage model and crowd model are presented and improvised into the 360° panoramic view. For the reconstruction of past life, the crowd is simulated and rendered in an old trading port. However, the keystone of this research is in a virtual walkthrough that shows the virtual present life in 2D and virtual past life in 3D, both in an environment of virtual heritage sites in George Town through mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on mobile platform. The 2D crowd is used to portray the present life in 360° panoramic view of a virtual heritage environment based on the extension of Newtonian Laws. Secondly, the 2D crowd is animated and rendered into 3D with improved variety and incorporated into the virtual past life using Unity3D Game Engine. The behaviours of the 3D models are then simulated based on the enhancement of the classical model of Boid algorithm. Finally, a demonstration system is developed and integrated with the models, techniques and algorithms of this research. The virtual walkthrough is demonstrated to a group of respondents and is evaluated through the user-centred evaluation by navigating around the demonstration system. The results of the evaluation based on the questionnaires have shown that the presented virtual walkthrough has been successfully deployed through a multi-modal simulation and such a virtual walkthrough would be particularly useful in a virtual tour and virtual museum applications.Keywords: Boid Algorithm, Crowd Simulation, Mobile Platform, Newtonian Laws, Virtual Heritage
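For reference, the sketch below shows the classical Boid update (cohesion, separation, and alignment) that the crowd behaviour described above builds on. The weights, neighbourhood radius, and 2D setting are illustrative assumptions; the paper's enhanced model is not reproduced here.

```python
# Minimal 2D Boid update: cohesion, separation and alignment.
# Weights and neighbourhood radius are assumed values, not the paper's.
import numpy as np

def boid_step(positions, velocities, radius=2.0, dt=0.1,
              w_cohesion=0.01, w_separation=0.05, w_alignment=0.05):
    new_velocities = velocities.copy()
    for i, (p, v) in enumerate(zip(positions, velocities)):
        offsets = positions - p
        dists = np.linalg.norm(offsets, axis=1)
        neighbours = (dists > 0) & (dists < radius)
        if not neighbours.any():
            continue
        # Cohesion: steer towards the local centre of mass.
        cohesion = positions[neighbours].mean(axis=0) - p
        # Separation: steer away from close neighbours.
        separation = -offsets[neighbours].sum(axis=0)
        # Alignment: match the average heading of neighbours.
        alignment = velocities[neighbours].mean(axis=0) - v
        new_velocities[i] = v + (w_cohesion * cohesion
                                 + w_separation * separation
                                 + w_alignment * alignment)
    return positions + new_velocities * dt, new_velocities

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 5, size=(20, 2))
    vel = rng.uniform(-1, 1, size=(20, 2))
    for _ in range(100):
        pos, vel = boid_step(pos, vel)
    print(pos.mean(axis=0))
```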
Procedia PDF Downloads 277
608 Usability Evaluation of a Self-Report Mobile App for COVID-19 Symptoms: Supporting Health Monitoring in the Work Context
Authors: Kevin Montanez, Patricia Garcia
Abstract:
The confinement and restrictions adopted to avoid an exponential spread of the COVID-19 have negatively impacted the Peruvian economy. In this context, Industries offering essential products could continue operating, but they have to follow safety protocols and implement strategies to ensure employee health. In view of the increasing internet access and mobile phone ownership, “Alerta Temprana”, a mobile app, was developed to self-report COVID-19 symptoms in the work context. In this study, the usability of the mobile app “Alerta Temprana” was evaluated from the perspective of health monitors and workers. In addition to reporting the metrics related to the usability of the application, the utility of the system is also evaluated from the monitors' perspective. In this descriptive study, the participants used the mobile app for two months. Afterwards, System Usability Scale (SUS) questionnaire was answered by the workers and monitors. A Usefulness questionnaire with open questions was also used for the monitors. The data related to the use of the application was collected during one month. Furthermore, descriptive statistics and bivariate analysis were used. The workers rated the application as good (70.39). In the case of the monitors, usability was excellent (83.0). The most important feature for the monitors were the emails generated by the application. The average interaction per user was 30 seconds and a total of 6172 self-reports were sent. Finally, a statistically significant association was found between the acceptability scale and the work area. The results of this study suggest that Alerta Temprana has the potential to be used for surveillance and health monitoring in any context of face-to-face modality. Participants reported a high degree of ease of use. However, from the perspective of workers, SUS cannot diagnose usability issues and we suggest we use another standard usability questionnaire to improve "Alerta Temprana" for future use.Keywords: public health in informatics, mobile app, usability, self-report
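The SUS figures reported above (70.39 for workers, 83.0 for monitors) come from the standard System Usability Scale scoring rule, which the short sketch below implements for a single respondent; the example response values are hypothetical.

```python
# Standard SUS scoring for one respondent: odd items contribute (score - 1),
# even items contribute (5 - score), and the sum is scaled by 2.5 to 0-100.
# The example responses below are hypothetical.

def sus_score(responses):
    """responses: ten 1-5 Likert answers to the SUS items, in order."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0 for this hypothetical respondent
```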
Procedia PDF Downloads 117
607 Investigation of the Kutta Condition Using Unsteady Flow
Authors: K. Bhojnadh, M. Fiddler, D. Cheshire
Abstract:
An investigation into the Kutta effect on the trailing edge of a subsonic aerofoil was conducted which led to an analysis using Ansys Fluent to determine the effect of flow separation over a NACA 0012 aerofoil. This aerofoil was subjected to oscillations to create an unsteady flow over the aerofoil, therefore, creating turbulence, with unsteady aerodynamics playing a key role to determine the flow regimes when the aerofoil is subjected to different angles of attack along with varying Reynolds numbers. Many theories were evolved to determine the flow parameters of a 2-D aerofoil in these unsteady conditions because they behave unpredictably at the trailing edge when subjected to a different angle of attack. The shear area observed in the boundary layer at the trailing edge tends towards an unsteady turbulent flow even at small angles of attack, creating drag as the flow separates, reducing the aerodynamic performance of aerofoil. In this paper, research was conducted to determine the effect of Kutta circulation over the aerofoil and the effect of that circulation in reducing the effect of pressure and boundary layer distribution over the aerofoil. The effect of circulation is observed by using Ansys Fluent by using varying flow parameters and differential schemes to observe the flow behaviour on the aerofoil. Initially, steady flow analysis was conducted on the aerofoil to determine the effect of circulation, and it was noticed that the effect of circulation could only be properly observed when the aerofoil is subjected to oscillations. Therefore, that was modelled by using Ansys user-defined functions, which define the motion of the aerofoil by creating a dynamic mesh on the aerofoil. Initial results were observed, and further development of the dynamic mesh functions in Ansys is taking place. This research will determine the overall basic principles of unsteady flow aerodynamics applied to the investigation of Kutta related circulation, and gives an indication regarding the generation of vortices which is discussed further in this paper.Keywords: circulation, flow seperation, turbulence modelling, vortices
Procedia PDF Downloads 205
606 Integration of Thermal Energy Storage and Electric Heating with Combined Heat and Power Plants
Authors: Erich Ryan, Benjamin McDaniel, Dragoljub Kosanovic
Abstract:
Combined heat and power (CHP) plants are an efficient technology for meeting the heating and electric needs of large campus energy systems, but have come under greater scrutiny as the world pushes for emissions reductions and lower consumption of fossil fuels. The electrification of heating and cooling systems offers a great deal of potential for carbon savings, but these systems can be costly endeavors due to increased electric consumption and peak demand. Thermal energy storage (TES) has been shown to be an effective means of improving the viability of electrified systems, by shifting heating and cooling load to off-peak hours and reducing peak demand charges. In this study, we analyze the integration of an electrified heating and cooling system with thermal energy storage into a campus CHP plant, to investigate the potential of leveraging existing infrastructure and technologies with the climate goals of the 21st century. A TRNSYS model was built to simulate a ground source heat pump (GSHP) system with TES using measured campus heating and cooling loads. The GSHP with TES system is modeled to follow the parameters of industry standards and sized to provide an optimal balance of capital and operating costs. Using known CHP production information, costs and emissions were investigated for a unique large energy user rate structure that operates a CHP plant. The results highlight the cost and emissions benefits of a targeted integration of heat pump technology within the framework of existing CHP systems, along with the performance impacts and value of TES capability within the combined system.Keywords: thermal energy storage, combined heat and power, heat pumps, electrification
Procedia PDF Downloads 89
605 The Effect of Data Integration to the Smart City
Authors: Richard Byrne, Emma Mulliner
Abstract:
Smart cities are a vision for the future that is increasingly becoming a reality. While a key concept of the smart city is the ability to capture, communicate, and process data that has long been produced through day-to-day activities of the city, much of the assessment models in place neglect this fact to focus on ‘smartness’ concepts. Although it is true technology often provides the opportunity to capture and communicate data in more effective ways, there are also human processes involved that are just as important. The growing importance with regards to the use and ownership of data in society can be seen by all with companies such as Facebook and Google increasingly coming under the microscope, however, why is the same scrutiny not applied to cities? The research area is therefore of great importance to the future of our cities here and now, while the findings will be of just as great importance to our children in the future. This research aims to understand the influence data is having on organisations operating throughout the smart cities sector and employs a mixed-method research approach in order to best answer the following question: Would a data-based evaluation model for smart cities be more appropriate than a smart-based model in assessing the development of the smart city? A fully comprehensive literature review concluded that there was a requirement for a data-driven assessment model for smart cities. This was followed by a documentary analysis to understand the root source of data integration to the smart city. A content analysis of city data platforms enquired as to the alternative approaches employed by cities throughout the UK and draws on best practice from New York to compare and contrast. Grounded in theory, the research findings to this point formulated a qualitative analysis framework comprised of: the changing environment influenced by data, the value of data in the smart city, the data ecosystem of the smart city and organisational response to the data orientated environment. The framework was applied to analyse primary data collected through the form of interviews with both public and private organisations operating throughout the smart cities sector. The work to date represents the first stage of data collection that will be built upon by a quantitative research investigation into the feasibility of data network effects in the smart city. An analysis into the benefits of data interoperability supporting services to the smart city in the areas of health and transport will conclude the research to achieve the aim of inductively forming a framework that can be applied to future smart city policy. To conclude, the research recognises the influence of technological perspectives in the development of smart cities to date and highlights this as a challenge to introduce theory applied with a planning dimension. The primary researcher has utilised their experience working in the public sector throughout the investigation to reflect upon what is perceived as a gap in practice of where we are today, to where we need to be tomorrow.Keywords: data, planning, policy development, smart cities
Procedia PDF Downloads 310
604 An Analysis of Possible Implications of Patent Term Extension in Pharmaceutical Sector on Indian Consumers
Authors: Anandkumar Rshindhe
Abstract:
Patents are considered a beneficial monopoly in India. They are a mechanism by which the inventor is encouraged to invent and to make a new, useful technology available to society at large. The patent system does not protect the invention itself but the claims (rights) which the patentee has identified in relation to the invention. Thus the patentee is granted a monopoly only to the extent of the rights recognized in the claims, while all other utilities of the invention remain available to the public. In this way both the inventor and the public at large, that is, the ultimate consumer, benefit. But developing such technology is not free of cost. Inventors invest heavily in bringing out new technologies; the pharmaceutical industry is one such example. Pharmaceutical companies carry out extensive research and invest a great deal of money, time, and labour in arriving at these inventions. Once an invention is made or a process identified, inventors approach the patent system to protect their rights in the form of claims over the invention. The patent system takes its own time in recognizing the invention as a patent, and even after the grant of a patent, pharmaceutical companies must comply with many other legal formalities before launching the product as a drug (medicine) on the market. A major portion of the patent term is therefore unproductive for the patentee, and the limited period that remains may not be sufficient to recover the cost involved in the invention; as a result, the price of the patented product is raised substantially just to recover that cost. This is ultimately a burden on the consumer, who pays more only because the legislature has failed to provide for the delay and loss caused to the patentee. The problem can be effectively remedied by patent term extension: with an extended term, the inventor gets more time to recover the cost of the invention, and the end product can be much cheaper than without the extension. The basic question that arises is whether, when the patent period granted to a patentee is only 20 years and a major portion of it is spent complying with the legal formalities required before the medicine is available on the market, a company can recover its research investment within the limited period of monopoly. Further, the Indian Patents Act contains provisions obliging the patentee to make the patented invention available at a reasonably affordable price in India. In light of these questions, the paper asks whether extending the patent term would be a proper solution and a necessary requirement to protect the interests of the patentee as well as the ultimate consumer. The basic objective of this paper is to examine the implications of extending the patent term on Indian consumers, and whether it benefits the patentee and the consumer or imposes hardship on the generic industry and the consumer.
Keywords: patent term extension, consumer interest, generic drug industry, pharmaceutical industries
Procedia PDF Downloads 451
603 Framework for Incorporating Environmental Performance in Network-Level Pavement Maintenance Program
Authors: Jessica Achebe, Susan Tighe
Abstract:
Reducing material consumption and greenhouse gas emissions when maintaining and rehabilitating road networks can achieve added benefits, including improved life cycle performance of pavements, reduced climate change impacts and human health effects due to less air pollution, improved productivity due to an optimal allocation of resources, and reduced road user costs. This is the essence of incorporating environmental sustainability into pavement management. The functionality of the performance measurement approach has made it one of the most valuable tools in Pavement Management Systems (PMSs) for accounting for different criteria in the decision-making process. However, measuring the environmental performance of a road network is still a far-fetched practice in road network management, and an explicit agency-wide environmental sustainability or sustainable maintenance specification is missing. To address this challenge, the present research focuses on the environmental sustainability performance of network-level pavement management. The ultimate goal is to develop a framework to incorporate environmental sustainability in pavement management systems for network-level maintenance programming. As a first step towards this goal, this paper reviews previous studies that employed environmental performance measures, as well as the suitability of environmental performance indicators for evaluating the sustainability of network-level pavement maintenance strategies. Through an industry practice survey, this paper provides a brief account of pavement managers' motivations and barriers to making more sustainable decisions, and of the data needed to support network-level environmental sustainability. Trends in network-level sustainable pavement management are also presented, existing gaps are highlighted, and ideas are proposed for network-level sustainable maintenance and rehabilitation programming.
Keywords: pavement management, environment sustainability, network-level evaluation, performance measures
Procedia PDF Downloads 306
602 GNSS-Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Surveying and mapping hard-to-access and hazardous areas are very difficult with traditional techniques and methodologies, which are also time consuming, labor intensive, and less precise, yielding limited data. In comparison, the advanced techniques save manpower and provide more precise output with a wide variety of data sets. In this experiment, an aerial photogrammetry technique is used in which a UAV flies over an area, captures geocoded images, and produces a three-dimensional model (3-D model). The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a differential global positioning system (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software, from which we obtain a dense point cloud, a digital elevation model (DEM), and an orthophoto as outputs. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the surveyed area. In conclusion, the processed data were compared with exact measurements taken on site; the error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
Keywords: photogrammetry, post-processing kinematics, real-time kinematics, manual data inquiry
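One of the flight parameters mentioned above, the ground sampling distance (GSD), follows directly from the camera geometry and flight altitude. The sketch below computes it with the standard formula; the sensor and altitude values are hypothetical examples, not those used in this work.

```python
# Ground sampling distance (GSD) from camera geometry and flight altitude:
# GSD = (sensor_width * altitude) / (focal_length * image_width_px).
# The camera parameters and altitude below are hypothetical examples.

def ground_sampling_distance(sensor_width_mm: float, focal_length_mm: float,
                             altitude_m: float, image_width_px: int) -> float:
    """Return the GSD in centimetres per pixel."""
    gsd_m = (sensor_width_mm * altitude_m) / (focal_length_mm * image_width_px)
    return gsd_m * 100.0  # metres -> centimetres

# Example: 13.2 mm sensor, 8.8 mm focal length, 5472 px wide image, 100 m altitude.
print(f"GSD ~ {ground_sampling_distance(13.2, 8.8, 100.0, 5472):.2f} cm/px")
```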
Procedia PDF Downloads 30
601 Adaptive Reuse of Lost Urban Space
Authors: Rana Sameeh
Abstract:
The city is the greatest symbol of human civilization and has been built for safety and comfort. However, uncontrolled urban growth caused some anonymous and unsightly images of the cities such as unused or abandoned spaces. When social interaction is missed in a public space it means the public space is lost since public spaces reflect the social life and interaction of people. Accordingly; this space became one of the most meaningless parts of the cities and has broken the continuity of the urban fabric. Lost urban spaces are the leftover unstructured landscape within the urban fabric. They are generally the unrecognized urban areas that are in need of redesign, since they have a great value that can add to their surrounding urban context. The research significance lies within the importance of urban open spaces, their value and their impact on the urban fabric. The research also addresses the reuse and reclamation of lost urban spaces in order to increase the percentage of green areas along the urban fabric, provide urban open spaces, develop a sustainable approach towards urban landscape and enhance the quality of the public open space and user experience. In addition, the reuse of lost space will give it the identity and function it lacks while also providing places for presence, spending time and observing. Creating continuity in a broken urban fabric represents an exploratory process in the relationship between infrastructure and the urban fabric and seeks to establish an architectural solution to leftover space within the city. In doing so, the research establishes a framework (criteria) for adaptive reuse of lost urban space throughout inductive and deductive methodology, analytical methodology; by analyzing some relevant examples and similar cases of lost spaces and finally through field methodology; by applying the achieved criteria on a case study in Alexandria and carrying on SWOT analysis and evaluation of the potentials of this case study.Keywords: adaptive reuse, lost urban space, quality of public open space, urban fabric
Procedia PDF Downloads 646
600 Damage Mesomodel Based Low-Velocity Impact Damage Analysis of Laminated Composite Structures
Authors: Semayat Fanta, P.M. Mohite, C.S. Upadhyay
Abstract:
Damage meso-model for laminates is one of the most widely applicable approaches for the analysis of damage induced in laminated fiber-reinforced polymeric composites. Damage meso-model for laminates has been developed over the last three decades by many researchers in experimental, theoretical, and analytical methods that have been carried out in micromechanics as well as meso-mechanics analysis approaches. It has been fundamentally developed based on the micromechanical description that aims to predict the damage initiation and evolution until the failure of structure in various loading conditions. The current damage meso-model for laminates aimed to act as a bridge between micromechanics and macro-mechanics of the laminated composite structure. This model considers two meso-constituents for the analysis of damage in ply and interface that imparted from low-velocity impact. The damages considered in this study include fiber breakage, matrix cracking, and diffused damage of the lamina, and delamination of the interface. The damage initiation and evolution in laminae can be modeled in terms of damaged strain energy density using damage parameters and the thermodynamic irreversible forces. Interface damage can be modeled with a new concept of spherical micro-void in the resin-rich zone of interface material. The damage evolution is controlled by the damage parameter (d) and the radius of micro-void (r) from the point of damage nucleation to its saturation. The constitutive martial model for meso-constituents is defined in a user material subroutine VUMAT and implemented in ABAQUS/Explicit finite element modeling tool. The model predicts the damages in the meso-constituents level very accurately and is considered the most effective technique of modeling low-velocity impact simulation for laminated composite structures.Keywords: mesomodel, laminate, low-energy impact, micromechanics
Procedia PDF Downloads 223
599 An AI-generated Semantic Communication Platform in HCI Course
Authors: Yi Yang, Jiasong Sun
Abstract:
Almost every aspect of our daily lives is now intertwined with some degree of human-computer interaction (HCI). HCI courses draw on knowledge from disciplines as diverse as computer science, psychology, design principles, anthropology, and more. Our HCI courses, named the Media and Cognition course, are constantly updated to reflect state-of-the-art technological advancements such as virtual reality, augmented reality, and artificial intelligence-based interactions. For more than a decade, our course has used an interest-based approach to teaching, in which students proactively propose some research-based questions and collaborate with teachers, using course knowledge to explore potential solutions. Semantic communication plays a key role in facilitating understanding and interaction between users and computer systems, ultimately enhancing system usability and user experience. The advancements in AI-generated technology, which have gained significant attention from both academia and industry in recent years, are exemplified by language models like GPT-3 that generate human-like dialogues from given prompts. Our latest version of the Human-Computer Interaction course practices a semantic communication platform based on AI-generated techniques. The purpose of this semantic communication is twofold: to extract and transmit task-specific information while ensuring efficient end-to-end communication with minimal latency. An AI-generated semantic communication platform evaluates the retention of signal sources and converts low-retain ability visual signals into textual prompts. These data are transmitted through AI-generated techniques and reconstructed at the receiving end; on the other hand, visual signals with a high retain ability rate are compressed and transmitted according to their respective regions. The platform and associated research are a testament to our students' growing ability to independently investigate state-of-the-art technologies.Keywords: human-computer interaction, media and cognition course, semantic communication, retainability, prompts
Procedia PDF Downloads 115
598 Son Preference in Afghanistan and Its Impact on Fertility Outcomes
Authors: Saha Naseri
Abstract:
Introduction/Objective: Son preference, a preference for sons over daughters, is a practice deeply rooted in gender inequality that is widespread in many societies and across different religions and cultures. In this study, we aim to examine the effects of son preference on fertility outcomes (birth interval and current contraceptive use) in Afghanistan, a country perceived to have high rates of son preference. The objectives of the study are to examine the association between the sex of the previous child and the duration of the subsequent birth interval, and to evaluate the effect of son preference on current contraceptive use. Methodology: The Afghanistan Demographic and Health Survey (DHS, 2015) was used to study the impact of son preference on fertility outcomes among married women. Data collected from 28,661 currently married women, aged between 15 and 49 years, who have at least one child were used to conduct this quantitative study. The outcomes of interest are birth interval and current contraceptive use. Simple and multiple regression analyses were conducted to assess the effect of son preference on these fertility outcomes. Results: The present study highlights that son preference exists to some extent among married women in Afghanistan. The sex of the first birth is significantly associated with the succeeding birth interval: having a girl as the first child was associated with an average succeeding birth interval 1.8 months shorter than having a boy (p-value = 0.000). In the second model, women who desire more sons had 7% higher odds of being current contraceptive users compared with those who have no preference (p-value = 0.03). The latter result does not indicate son preference. However, one limitation of this result is the timing of the questions asked, since current contraceptive use was asked alongside a question on 'future' desired sex composition. Moreover, women may have just given birth and may want to reach a certain birth interval before planning another child, even if the previous child was a boy, which might have affected the results. Conclusion: Overall, this study demonstrates a positive relationship between son preference and one main fertility behaviour, birth interval. The second fertility outcome, current contraceptive use, was not a good indicator for measuring son preference. Based on the findings, recommendations will be made for appropriate interventions addressing gender norms and related fertility decisions.
Keywords: Afghanistan, birth interval, contraceptive, son preference
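As a brief worked illustration of how the reported "7% higher odds" figure relates to a logistic regression output, the sketch below back-calculates the implied coefficient from the odds ratio using the standard conversion; it is not taken from the study's fitted model.

```python
# How the reported "7% higher odds" maps onto a logistic regression
# coefficient: the odds ratio is exp(beta), and the percent change in odds
# is (exp(beta) - 1) * 100. The coefficient below is back-calculated for
# illustration, not taken from the study's fitted model.
import math

odds_ratio = 1.07                      # 7% higher odds of current contraceptive use
beta = math.log(odds_ratio)            # implied logistic regression coefficient
print(f"implied coefficient: {beta:.4f}")                             # ~0.0677
print(f"percent change in odds: {(math.exp(beta) - 1) * 100:.1f}%")   # 7.0%
```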
Procedia PDF Downloads 173597 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to 1) proactively address vulnerabilities and bugs: formal methods and abstract interpretation techniques identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability: combining static analysis based on abstract interpretation, with full context sensitivity and hardware memory awareness, allows for a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
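The following toy Python sketch is not the TrustInSoft analyzer or any production tool; it only illustrates the core idea of abstract interpretation named above: tracking a variable as an interval rather than a concrete value, so that a possible out-of-bounds access can be flagged for every input at once, before any penetration or fuzz testing.

```python
# Toy interval-domain abstract interpretation: prove or alarm on an array access
# without running the program on concrete inputs.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: int
    hi: int
    def add(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

def check_index(idx: Interval, length: int) -> str:
    """Classify an array access for every value the index may take."""
    if 0 <= idx.lo and idx.hi < length:
        return "safe for all inputs"
    if idx.hi < 0 or idx.lo >= length:
        return "definite out-of-bounds access"
    return "possible out-of-bounds access (alarm)"

# Abstract run of:  i = user_input()  /* known to be 0..10 */;  access buf[i + 6]  with len(buf) == 16
user_input = Interval(0, 10)
index = user_input.add(Interval(6, 6))
print(check_index(index, length=16))  # index 16 is reachable, so an alarm is raised
```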
Procedia PDF Downloads 19596 Stoa: Urban Community-Building Social Experiment through Mixed Reality Game Environment
Authors: Radek Richtr, Petr Pauš
Abstract:
Social media nowadays connects people more tightly and intensively than ever, but simultaneously, a sort of social distance, incomprehension, and loss of social integrity appears. People can be strongly connected to a person on the other side of the world yet unaware of their neighbours in the same district or street. The Stoa is a type of application from the ‘serious games’ genre: a research augmented reality experiment masked as a gaming environment. In the Stoa environment, the player can plant and grow a virtual (organic) structure, a Pillar, that represents the whole suburb. Everybody has their own idea of what is an acceptable, admirable, or harmful visual intervention in the area they live in; the purpose of this research experiment is to find and/or define the residents’ shared subconscious spirit, the genius loci of the Pillar’s vicinity, where the residents live. The appearance and evolution of Stoa’s Pillars reflect the real world as perceived not only by their creator but also by other residents/players, who, with their actions, refine the environment. Squares, parks, patios, and streets get their living avatar depictions; investors and urban planners obtain information on the occurrence and level of motivation for reshaping the public space. As the project is in the product conceptual design phase, function is one of its most important factors. Function-based modelling makes the design problem modular and structured by decomposing it into sub-functions, or function-cells. The paper discusses the current conceptual model of the Stoa project, the use of different organic structure textures and models, the user interface design, a UX study, and the project’s development to its final state.Keywords: augmented reality, urban computing, interaction design, mixed reality, social engineering
Procedia PDF Downloads 228595 A Resource-Based Understanding of Health and Social Care Regulation
Authors: David P. Horton, Gary Lynch-Wood
Abstract:
Western populations are aging, prone to various lifestyle health problems, and increasing their demand for health and social care services. This demand has created enormous fiscal and regulatory challenges. In response, government institutions have deployed strategies of behavior modification to encourage people to exercise greater personal responsibility over their health and care needs (i.e., welfare responsibilisation). Policy strategies are underpinned by the assumption that people, if properly supported, will make better health and lifestyle choices. Not only does this absolve governments of the responsibility for meeting all health and care needs, but it also enables government institutions to assert fiscal control over welfare spending. Looking at the regulation of health and social care in the UK, the authors identify and outline a suite of regulatory tools that are designed to extract and manage the resources of health and social care service users and to encourage them to make (‘better’) use of these resources. This is important for our understanding of how health and social care regulation is responding to ongoing social and economic challenges. It is also important because there has been a failure to systematically examine the relevance of resources for regulation, which is surprising given that resources are crucial to how and whether regulation succeeds or fails. In particular, drawing from the regulatory welfare state concept, the authors analyse the key legal and regulatory changes and mechanisms that have been introduced since the 2008 financial crisis, focusing on critical measures such as the Health and Social Care Act and regulations introduced under the National Health Service Act. The authors show how three types of user resources (i.e., tangible, labor, and data) are being used to assert fiscal control and increase welfare responsibilisation. Amongst other things, the paper concludes that service users have become more than rule followers and targets of behavioral modification; rather, they are producers of resources that regulatory systems have come to rely on.Keywords: health care, regulation, resources, social care
Procedia PDF Downloads 94594 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality
Authors: Ouafa Sakka, Henri Barki, Louise Cote
Abstract:
Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data. On the other hand, as equivocality stems from ambiguity and a lack of understanding of the task at hand, addressing it is thought to require rich communication between those involved. Past research has provided weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team, and the richness and frequency of formal communications among the team members to discuss the project’s follow-up reports. Data on 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality in which project managers were provided with detailed and updated information on project costs and schedules. In addition, our findings show that information about user needs and technical aspects of the project is less useful for managing projects where uncertainty and equivocality are high. Further, while the strongest positive effect of interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality levels were high, its weakest effect occurred when both of these were low.Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development
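For concreteness, the sketch below shows the standard Fisher r-to-z test for the difference between two independent correlations, the kind of comparison the abstract describes; the correlation values and group sizes used here are placeholders, not the study's results.

```python
# Fisher z-test for the difference between two independent correlation coefficients.
import numpy as np
from scipy.stats import norm

def fisher_correlation_difference(r1: float, n1: int, r2: float, n2: int):
    """Two-sided test of H0: rho1 == rho2 using Fisher's r-to-z transformation."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)        # Fisher transform of each correlation
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # standard error of the difference
    z = (z1 - z2) / se
    p_value = 2 * norm.sf(abs(z))
    return z, p_value

# Hypothetical example: correlation between interactive report use and performance
# in high-uncertainty/high-equivocality projects vs. low/low projects.
z, p = fisher_correlation_difference(r1=0.55, n1=25, r2=0.10, n2=25)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A large absolute z with a small p-value would indicate that the two correlations differ significantly between project groups.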
Procedia PDF Downloads 294593 Mobile-Assisted Language Learning (MALL) Applications for Interactive and Engaging Classrooms: APPsolutely!
Authors: Ajda Osifo, Amanda Radwan
Abstract:
Mobile-assisted language learning (MALL), or m-learning, which is defined as learning with mobile devices that can be utilized in any place equipped with unbroken transmission signals, has created new opportunities and challenges for educational use. It introduced a new learning model combining new types of mobile devices, wireless communication services, and technologies with teaching and learning. Recent advancements in the mobile world, such as Apple iOS devices (iPhone, iPod Touch, and iPad), Android devices, and other smartphone devices and environments (such as Windows Phone 7 and BlackBerry), have allowed learning to be more flexible inside and outside the classroom, making the learning experience unique, adaptable, and tailored to each user. Creativity, learner autonomy, collaboration, and the digital practices of language learners are encouraged, and innovative pedagogical applications of such practices in classroom contexts, like the flipped classroom, are enhanced. These developments are gradually becoming embedded in daily life, and they also seem to be heralding a sustainable move to paperless classrooms. Since mobile technologies are increasingly viewed as a main platform for delivery, we as educators need to design our activities, materials, and learning environments in such a way as to ensure that learners are engaged and feel comfortable. For the purposes of our session, several core MALL applications that work on the Apple iPad/iPhone will be explored; the rationale and steps needed to successfully implement these applications will be discussed, and student examples will be showcased. The focus of the session will be on the following points: 1) Our current pedagogical approach, 2) The rationale and several core MALL apps, 3) Possible challenges for teachers and learners, 4) Future implications. This session is aimed at instructors who are interested in integrating MALL apps into their own classroom planning.Keywords: MALL, educational technology, iPads, apps
Procedia PDF Downloads 394592 Built-Own-Lease-Transfer (BOLT): “An Alternative Model to Subsidy Schemes in Public Private Partnership Projects”
Authors: Nirali Shukla, Neel Shah
Abstract:
The World Bank Institute (WBI) is undertaking a review of government interventions aimed at facilitating sustainable investment in public private partnerships (PPPs) in various underdeveloped countries. The study presents best practices for applying a financial model to make PPPs financially viable. The lessons presented here, if properly implemented, can help countries use limited funds to attract more private investment, get more infrastructure built and, as a result, achieve greater economic growth. The four countries (Brazil, Colombia, Mexico, and India) together develop an average of nearly US$50 billion in PPPs per year. There is a range of policies and institutional arrangements that governments use to provide subsidies to PPPs. For example, some countries have created dedicated agencies, or ‘funds’, capitalized with money from the national budget to manage and allocate subsidies. Other countries have established well-defined policies for appropriating subsidies on an ad hoc basis through an annual budget process. In this context, subsidies are direct fiscal contributions or grants paid by the government to a project when revenues from user fees are insufficient to cover all capital and operating costs while still providing private investors with a reasonable rate of return. Without subsidies, some infrastructure projects that would provide economic or social gains, but are not financially viable, would go undeveloped. However, the financial model of the BOLT (PPP) approach described in this study suggests that it is a more feasible option than subsidy schemes for making infrastructure projects financially viable. The major advantages of implementing this model are that government money is saved and can be used for other projects, and that private investors obtain a better rate of return than under subsidized schemes.Keywords: PPP, BOLT, subsidy schemes, financial model
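A back-of-the-envelope sketch of the viability logic described above follows; every figure (capital cost, user-fee revenue, subsidy amount, lease payment, hurdle rate) is hypothetical and chosen only to show how an upfront subsidy or annual BOLT-style lease payments can close the gap when user fees alone cannot cover costs.

```python
# Hypothetical viability-gap comparison: a project whose user-fee revenues cannot
# cover capital and operating costs becomes viable either through an upfront grant
# (subsidy) or through annual lease payments under a BOLT arrangement.
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value, where cashflows[t] occurs at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

capex = 100.0            # year-0 construction cost (all figures are made up)
net_user_fees = 9.0      # annual user-fee revenue minus operating cost, years 1..15
years = 15
required_return = 0.10   # private investor's assumed hurdle rate

base = [-capex] + [net_user_fees] * years
with_subsidy = [-capex + 35.0] + [net_user_fees] * years        # 35 paid as an upfront grant
with_bolt_lease = [-capex] + [net_user_fees + 5.5] * years      # 5.5/yr lease paid by government

for label, flows in [("no support", base),
                     ("upfront subsidy", with_subsidy),
                     ("BOLT lease payments", with_bolt_lease)]:
    print(f"{label:>20}: NPV at {required_return:.0%} = {npv(required_return, flows):6.1f}")
```

With these illustrative numbers, the unsupported project has a negative NPV, while both the subsidy and the lease-payment cases turn it positive, the latter without any upfront government grant.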
Procedia PDF Downloads 765591 3D Geological Modeling and Engineering Geological Characterization of Shallow Subsurface Soil and Rock of Addis Ababa, Ethiopia
Authors: Biruk Wolde, Atalay Ayele, Yonatan Garkabo, Trufat Hailmariam, Zemenu Germewu
Abstract:
Comprehensive three-dimensional (3D) geological modeling and engineering geological characterization of shallow subsurface soils and rocks are essential for a wide range of geotechnical and seismological engineering applications, particularly in urban environments. The spatial distribution and geological variation of the shallow subsurface of Addis Ababa city have not been studied so far in terms of geological and geotechnical modeling. This study aims to construct a 3D geological model and to provide insight into the engineering geological characteristics of the shallow subsurface soil and rock of Addis Ababa city. The 3D geological model was constructed using more than 1500 geotechnical boreholes, well-drilling data, and geological maps. A well-known geostatistical kriging 3D interpolation algorithm was applied to visualize the spatial distribution and geological variation of the shallow subsurface. Due to the complex nature of the geological formations and the vertical and lateral variation of the geological profiles, the Horizons-Solid command was selected via the Groundwater Modelling System (GMS) graphical user interface software. For the engineering geological characterization of typical soils and rocks, both index and engineering laboratory tests were used. The geotechnical properties of the soils and rocks vary from place to place due to the uneven nature of the subsurface formations observed in the study area. The constructed model ascertains the thickness, extent, and 3D distribution of the important geological units of the city. This study is the first comprehensive research work on 3D geological modeling and subsurface characterization of soils and rocks in Addis Ababa city, and the outcomes will be important for future research on subsurface conditions in the city. Furthermore, these findings provide a reference for developing a geo-database for the city.Keywords: 3d geological modeling, addis ababa, engineering geology, geostatistics, horizons-solid
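To make the interpolation step concrete, the sketch below implements ordinary kriging in plain NumPy with a simple linear variogram; the borehole coordinates and values are synthetic, and the study's actual GMS-based workflow and fitted variogram are not reproduced here.

```python
# Minimal ordinary-kriging sketch: predict a subsurface property at an unsampled
# 3D location from scattered borehole observations, standing in for the
# geostatistical kriging interpolation mentioned above.
import numpy as np

def linear_variogram(h: np.ndarray, slope: float = 1.0) -> np.ndarray:
    return slope * h  # a fitted spherical/exponential model would be used in practice

def ordinary_kriging(points: np.ndarray, values: np.ndarray, target: np.ndarray) -> float:
    """Ordinary kriging estimate at `target` from 3D sample points and values."""
    n = len(points)
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = linear_variogram(dists)
    K[n, n] = 0.0  # Lagrange-multiplier row/column enforcing weights summing to 1
    rhs = np.ones(n + 1)
    rhs[:n] = linear_variogram(np.linalg.norm(points - target, axis=-1))
    weights = np.linalg.solve(K, rhs)[:n]
    return float(weights @ values)

# Synthetic boreholes: (easting, northing, depth) and a soil-stiffness-like value at each.
boreholes = np.array([[0.0, 0.0, 5.0],
                      [100.0, 0.0, 7.0],
                      [0.0, 100.0, 6.0],
                      [100.0, 100.0, 9.0]])
stiffness = np.array([12.0, 18.0, 15.0, 22.0])

print(ordinary_kriging(boreholes, stiffness, target=np.array([50.0, 50.0, 6.0])))
```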
Procedia PDF Downloads 98