Search results for: information technologies
11056 Privacy Policy Prediction for Uploaded Image on Content Sharing Sites
Authors: Pallavi Mane, Nikita Mankar, Shraddha Mazire, Rasika Pashankar
Abstract:
Content sharing sites are very useful for sharing information and images. However, with the increasing demand for content sharing sites, privacy and security concerns have also increased. There is a need to develop a tool for controlling users' access to their shared content. Therefore, we are developing an Adaptive Privacy Policy Prediction (A3P) system which helps users create privacy settings for their images. We propose a two-level framework which assigns the best available privacy policy for a user's images according to the user's available history on the site.
Keywords: online information services, prediction, security and protection, web based services
Procedia PDF Downloads 358
11055 Building Bridges on Roads With Major Constructions
Authors: Mohamed Zaidour
Abstract:
In this summary, we look briefly at bridges and their construction on most roads. We have followed a simple method to explain each field clearly, because the geographical and climatic diversity of an area leads to different methods and types of roads and installation engineering. In mountainous areas, retaining walls need to be built; in rainy areas, culverts need to be constructed to discharge water from roads where there are temporary or permanent rivers. Building bridges and constructing road installations requires collecting the necessary information, such as soil type, which the engineer needs when designing the structure. In this section, we identify the types and calculation methods for bridge columns and retaining walls.
Keywords: bridges, buildings, concrete, constructions, roads
Procedia PDF Downloads 119
11054 Study on Clarification of the Core Technology in a Monozukuri Company
Authors: Nishiyama Toshiaki, Tadayuki Kyountani, Nguyen Huu Phuc, Shigeyuki Haruyama, Oke Oktavianty
Abstract:
It is important to clarify a company's core technology in the product development process to strengthen its ability to provide technology that meets customer requirements. The QFD method is adopted to clarify the core technology by identifying the high-level element technologies that are related to the voice of the customer and offer the most delightful features for the customer. AHP is used to determine the importance of the evaluation factors. A case study using this approach was conducted in a Japanese monozukuri company (i.e., a manufacturing company) to clarify its core technology based on customer requirements.
Keywords: core technology, QFD, voices of customer, analysis procedure
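The AHP weighting step mentioned in this abstract can be illustrated with a small sketch. This is not the authors' implementation: the pairwise comparison matrix below is hypothetical, and the priorities are approximated by plain power iteration on the principal eigenvector.

```python
def ahp_priorities(matrix, iters=100):
    """Approximate the principal eigenvector of an AHP pairwise
    comparison matrix by power iteration; the normalized
    eigenvector gives the priority weights of the factors."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        # Multiply the matrix by the current weight vector
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]  # normalize so weights sum to 1
    return w

# Hypothetical judgments: factor A is moderately more important
# than B (3) and strongly more important than C (5).
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
weights = ahp_priorities(A)
```

With these judgments the first factor receives the largest priority weight, which is how AHP would rank the evaluation factors in a QFD study.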
Procedia PDF Downloads 386
11053 Convergence Analysis of Reactive Power Based Schemes Used in Sensorless Control of Induction Motors
Authors: N. Ben Si Ali, N. Benalia, N. Zerzouri
Abstract:
Many electronic drives for induction motor control are based on sensorless technologies. Speed and torque control is usually attained by means of a speed or position sensor, which requires additional mounting space, reduces reliability, and increases cost. This paper analyzes the dynamic performance and the sensitivity to motor parameter changes of a reactive-power-based technique used in sensorless control of induction motors. The validity of the theoretical results is verified by simulation.
Keywords: adaptive observers, model reference adaptive system, RP-based estimator, sensorless control, stability analysis
Procedia PDF Downloads 547
11052 Smart Card Technology Adaption in a Hospital Setting
Authors: H. K. V. Narayan
Abstract:
This study was conducted at Tata Memorial Hospital (TMH), Mumbai, India. The study evaluated the impact of adopting the Smart Card (SC) for clinical and business transactions in order to reduce lead times and to enforce the business rules of the hospital. The objective of implementing the Smart Card was to improve patient perception of quality in terms of structures, processes, and outcomes, and also to improve the productivity of the institution. The Smart Card was implemented in phases from 2011 and integrated with the Hospital Information System (HIS/EMR). The implementation was a learning curve for all the stakeholders, as the software obviated the need to use hard copies of transactions. Acceptability to the stakeholders was a challenge in change management. The study assessed the impact three years into the implementation, and the observed trends suggest that it has decreased the lead times for services and increased the number of transactions, and thereby the productivity. Patients who used to complain of multiple queues and cumbersome transactions now compliment the administration on the effective use of Information and Communication Technology.
Keywords: smart card, high availability of health care information, reduction in potential medical errors due to elimination of transcription errors, reduction in number of queues, increased transactions, augmentation of revenue
Procedia PDF Downloads 285
11051 Wealth Creation and its Externalities: Evaluating Economic Growth and Corporate Social Responsibility
Authors: Zhikang Rong
Abstract:
The 4th industrial revolution has introduced technologies like interconnectivity, machine learning, and real-time big data analytics that improve operations and business efficiency. This paper examines how these advancements have led to a concentration of wealth, specifically among the top 1%, and investigates whether this wealth provides value to society. Through analyzing impacts on employment, productivity, supply-demand dynamics, and potential externalities, it is shown that successful businesspeople, by enhancing productivity and creating jobs, contribute positively to long-term economic growth. Additionally, externalities such as environmental degradation are managed by social entrepreneurship and government policies.
Keywords: wealth creation, employment, productivity, social entrepreneurship
Procedia PDF Downloads 30
11050 Impact Assessment of Information Communication, Network Providers, Teledensity, and Consumer Complaints on Gross Domestic Products
Authors: Essang Anwana Onuntuei, Chinyere Blessing Azunwoke
Abstract:
The study used secondary data from foreign and local organizations to explore the major challenges and opportunities in information communication. The study aimed to explore the tie between teledensity (network coverage area) and the number of network subscriptions, to probe whether the degree of consumer complaints varies significantly among network providers, and to assess whether network subscriptions significantly influence the sector's GDP contribution. Methods used for data analysis include Pearson product-moment correlation, regression analysis, and analysis of variance (ANOVA). At a 0.05 significance level (two-tailed), the findings established that about 85.6% of the variation in network subscriptions was explained by teledensity (network coverage area); the degree of consumer complaints varied significantly among network providers, as 80.158291 (calculated F) > 3.490295 (critical F) with an associated p-value of 0.000000 < 0.05; and finally, network subscriptions explained 65% of the nation's GDP, showing a high association.
Keywords: tele density, subscription, network coverage, information communication, consumer
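The ANOVA comparison of complaint levels across providers can be sketched as follows. The complaint counts below are made up for illustration and are not the study's data; the function simply computes the one-way ANOVA F statistic that the authors compare against a critical value.

```python
def one_way_anova_f(groups):
    """Compute the one-way ANOVA F statistic:
    F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (group means vs. grand mean)
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (observations vs. their group mean)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical monthly complaint counts for three network providers
providers = [[1, 2, 3], [11, 12, 13], [21, 22, 23]]
f_stat = one_way_anova_f(providers)  # large F -> reject equal means
```

When the calculated F exceeds the critical F at the chosen significance level, as in the study's 80.16 > 3.49, the difference among providers is declared significant.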
Procedia PDF Downloads 45
11049 Airborne CO₂ Lidar Measurements for Atmospheric Carbon and Transport: America (ACT-America) Project and Active Sensing of CO₂ Emissions over Nights, Days, and Seasons 2017-2018 Field Campaigns
Authors: Joel F. Campbell, Bing Lin, Michael Obland, Susan Kooi, Tai-Fang Fan, Byron Meadows, Edward Browell, Wayne Erxleben, Doug McGregor, Jeremy Dobler, Sandip Pal, Christopher O'Dell, Ken Davis
Abstract:
The Active Sensing of CO₂ Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES) is a NASA Langley Research Center instrument funded by NASA's Science Mission Directorate that seeks to advance technologies critical to measuring atmospheric column carbon dioxide (CO₂) mixing ratios in support of the NASA ASCENDS mission. The ACES instrument, an Intensity-Modulated Continuous-Wave (IM-CW) lidar, was designed for high-altitude aircraft operations and can be directly applied to space instrumentation to meet the ASCENDS mission requirements. The ACES design demonstrates advanced technologies critical for developing an airborne simulator and spaceborne instrument with lower size, mass, and power consumption on the platform, and with improved performance. The Atmospheric Carbon and Transport – America (ACT-America) is an Earth Venture Suborbital-2 (EVS-2) mission sponsored by the Earth Science Division of NASA's Science Mission Directorate. A major objective is to enhance knowledge of the sources/sinks and transport of atmospheric CO₂ through the application of remote and in situ airborne measurements of CO₂ and other atmospheric properties on spatial and temporal scales. ACT-America consists of five campaigns to measure regional carbon and evaluate transport under various meteorological conditions in three regional areas of the Continental United States. Regional CO₂ distributions of the lower atmosphere were observed from the C-130 aircraft by the Harris Corp. Multi-Frequency Fiber Laser Lidar (MFLL) and the ACES lidar. The airborne lidars provide unique data that complement the more traditional in situ sensors. This presentation shows the applications of CO₂ lidars in support of these science needs.
Keywords: CO₂ measurement, IMCW, CW lidar, laser spectroscopy
Procedia PDF Downloads 162
11048 Targeting Violent Extremist Narratives: Applying Network Targeting Techniques to the Communication Functions of Terrorist Groups
Authors: John Hardy
Abstract:
Over the last decade, the increasing utility of extremist narratives to the operational effectiveness of terrorist organizations has been evidenced by the proliferation of inspired or affiliated attacks across the world. Famous examples such as regional al-Qaeda affiliates and the self-styled “Islamic State” demonstrate the effectiveness of leveraging communication technologies to disseminate propaganda, recruit members, and orchestrate attacks. Terrorist organizations with the capacity to harness the communicative power offered by digital communication technologies and effective political narratives have held an advantage over their targets in recent years. Terrorists have leveraged the perceived legitimacy of grass-roots actors to appeal to a global audience of potential supporters and enemies alike, and have wielded a proficiency in profile-raising which remains unmatched by counter terrorism narratives around the world. In contrast, many attempts at propagating official counter-narratives have been received by target audiences as illegitimate, top-down and impersonally bureaucratic. However, the benefits provided by widespread communication and extremist narratives have come at an operational cost. Terrorist organizations now face a significant challenge in protecting their access to communications technologies and authority over the content they create and endorse. The dissemination of effective narratives has emerged as a core function of terrorist organizations with international reach via inspired or affiliated attacks. As such, it has become a critical function which can be targeted by intelligence and security forces. This study applies network targeting principles which have been used by coalition forces against a range of non-state actors in the Middle East and South Asia to the communicative function of terrorist organizations. 
This illustrates both a conceptual link between functional targeting and operational disruption in the abstract and a tangible impact on the operational effectiveness of terrorists by degrading communicative ability and legitimacy. Two case studies highlight the utility of applying functional targeting against terrorist organizations. The first case is the targeted killing of Anwar al-Awlaki, an al-Qaeda propagandist who crafted a permissive narrative and effective propaganda videos to attract recruits who committed inspired terrorist attacks in the US and overseas. The second is a series of operations against Islamic State propagandists in Syria, including the capture or deaths of a cadre of high-profile Islamic State members, including Junaid Hussain, Abu Mohammad al-Adnani, Neil Prakash, and Rachid Kassim. This group of Islamic State propagandists was linked to a significant rise in affiliated and enabled terrorist attacks and was subsequently targeted by law enforcement and military agencies. In both cases, the disruption of communication between the terrorist organization and recruits degraded both communicative and operational functions. The effect of functional targeting on member recruitment and operational tempo suggests that narratives are a critical function which can be leveraged against terrorist organizations. Further application of network targeting methods to terrorist narratives may enhance the efficacy of a range of counter terrorism techniques employed by security and intelligence agencies.
Keywords: countering violent extremism, counter terrorism, intelligence, terrorism, violent extremism
Procedia PDF Downloads 291
11047 Predictive Analytics in Oil and Gas Industry
Authors: Suchitra Chnadrashekhar
Abstract:
Earlier regarded as a support function within an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data, which was unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has given the oil and gas industry the leverage to store, manage, and process data in the most efficient way possible, thus deriving economic value in its day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, or systems, approach towards asset optimization and thus have functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset using an application tool, SAS. The reason for using SAS as the application for our analysis is that SAS provides an analytics-based framework to improve the uptime, performance, and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions.
With state-of-the-art analytics and reporting, we can predict maintenance problems before they happen and determine root causes in order to update processes for future prevention.
Keywords: hydrocarbon, information technology, SAS, predictive analytics
Procedia PDF Downloads 360
11046 Matlab/Simulink Simulation of Solar Energy Storage System
Authors: Mustafa A. Al-Refai
Abstract:
This paper investigates energy storage technologies that can potentially enhance the use of solar energy. Water electrolysis systems are seen as the principal means of producing large amounts of hydrogen in the future. Starting from the analysis of models of the system components, a complete simulation model was realized in the Matlab-Simulink environment. Results of the numerical simulations are provided. The operation of the electrolyzer and photovoltaic array combination is verified at various insolation levels. It is pointed out that the solar cell arrays and electrolyzers produce the expected results with solar energy inputs that are continuously varying.
Keywords: electrolyzer, simulink, solar energy, storage system
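A back-of-the-envelope version of the PV-to-electrolyzer energy chain described above can be sketched outside Simulink. The array area, efficiencies, and hourly insolation profile below are assumed round numbers for illustration, not the paper's model parameters.

```python
# Rough daily hydrogen yield of a PV array feeding an electrolyzer.
# All parameters are illustrative assumptions, not measured values.
PV_AREA_M2 = 50.0          # array area
PV_EFFICIENCY = 0.18       # PV conversion efficiency
ELECTROLYZER_EFF = 0.65    # electrical-to-hydrogen efficiency
H2_HHV_KWH_PER_KG = 39.4   # higher heating value of hydrogen

# Assumed hourly insolation profile in kW/m^2 (sunrise to sunset)
insolation = [0.1, 0.3, 0.5, 0.7, 0.8, 0.8, 0.7, 0.5, 0.3, 0.1]

def daily_hydrogen_kg(profile):
    # Electrical energy produced over the day (1-hour time steps)
    pv_kwh = sum(g * PV_AREA_M2 * PV_EFFICIENCY for g in profile)
    # Energy routed into electrolysis, converted to kg of hydrogen
    return pv_kwh * ELECTROLYZER_EFF / H2_HHV_KWH_PER_KG

h2 = daily_hydrogen_kg(insolation)
```

Scaling the insolation profile scales the hydrogen yield proportionally in this simplified model, which mirrors the paper's observation that the combination tracks continuously varying solar input.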
Procedia PDF Downloads 435
11045 Advertising Campaigns for a Sustainable Future: The Fight against Plastic Pollution in the Ocean
Authors: Mokhlisur Rahman
Abstract:
The ocean hosts one of the most complex ecosystems on the planet, regulating the earth's climate and weather and providing us with a livable environment. The ocean provides food by sustaining the many ways of life that depend on it, transportation by accommodating the world's biggest carriers, recreation by offering its beauty in many moods, and a home to countless species. Drawn by these forms of recreation, consumers choose to be close to the ocean while performing many fun activities, which at some point harms the ocean by threatening marine life and the environment. Consumers throw their waste into the ocean after use. Most of it is plastic, which floats on the ocean and breaks into thousands of micro pieces that are hard to observe with the naked eye but easily eaten by sea species. Eventually, this conflicts with the natural feeding process of living species, making them sick. This information is not known by most consumers who go to the sea or seashore occasionally to spend time, nor is it widely discussed, which creates an information gap among consumers. However, advertising is a powerful tool to educate people about ocean pollution. This abstract analyzes three major ocean-saving advertisement campaigns that use innovative and advanced technology to get maximum exposure. The study collects data from the selected campaigns' websites and retrieves all available content related to messages, videos, and images. First, the SeaLegacy campaign uses stunning images to create awareness among people; it uses social media content, videos, and other educational content. It creates content and strategies to build an emotional connection with consumers that encourages them to take action. All the messages in the campaign empower consumers by using powerful words.
Second, the Ocean Conservancy campaign uses social media marketing, events, and educational content to protect the ocean from various threats, including plastics, climate change, and overfishing. It uses powerful images and videos of marine life. Its mission is to create evidence-based solutions toward a healthy ocean, and its message addresses local communities along with the sea species. Third, The Ocean Cleanup is a campaign that applies strategies using innovative technologies to remove plastic waste from the ocean. It uses social media, digital, and email marketing to reach people and raise awareness, and it also uses images and videos to evoke an emotional response to take action. These three advertisements use realistic images, powerful words, and the presence of living species in their imagery, which are eye-catching and can build an emotional connection with consumers. Identifying the effectiveness of the messages these advertisements carry and their strategies highlights the knowledge gap between real pollution and its consequences among the general public, making the message more accessible to a mass audience. This study aims to provide insights into the effectiveness of ocean-saving advertisement campaigns and their impact on the public's awareness of ocean conservation. The findings from this study help shape future campaigns.
Keywords: advertising campaign, content creation, images, ocean-saving technology, videos
Procedia PDF Downloads 78
11044 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the nodes of the graph. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select features which are most indicative of the presence or absence of vulnerabilities. The model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It improves on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
Keywords: code representation, deep learning, source code semantics, vulnerability discovery
Procedia PDF Downloads 159
11043 Reasons for the Selection of Information-Processing Framework and the Philosophy of Mind as a General Account for an Error Analysis and Explanation on Mathematics
Authors: Michael Lousis
Abstract:
This research study is concerned with learners' errors in Arithmetic and Algebra. The data resulted from a broader international comparative research program called the Kassel Project. However, its conceptualisation differed from and contrasted with that of the main program, which was mostly based on socio-demographic data. The way in which the research study was conducted was not dependent on the researcher's discretion, but was absolutely dictated by the nature of the problem under investigation. This is because the phenomenon of learners' mathematical errors is due neither to the intentions of learners, nor to institutional processes, rules and norms, nor to the educators' intentions and goals, but rather to the way certain information is presented to learners and how their cognitive apparatus processes this information. Several approaches for the study of learners' errors have been developed since the beginning of the 20th century, encompassing different belief systems. These approaches were based on behaviourist theory, on the Piagetian-constructivist research framework, on the perspective that followed the philosophy of science, and on the information-processing paradigm. The researcher of the present study was forced to disclose the learners' course of thinking that led them to specific observable actions resulting in particular errors in specific problems, rather than analysing scripts with the students' thoughts presented in written form. This, in turn, entailed that the choice of methods would have to be appropriate and conducive to seeing and realising the learners' errors from the perspective of the participants in the investigation. This particular fact determined important decisions to be made concerning the selection of an appropriate framework for analysing the mathematical errors and giving explanations.
Thus the belief systems of behaviourism, the Piagetian-constructivist framework, and the philosophy-of-science perspective were rejected, and the information-processing paradigm in conjunction with the philosophy of mind was adopted as the general account for the elaboration of data. This paper explains why these decisions were appropriate and beneficial for conducting the present study and for the establishment of the ensuing thesis. Additionally, the reasons why the adoption of the information-processing paradigm in conjunction with the philosophy of mind provides sound and legitimate bases for the development of future studies concerning mathematical error analysis are explained.
Keywords: advantages-disadvantages of theoretical prospects, behavioral prospect, critical evaluation of theoretical prospects, error analysis, information-processing paradigm, opting for the appropriate approach, philosophy of science prospect, Piagetian-constructivist research frameworks, review of research in mathematical errors
Procedia PDF Downloads 190
11042 An Approach to Maximize the Influence Spread in the Social Networks
Authors: Gaye Ibrahima, Mendy Gervais, Seck Diaraf, Ouya Samuel
Abstract:
In this paper, we consider influence maximization in social networks. Here we give importance to the initial diffusers, called the seeds. The goal is to efficiently find a subset of k elements in the social network that will begin and maximize the information diffusion process. A new approach is proposed which processes the social network before determining the seeds. This processing eliminates information feedback toward an element considered as a seed by extracting an acyclic spanning social network. First, we propose two algorithm versions called the SCG-algorithm (v1 and v2) (Spanning Connected Graph algorithm). This algorithm takes as input a connected social network, directed or not. Finally, a generalization of the SCG-algorithm is proposed, called the SG-algorithm (Spanning Graph algorithm), which takes any graph as input. Both algorithms are effective and each has polynomial complexity. To show the pertinence of our approach, two seed sets are determined, and the one given by our approach yields better results. The performance of this approach is clearly perceptible through simulations carried out with the R software and the igraph package.
Keywords: acyclic spanning graph, centrality measures, information feedback, influence maximization, social network
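The idea of removing information feedback toward a seed by extracting an acyclic spanning subnetwork can be sketched with a plain BFS spanning tree. This is an illustration of the general idea only, not the authors' SCG-algorithm, and the small network below is hypothetical.

```python
from collections import deque

def bfs_spanning_tree(adj, seed):
    """Return the edges of a BFS spanning tree rooted at `seed`.
    A tree is acyclic, so no edge can feed information back to the seed."""
    visited = {seed}
    tree_edges = []
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in visited:       # keep only first-discovery edges
                visited.add(v)
                tree_edges.append((u, v))
                queue.append(v)
    return tree_edges

# Small directed network containing a feedback cycle a -> b -> a
network = {"a": ["b", "c"], "b": ["c", "a"], "c": ["a", "d"], "d": []}
edges = bfs_spanning_tree(network, "a")
```

In the extracted tree no edge points back to the seed "a", which is the property the abstract's preprocessing step is after; the full SCG/SG-algorithms additionally handle disconnected and undirected inputs.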
Procedia PDF Downloads 248
11041 Key Concepts of 5th Generation Mobile Technology
Authors: Magri Hicham, Noreddine Abghour, Mohamed Ouzzif
Abstract:
The 5th generation of mobile networks is a term used in various research papers and projects to identify the next major phase of mobile telecommunications standards. 5G wireless networks will support higher peak data rates and lower latency, and will provide the best connections with QoS guarantees. In this article, we discuss various promising technologies for 5G wireless communication systems, such as IPv6 support, the World Wide Wireless Web (WWWW), Dynamic Adhoc Wireless Networks (DAWN), Beam Division Multiple Access (BDMA), cloud computing, and cognitive radio technology.
Keywords: WWWW, BDMA, DAWN, 5G, 4G, IPv6, Cloud Computing
Procedia PDF Downloads 514
11040 Ethnobotanical Survey of Medicinal Plants from Bechar Region, South-West of Algeria
Authors: Naima Fatehi
Abstract:
The paper reports on 107 medicinal plants traditionally used in the South-West of Algeria (Bechar region). The information was documented by interviewing traditional herbalists and various elderly men and women, following different ethnobotanical methods. The ethnobotanical data were arranged alphabetically by botanical name, followed by family name, vernacular name, and part used. The present paper provides significant ethnobotanical information on medicinal plants used extensively in the Bechar region for treating various diseases and provides baseline data for future pharmacological and phytochemical studies.
Keywords: medicinal plants, ethnobotanical survey, South-West Algeria, Bechar region
Procedia PDF Downloads 521
11039 Information Processing and Visual Attention: An Eye Tracking Study on Nutrition Labels
Authors: Rosa Hendijani, Amir Ghadimi Herfeh
Abstract:
Nutrition labels are diet-related health policies. They help individuals improve food-choice decisions and reduce their intake of calories and unhealthy food elements, like cholesterol. However, many individuals do not pay attention to nutrition labels or fail to understand them appropriately. According to the literature, thinking and cognitive styles can have significant effects on attention to nutrition labels. To the authors' knowledge, the effect of global/local processing on attention to nutrition labels has not been previously studied. Global/local processing encourages individuals to attend to the whole/specific parts of an object and can have a significant impact on people's visual attention. In this study, this effect was examined with an experimental design using the eye-tracking technique. The research hypothesis was that individuals with local processing would pay more attention to nutrition labels, including nutrition tables and traffic lights. An experiment was designed with two conditions: global and local information processing. Forty participants were randomly assigned to either the global or the local condition, and their processing style was manipulated accordingly. Results supported the hypothesis for nutrition tables but not for traffic lights.
Keywords: eye-tracking, nutrition labelling, global/local information processing, individual differences
Procedia PDF Downloads 159
11038 Neighbor Caring Environment System (NCE) Using Parallel Replication Mechanism
Authors: Ahmad Shukri Mohd Noor, Emma Ahmad Sirajudin, Rabiei Mamat
Abstract:
Pertaining to a particular marine interest, the process of data sampling can take years before a study can be concluded. Therefore, the need for a robust backup system for the data is implicit. In recent advancements of marine applications, more functionalities and tools are integrated to assist the work of researchers. It is anticipated that this modality will continue as research scope widens and intensifies, and at the same time keep pace with current technologies and lifestyles. The convenience of collecting and sharing information these days also applies to work in marine research. Therefore, marine system designers should be aware that high availability is a necessary attribute in marine repository applications, as is a robust backup system for the data. In this paper, the approach to high availability relates to both hardware and software, but the focus is more on software. We consider a NABTIC repository system that is primitively built on a single server and does not have replicated components. First, the system is decomposed into separate modules. The modules are placed on multiple servers to create a distributed system. Redundancy is added by placing copies of the modules on different servers using the Neighbor Caring Environment System (NCE) technique, which utilizes a parallel replication components mechanism. Background monitoring is established to check servers' heartbeats to confirm their aliveness. At the same time, a critical adaptive threshold is maintained to make sure a failure is detected in a timely manner using Adaptive Fault Detection (AFD). A confirmed failure sets the recovery mode, where a selection process is carried out before a fail-over server is instructed. In effect, the marine repository service continues as the fail-over masks the recent failure. The performance of the new prototype was tested and confirmed to be more highly available.
Furthermore, the downtime is not noticeable, as service is immediately and automatically restored. The marine repository system is thus said to have achieved fault tolerance.
Keywords: availability, fault detection, replication, fault tolerance, marine application
Procedia PDF Downloads 321
11037 Data Access, AI Intensity, and Scale Advantages
Authors: Chuping Lo
Abstract:
This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Large countries that inherently have greater data resources therefore tend to have higher incomes than smaller countries, so the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade as they are better able to utilize global data.
Keywords: digital intensity, digital divide, international trade, economies of scale
Procedia PDF Downloads 68
11036 Audit on Compliance with Ottawa Ankle Rules in Ankle Radiograph Requests
Authors: Daud Muhammad
Abstract:
Introduction: Ankle radiographs are frequently requested in Emergency Departments (ED) for patients presenting with traumatic ankle pain. The Ottawa Ankle Rules (OAR) serve as a clinical guideline for determining the necessity of these radiographs, aiming to reduce unnecessary imaging. This audit was conducted to evaluate the adequacy of the clinical information provided in radiograph requests in relation to the OAR. Methods: A retrospective analysis was performed on 50 consecutive ankle radiograph requests made under ED clinicians' names for patients aged over 5 years, specifically excluding follow-up radiographs for known fractures. The study assessed whether the provided clinical information met the criteria outlined by the OAR. Results: The audit revealed that none of the 50 radiograph requests contained sufficient information to satisfy the Ottawa Ankle Rules. Furthermore, 10 of the 50 radiographs (20%) identified fractures. Discussion: The findings indicate a significant lack of adherence to the OAR, suggesting potential overuse of radiography and unnecessary patient exposure to radiation. This non-compliance may also contribute to increased healthcare costs and resource utilization, as well as possible delays in diagnosis and treatment. Recommendations: To address these issues, the following recommendations are proposed: (1) Education and training: enhance awareness and training among ED clinicians regarding the OAR. (2) Standardised request forms: change imaging request forms to mandate the relevant information according to the OAR. (3) Scan vetting: promote awareness among radiographers to discuss the appropriateness of scan requests with clinicians. (4) Re-audit: conduct regular re-audits to monitor improvements in compliance.
Keywords: Ottawa ankle rules, ankle radiographs, emergency department, traumatic pain
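The OAR criteria that the audit measures compliance against can be encoded as a simple decision function. This sketch reflects the standard ankle component of the rules (malleolar-zone pain plus malleolar bone tenderness or inability to bear weight for four steps); it is an editorial illustration, not part of the audit:

```python
def ankle_xray_indicated(malleolar_pain: bool,
                         lateral_malleolus_tenderness: bool,
                         medial_malleolus_tenderness: bool,
                         can_bear_weight_4_steps: bool) -> bool:
    """Ankle component of the Ottawa Ankle Rules (illustrative sketch).

    A radiograph is indicated only if there is pain in the malleolar
    zone AND any of: bone tenderness at the posterior edge or tip of
    the distal 6 cm of the lateral or medial malleolus, or inability
    to bear weight both immediately after injury and for four steps
    in the ED.
    """
    if not malleolar_pain:
        return False
    return (lateral_malleolus_tenderness
            or medial_malleolus_tenderness
            or not can_bear_weight_4_steps)
```

A request form that mandated these four data points, as recommended above, would make OAR compliance directly checkable at vetting time.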
Procedia PDF Downloads 45
11035 Some Generalized Multivariate Estimators for Population Mean under Multi Phase Stratified Systematic Sampling
Authors: Muqaddas Javed, Muhammad Hanif
Abstract:
Generalized multivariate ratio and regression-type estimators for the population mean are suggested under multi-phase stratified systematic sampling (MPSSS) using multi-auxiliary information. The estimators are developed for two different situations of availability of auxiliary information. Expressions for the bias and mean square error (MSE) are derived. Special cases of the suggested estimators are also discussed, and a simulation study is conducted to observe the performance of the estimators.
Keywords: generalized estimators, multi-phase sampling, stratified random sampling, systematic sampling
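The generalized estimators in the abstract build on the classical single-phase, single-auxiliary ratio and regression estimators, which can be sketched as follows (function names are illustrative; the paper's multi-phase, multivariate forms generalize these):

```python
def ratio_estimator(y_sample, x_sample, x_pop_mean):
    """Classical ratio estimator of the population mean of y.

    Uses one auxiliary variable x whose population mean is known:
    y_bar_R = y_bar * (X_bar / x_bar).
    """
    y_bar = sum(y_sample) / len(y_sample)
    x_bar = sum(x_sample) / len(x_sample)
    return y_bar * x_pop_mean / x_bar


def regression_estimator(y_sample, x_sample, x_pop_mean, b):
    """Simple regression-type estimator with slope coefficient b:
    y_bar_lr = y_bar + b * (X_bar - x_bar).
    """
    y_bar = sum(y_sample) / len(y_sample)
    x_bar = sum(x_sample) / len(x_sample)
    return y_bar + b * (x_pop_mean - x_bar)
```

Both estimators gain precision over the plain sample mean when y and x are strongly positively correlated, which is the motivation for exploiting auxiliary information in multi-phase designs.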
Procedia PDF Downloads 729
11034 Anaerobic Co-digestion in Two-Phase TPAD System of Sewage Sludge and Fish Waste
Authors: Rocio López, Miriam Tena, Montserrat Pérez, Rosario Solera
Abstract:
Biotransformation of organic waste into biogas is considered an interesting alternative for the production of clean energy from renewable sources by reducing the volume and organic content of waste. Anaerobic digestion is considered one of the most efficient technologies to transform waste into fertilizer and biogas in order to obtain electrical energy or biofuel within the concept of the circular economy. Currently, three types of anaerobic processes have been developed on a commercial scale: (1) a single-stage process, where sludge bioconversion is completed in a single chamber; (2) a two-stage process, where the acidogenic and methanogenic stages are separated into two chambers; and, finally, (3) a temperature-phased anaerobic digestion (TPAD) process, which combines a thermophilic pretreatment unit with a subsequent mesophilic anaerobic digestion unit. Two-stage processes can provide hydrogen and methane with easier control of the first- and second-stage conditions, producing higher total energy recovery and substrate degradation than single-stage processes. On the other hand, co-digestion is the simultaneous anaerobic digestion of a mixture of two or more substrates. The technology is similar to anaerobic digestion but is a more attractive option, as it produces increased methane yields due to the positive synergism of the mixtures in the digestion medium, thus increasing the economic viability of biogas plants. The present study focuses on energy recovery by anaerobic co-digestion of sewage sludge and waste from the aquaculture-fishing sector. The valorization is approached through the application of a sequential temperature-phase process, or TPAD (Temperature-Phased Anaerobic Digestion) technology. Moreover, a two-phase configuration of microorganisms is considered. Thus, the selected process allows the development of a thermophilic acidogenic phase followed by a mesophilic methanogenic phase, obtaining hydrogen (H₂) in the first stage and methane (CH₄) in the second stage.
The combination of these technologies makes it possible to unify all the advantages of these anaerobic digestion processes individually. To achieve these objectives, a sequential study has been carried out in which the biochemical potential of hydrogen (BHP) is tested, followed by a BMP test, which allows the feasibility of the two-stage process to be checked. The best results obtained were high total and soluble COD removal yields (59.8% and 82.67%, respectively), as well as an H₂ production rate of 12 L H₂/kg SVadded and a methane production rate of 28.76 L CH₄/kg SVadded for TPAD.
Keywords: anaerobic co-digestion, TPAD, two-phase, BHP, BMP, sewage sludge, fish waste
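The figures reported above are standard digestion performance metrics; a minimal sketch of how such removal efficiencies and specific yields are typically computed (function names are illustrative, not the authors' code):

```python
def removal_efficiency(influent, effluent):
    """Percent removal of a pollutant measure (e.g. COD): the fraction
    of the influent load eliminated during digestion, times 100."""
    return 100.0 * (influent - effluent) / influent


def specific_yield(gas_volume_l, vs_added_kg):
    """Specific gas yield in litres per kg of volatile solids added,
    the basis of figures like 12 L H2/kg SVadded."""
    return gas_volume_l / vs_added_kg
```

For example, an influent COD of 100 g/L reduced to 40.2 g/L corresponds to the 59.8% total COD removal reported for the TPAD system.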
Procedia PDF Downloads 156
11033 Green Computing: Awareness and Practice in a University Information Technology Department
Authors: Samson Temitope Obafemi
Abstract:
The pervasiveness of ICTs in today’s society paradoxically also calls for green computing. Green computing generally encompasses the study and practice of using Information and Communication Technology (ICT) resources effectively and efficiently without negatively affecting the environment. Since the emergence of this innovation, manufacturers and bodies such as Energy Star and the government of the United States of America have evidently invested many resources in ensuring the reality of green design, manufacture, and disposal of ICTs. However, the level of adherence to the green use of ICTs among users has been less accounted for, especially in developing, ICT-consuming nations. This paper therefore focuses on examining the awareness and practice of green computing among academics and students of the Information Technology Department of Durban University of Technology, Durban, South Africa, in the context of the green use of ICTs. This was achieved through a survey that involved the use of a questionnaire with four sections: (a) demography of respondents, (b) awareness of green computing, (c) practices of green computing, and (d) attitude towards greener computing. One hundred and fifty (150) questionnaires were distributed; one hundred and twenty-five (125) were completed and collected for data analysis. Of the one hundred and twenty-five (125) respondents, twenty-five percent (25%) were academics while the remaining seventy-five percent (75%) were students. The results showed a higher level of awareness of green computing among academics when compared to the students. Green computing practices were also shown to be highly adhered to among academics only. However, interestingly, the students were found to be more enthusiastic about greener computing in the future.
The study therefore suggests that awareness of green computing should be further strengthened among students from a curriculum point of view in order to improve the green use of ICTs in universities, especially in developing countries.
Keywords: awareness, green computing, green use, information technology
Procedia PDF Downloads 195
11032 Development of Wide Bandgap Semiconductor Based Particle Detector
Authors: Rupa Jeena, Pankaj Chetry, Pradeep Sarin
Abstract:
The study of fundamental particles and the forces governing them has always remained an attractive field of theoretical study. With the advancement and development of new technologies and instruments, it is now possible to perform particle physics experiments on a large scale for the validation of theoretical predictions. These experiments are generally carried out in a highly intense beam environment. This, in turn, requires the development of a detector prototype possessing properties like radiation tolerance, thermal stability, and fast timing response. Semiconductors like silicon, germanium, diamond, and gallium nitride (GaN) have been widely used for particle detection applications. Silicon and germanium, being narrow-bandgap semiconductors, require pre-cooling to suppress the noise from thermally generated intrinsic charge carriers. The application of diamond in large-scale experiments is rare owing to its high cost of fabrication, while GaN is one of the most extensively explored potential candidates. Considering all these requirements, we aim to introduce another wide-bandgap semiconductor into this active area of research: we have made an attempt to utilize the wide bandgap and other properties of rutile titanium dioxide (TiO2) for particle detection purposes. The thermal evaporation-oxidation (in a PID furnace) technique is used for the deposition of the film, and the metal-semiconductor-metal (MSM) electrical contacts are made using titanium+gold (Ti+Au) (20/80 nm). Characterization comprising X-ray diffraction (XRD), atomic force microscopy (AFM), ultraviolet (UV)-visible spectroscopy, and laser Raman spectroscopy (LRS) has been performed on the film to get detailed information about its surface morphology. On the other hand, electrical characterizations, such as current-voltage (I-V) measurements in dark and light conditions and tests with a laser, are performed to gain a better understanding of the working of the detector prototype.
All these preliminary tests of the detector will be presented.
Keywords: particle detector, rutile titanium dioxide, thermal evaporation, wide bandgap semiconductors
Procedia PDF Downloads 79
11031 School Curriculum Incorporating Rights to Live in Clean and Healthy Environment: Assessing Its Effectiveness
Authors: Sitaram Dahal
Abstract:
Among the many strategic and practical needs for overcoming the threats and challenges facing the global environment, a constitutional provision for the right to live in a clean and healthy environment is one, as is a school curriculum incorporating information on such rights. The Government of Nepal has introduced information on the right to live in a clean and healthy environment, as provisioned in its interim constitution of 2007, into the secondary-level curriculum of formal education. As the predetermined specific objective of this curriculum is to prepare students who are conscious of citizens’ rights and responsibilities and are able to adopt the functions, duties, and rights of rights holders and duty bearers, the study was designed to assess the effectiveness of such a curriculum. The study was conducted in one private school and one community school. It shows that the curriculum has been able to make students responsible duty bearers, as they were aware of their habits towards the environment, whereas only very few students are sufficiently aware of being rights holders. Students of the community school were aware rights holders, as they complain if they are not satisfied with the environment of the school itself, but the private school is far behind in this respect. It can be said that a curriculum with only a small portion of information on such rights might not be capable of meeting its objective.
Keywords: curriculum, environmental rights, constitution, effectiveness
Procedia PDF Downloads 326
11030 Online Delivery Approaches of Post Secondary Virtual Inclusive Media Education
Authors: Margot Whitfield, Andrea Ducent, Marie Catherine Rombaut, Katia Iassinovskaia, Deborah Fels
Abstract:
Learning how to create inclusive media, such as closed captioning (CC) and audio description (AD), in North America is restricted to private-sector, proprietary company-based training. We are delivering (through synchronous and asynchronous online learning) the first Canadian post-secondary, practice-based continuing education course package in inclusive media for broadcast production and processes. Despite the prevalence of CC and AD taught within the field of translation studies in Europe, North America has no comparable field of study. This novel approach to audiovisual translation (AVT) education develops evidence-based methodological innovations, stemming from user-study research with blind/low-vision and Deaf/hard-of-hearing audiences for television and theatre, undertaken at Ryerson University. Knowledge outcomes from the courses include: a) understanding how CC/AD fit within disability/regulatory frameworks in Canada; b) knowledge of how CC/AD could be employed in the initial stages of production development within broadcasting; c) writing and/or speaking techniques designed for media; d) hands-on practice in captioning re-speaking techniques and open-source technologies, or in AD techniques; e) understanding of audio production technologies and editing techniques. The case study of the curriculum development and deployment, involving first-time online course delivery by academic and practitioner-based instructors in the introductory captioning and audio description courses (CDIM 101 and 102), will compare the two instructors' approaches to learning design, including the ratio of synchronous to asynchronous classroom time and technological engagement tools on meeting software platforms, such as breakout rooms and polling. Student reception of these two approaches will be analysed using qualitative thematic and quantitative survey analysis.
Thus far, anecdotal conversations with students suggest that they prefer synchronous to asynchronous learning within our hands-on online course delivery method.
Keywords: inclusive media theory, broadcasting practices, AVT post-secondary education, respeaking, audio description, learning design, virtual education
Procedia PDF Downloads 183
11029 Effectiveness of ATMS (Advanced Transport Management Systems) in Asuncion, Paraguay
Authors: Sung Ho Oh
Abstract:
Advanced traffic lights, a traffic information collection and provision system, CCTVs for traffic control, and a traffic information center were installed in Asuncion, the capital of Paraguay. A pre-post comparison of the installation found significant changes: even though traffic volumes increased, travel speeds were higher, so that travel times from origin to destination decreased. The estimated savings in travel time, fuel cost, and environmental cost amount to about 47 million US dollars per year. Satisfaction survey results for the installation are presented together with an analysis of statistical significance.
Keywords: advanced transport management systems, effectiveness, Paraguay, traffic lights
Procedia PDF Downloads 352
11028 Quality Assurance in Translation Crowdsourcing: The TED Open Translation Project
Authors: Ya-Mei Chen
Abstract:
The participatory culture enabled by Web 2.0 technologies has led to the emergence of online translation crowdsourcing, which relies mainly on the collective intelligence of volunteer translators. Because many volunteer translators do not have formal translator training, concerns have been raised about the quality of crowdsourced translations. Some empirical research has examined the translation quality of for-profit crowdsourcing initiatives. However, quality assurance of non-profit translation crowdsourcing has rarely been explored in detail. Using the TED Open Translation Project as a case study, this paper investigates how the translation-review-approval method adopted by TED can (1) direct the volunteer translators’ use of translation strategies as well as the reviewers’ adoption of revising strategies and (2) shape the final translation products. To examine the actual effect of TED’s translation-review-approval method, this paper focuses on its two major quality assurance mechanisms, namely TED’s style guidelines and quality review. Based on an anonymous questionnaire, this research will first explore whether the volunteer translators and reviewers are aware of the style guidelines and whether their use of translation strategies is similar to that advised in the guidelines. The questionnaire, which will be posted online, consists of two parts: demographic information and translation strategies. Invitations to complete it will be distributed through TED Translator Facebook groups. To investigate whether the style guidelines have any substantial impact on actual subtitling practices, a comparison will be made between the original English subtitles of 20 TED talks (each around 5 to 7 minutes long) and their Chinese subtitle translations to identify regularly adopted strategies.
Concerning the function of the reviewing stage, a comparative study will be conducted between the drafts of Chinese subtitles for 10 short English talks and the revised versions of these drafts, so as to examine the actual revising strategies and their effect on translation quality. Based on the results obtained from the questionnaire and the textual comparisons, this paper provides an in-depth analysis of quality assurance in the TED Open Translation Project. It is hoped that this research, through a detailed investigation of non-profit translation crowdsourcing, can enable translation researchers and practitioners to better understand quality control in translation crowdsourcing in the digital age.
Keywords: quality assurance, TED, translation crowdsourcing, volunteer translators
Procedia PDF Downloads 231
11027 Bioengineering of a Plant System to Sustainably Remove Heavy Metals and to Harvest Rare Earth Elements (REEs) from Industrial Wastes
Authors: Edmaritz Hernandez-Pagan, Kanjana Laosuntisuk, Alex Harris, Allison Haynes, David Buitrago, Michael Kudenov, Colleen Doherty
Abstract:
Rare Earth Elements (REEs) are critical metals for modern electronics, green technologies, and defense systems. However, due to their dispersed nature in the Earth’s crust, frequent co-occurrence with radioactive materials, and similar chemical properties, acquiring and purifying REEs is costly and environmentally damaging, restricting access to these metals. Plants could serve as resources for bioengineering REE mining systems. Although there is limited information on how REEs affect plants at the cellular and molecular levels, plants with high REE tolerance and hyperaccumulation have been identified. This dissertation aims to develop a plant-based system for harvesting REEs from industrial waste material, with a focus on acid mine drainage (AMD), a toxic coal mining product. The objectives are 1) to develop a non-destructive, in vivo method for REE detection in Phytolacca (REE hyperaccumulator) plants utilizing fluorescence spectroscopy, with a primary focus on dysprosium; 2) to characterize the uptake of REEs and heavy metals from AMD in Phytolacca americana and Phytolacca acinosa (an REE hyperaccumulator) for potential implementation in the plant-based system; and 3) to implement the REE detection method to identify REE-binding proteins and peptides that could enhance the uptake of, and selectivity for, targeted REEs in the plants used in the system. The candidates are known REE-binding peptides or proteins, orthologs of known metal-binding proteins from REE hyperaccumulator plants, and novel proteins and peptides identified by comparative plant transcriptomics. Lanmodulin, a high-affinity REE-binding protein from methylotrophic bacteria, is used as a benchmark for the REE-protein binding fluorescence assays and is expressed in A. thaliana to test for changes in REE tolerance and uptake by the plant.
Keywords: phytomining, agromining, rare earth elements, pokeweed, phytolacca
Procedia PDF Downloads 15