Search results for: computer and information
10226 Effect of Social Media on Knowledge Work
Authors: Pekka Makkonen, Georgios Lampropoulos, Kerstin Siakas
Abstract:
This paper examines the impact of social media on knowledge work. It discloses and highlights which specific aspects, areas and tasks of knowledge work can be improved by the use of social media. Moreover, the study includes a survey about higher education students’ viewpoints in regard to the use of social media as a means to enhance knowledge work and knowledge sharing. The analysis has been conducted based both on empirical data and on discussions about the sources dealing with knowledge work and how it can be enhanced by using social media. The results show that social media can improve knowledge work, knowledge building and maintenance tasks in which communication, information sharing and collaboration play a vital role. Additionally, by using social media, personal, collaborative and supplementary work activities can be enhanced. Based on the results of the study, we suggest how knowledge work can be enhanced when using the contemporary information and communications technologies (ICTs) of the 21st century and recommend future directions towards improving knowledge work.
Keywords: knowledge work, social media, social media services, improving work performance
Procedia PDF Downloads 161
10225 Identifying Barriers of Implementing Building Information Modelling in Construction
Authors: Kasra HosseinMostofi, Mohamadamin Oyar Hossein, Reza Mehdizadeh Anvigh
Abstract:
BIM is an innovative concept for the majority of firms operating in the industry. BIM offers a new paradigm to design, construct, operate, and maintain a facility. However, even with the most conscientious use, stakeholders can run into trouble during its implementation on a project or within an organization. At times, project stakeholders are unaware of the challenges that they can face with implementation at the project or organizational level. Therefore, the study aimed to identify and compile the barriers associated with BIM implementation at the project and organizational levels, as per the literature. Despite the innumerable advantages involved in exploiting BIM, there are some barriers to implementing it properly. These barriers have proved to be impediments that prevent academics and members of construction project teams from taking maximum advantage of its utilization. Although some research has been conducted to identify these barriers to BIM implementation in the construction industry, more research needs to be carried out among academics to identify these barriers in institutions and, most importantly, to make suggestions for eliminating these obstacles.
Keywords: building information modelling, construction, design and construction, designers
Procedia PDF Downloads 184
10224 Governance in the Age of Artificial Intelligence and E-Government
Authors: Mernoosh Abouzari, Shahrokh Sahraei
Abstract:
Electronic government is a way for governments to use new technologies to provide people with convenient access to government information and services, to improve the quality of those services and to offer broad opportunities to participate in democratic processes and institutions. It makes it possible to deliver government services to citizens around the clock, which increases people's satisfaction and their participation in political and economic activities. The expansion of e-government services and their movement towards intelligent systems can re-establish the relationship between government and citizens as well as among the elements and components of government itself. Electronic government is the result of applying information and communication technology (ICT) at the government level; in terms of the efficiency and effectiveness of government systems and the way services are provided, it creates tremendous changes that are followed by widespread public satisfaction. The main level of e-government services has today been realized through artificial intelligence systems, whose recent advances represent a revolution in the use of machines to support predictive decision-making and the classification of data. With deep learning tools, artificial intelligence can significantly improve the delivery of services to citizens and uplift the work of public service professionals, while also inspiring a new generation of technocrats to enter government. This smart revolution may set aside some functions of government and change its components; concepts such as governance, policymaking and democracy will change in the face of artificial intelligence technology, and the top-down position in governance may face serious change. If governments delay in adopting artificial intelligence, the balance of power will shift: private companies, as pioneers in this field, will monopolize the technology, the world order will come to depend on rich multinational companies and, in effect, algorithmic systems will become the ruling systems of the world. It can be said that the current revolution in information technology and biotechnology has been started by engineers, large companies and scientists who are rarely aware of the political complexities of their decisions and certainly do not represent anyone. Therefore, if liberalism, nationalism or any other ideology wants to organize the world of 2050, it should not only make sense of artificial intelligence and complex data algorithms but also weave them into a new and meaningful narrative. The changes caused by artificial intelligence in the political and economic order will thus lead to a major change in the way all countries deal with the phenomenon of digital globalization. In this paper, while discussing the role and performance of e-government, we examine the efficiency and application of artificial intelligence in e-government and consider the resulting developments in the new world and in the concepts of governance.
Keywords: electronic government, artificial intelligence, information and communication technology, system
Procedia PDF Downloads 94
10223 Flood Hazard Impact Based on Simulation Model of Potential Flood Inundation in Lamong River, Gresik Regency
Authors: Yunita Ratih Wijayanti, Dwi Rahmawati, Turniningtyas Ayu Rahmawati
Abstract:
Gresik is one of the districts in East Java Province, Indonesia. Gresik Regency has three major rivers, namely the Bengawan Solo River, the Brantas River, and the Lamong River. The Lamong River is a tributary of the Bengawan Solo River. Flood disasters that occur in Gresik Regency are often caused by the overflow of the Lamong River. The losses caused by flooding are very large and certainly detrimental to the affected people. Therefore, to minimize the impact caused by flooding, it is necessary to take preventive action. However, before taking preventive action, it is necessary to have information regarding potential inundation areas and water levels at various points. For this reason, a flood simulation model is needed. In this study, the simulation was carried out using the Geographic Information System (GIS) method with the help of Global Mapper software. The approach used in this simulation is a topographical approach based on Digital Elevation Model (DEM) data. DEM data have been widely used in hydrological analysis. The results obtained from this flood simulation are the distribution of flood inundation and the water level. The location of the inundation serves to determine the extent of the flooding that occurs, by reference to the 50-100 year design flood, while the water level serves to provide early warning information. Both will be very useful for finding out how much loss will be caused in the future by flooding in Gresik Regency, so that the Gresik Regency Regional Disaster Management Agency can take precautions before a flood disaster strikes.
Keywords: flood hazard, simulation model, potential inundation, global mapper, Gresik Regency
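As a rough illustration of the idea behind inundation mapping from elevation data (not the authors' Global Mapper workflow), the minimal sketch below marks DEM cells lying below an assumed flood water surface elevation and reports the resulting water depth; the 5x5 elevation grid and the 10.0 m flood level are invented example values.
```python
import numpy as np

def flood_inundation(dem, water_level):
    """Return a boolean inundation mask and a water-depth grid for a given
    flood water surface elevation (a simplified bathtub-style model)."""
    mask = dem < water_level               # cells whose ground elevation lies below the flood level
    depth = np.where(mask, water_level - dem, 0.0)
    return mask, depth

# Hypothetical 5x5 DEM (elevations in metres) and an assumed design flood level.
dem = np.array([
    [12.0, 11.5, 11.0, 10.5, 10.0],
    [11.8, 11.2, 10.6, 10.1,  9.7],
    [11.5, 10.9, 10.2,  9.6,  9.2],
    [11.2, 10.5,  9.8,  9.1,  8.8],
    [11.0, 10.2,  9.4,  8.9,  8.5],
])
mask, depth = flood_inundation(dem, water_level=10.0)
print("inundated cells:", int(mask.sum()))
print("maximum depth (m):", float(depth.max()))
```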
Procedia PDF Downloads 84
10222 Efficient Manageability and Intelligent Classification of Web Browsing History Using Machine Learning
Authors: Suraj Gururaj, Sumantha Udupa U.
Abstract:
Browsing the Web has emerged as the de facto activity performed on the Internet. Although browsing gets tracked, the manageability of Web browsing history is very poor. In this paper, we present a workable solution, implemented using machine learning and natural language processing techniques, for efficient management of a user’s browsing history. The significance of adding such a capability to a Web browser is that it ensures efficient and quick information retrieval from browsing history, which currently is very challenging. Our solution guarantees that any important websites visited in the past remain easily accessible thanks to intelligent and automatic classification. In a nutshell, the paper provides an implementation, as a browser extension, that intelligently classifies the browsing history into the most relevant category automatically, without any user intervention. This guarantees that no information is lost and increases productivity by saving the time spent revisiting websites that were of much importance.
Keywords: ad hoc retrieval, Chrome extension, supervised learning, tile, Web personalization
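As an illustration of the kind of supervised classification such an extension could perform (the paper's actual model, categories and training data are not specified here, so everything below is an assumption), a minimal sketch that classifies page titles with TF-IDF features and logistic regression:
```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled page titles; a real extension would train on a much larger corpus.
titles = [
    "Python list comprehension tutorial", "Stack Overflow: fix null pointer exception",
    "Cheap flights to Rome", "Hotel booking deals this weekend",
    "Premier League results and table", "NBA playoff highlights",
]
labels = ["programming", "programming", "travel", "travel", "sports", "sports"]

# TF-IDF on unigrams and bigrams feeding a multi-class logistic regression classifier.
classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
classifier.fit(titles, labels)

# Classify a newly visited page by its title before storing it in the history index.
print(classifier.predict(["Best museums to visit in Paris"])[0])
```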
Procedia PDF Downloads 376
10221 Arabic Light Stemmer for Better Search Accuracy
Authors: Sahar Khedr, Dina Sayed, Ayman Hanafy
Abstract:
Arabic is one of the most ancient and critical languages in the world. It has more than 250 million native speakers, and more than twenty countries have Arabic as one of their official languages. In the past decade, we have witnessed a rapid evolution in smart devices, social networks and the technology sector, which has led to the need to provide tools and libraries that properly tackle the Arabic language in different domains. Stemming is one of the most crucial linguistic fundamentals. It is used in many applications, especially in information extraction and text mining. The motivation behind this work is to enhance the Arabic light stemmer to serve the data mining industry and to make it available to the open source community. The presented implementation enhances the Arabic light stemmer by extending it with a new set of rules and patterns accompanied by an adjusted procedure. The study shows a significant enhancement in search accuracy, with an average 10% improvement in comparison with previous works.
Keywords: Arabic data mining, Arabic information extraction, Arabic light stemmer, Arabic stemmer
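The following minimal sketch illustrates the general idea of light stemming, i.e., stripping a small set of prefixes and suffixes while preserving a minimum stem length; the prefix/suffix lists and length thresholds are simplified assumptions in the spirit of well-known light stemmers and do not reproduce the enhanced rule set described in the paper.
```python
# Illustrative affix lists; the paper's actual rules and added patterns are not reproduced here.
PREFIXES = ["وال", "بال", "كال", "فال", "ال", "لل", "و"]
SUFFIXES = ["ها", "ان", "ات", "ون", "ين", "يه", "ية", "ه", "ة", "ي"]

def light_stem(word: str) -> str:
    # Strip at most one leading prefix, keeping at least three letters of the stem.
    for prefix in PREFIXES:
        if word.startswith(prefix) and len(word) - len(prefix) >= 3:
            word = word[len(prefix):]
            break
    # Strip suffixes greedily while the remaining stem stays long enough.
    changed = True
    while changed:
        changed = False
        for suffix in SUFFIXES:
            if word.endswith(suffix) and len(word) - len(suffix) >= 3:
                word = word[:-len(suffix)]
                changed = True
                break
    return word

print(light_stem("المعلومات"))  # yields "معلوم" under these simplified rules
```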
Procedia PDF Downloads 308
10220 The Role of Information and Communication Technology in Curbing Electoral Malpractices in Nigeria
Authors: Fred Fudah Moveh, Muhammad Abba Jallo
Abstract:
Electoral fraud remains a persistent threat to democracy in Nigeria, undermining public trust and stalling political development. This study explores the role of Information and Communication Technology (ICT) in curbing electoral fraud, focusing on its application in recent Nigerian elections. The paper identifies the main forms of electoral fraud, evaluates the effectiveness of ICT-based interventions like the Permanent Voter Card (PVC) and the Bi-modal Voter Accreditation System (BVAS), and discusses challenges such as poor infrastructure, voter intimidation, and legal inadequacies. Data was collected through structured questionnaires and interviews and analyzed using SPSS software. Results reveal that while ICT has mitigated some forms of fraud, systemic issues continue to hinder its full potential. The study concludes with recommendations for enhancing the application of ICT in Nigeria’s electoral process.
Keywords: ICT, electoral fraud, election process, Nigeria, political instability
Procedia PDF Downloads 26
10219 The Coexistence of Creativity and Information in Convergence Journalism: Pakistan's Evolving Media Landscape
Authors: Misha Mirza
Abstract:
In recent years, the definition of journalism in Pakistan has changed, and so has the mindset of people and their approach to a news story. For the audience, news has become more interesting than a drama or a film. This research thus provides an insight into Pakistan’s evolving media landscape. It tries not only to bring forth the outcomes of cross-platform cooperation between print and broadcast journalism but also to give an insight into the interactive data visualization techniques being used. Storytelling in Pakistani journalism has evolved from merely depicting the truth to tweaking, fabricating and producing docu-dramas. It aims to look into how news is translated into a visual. Pakistan has a diverse cultural heritage, and by engaging audiences through the media, this history translates into the storytelling platforms of today. The paper explains how journalists are thriving in a converging media environment and provides an analysis of the narratives in television talk shows today. ‘Jack of all, master of none’ is being challenged by journalists today: one has to be a quality information gatherer and an effective storyteller at the same time. Are journalists really looking more at what sells than at what matters? Express Tribune is a very popular news platform among the youth. Not only is its newspaper more attractive than its competitors’, but its style of narrative and interactive web stories also lead to well-rounded news. Interviews are used as the basic methodology to gain insight into how data visualization is accomplished. The quest to find the difference between the visualization of information and the visualization of knowledge has led the author to delve into the work of David McCandless in his book ‘Knowledge Is Beautiful’. Journalism in Pakistan has evolved from information to a combination of knowledge, infotainment and comedy. What is criticized most by society most often becomes the breaking news. Circulation in today’s world is carried out in cultural and social networks. In recent times, we have come across many examples where people have gained overnight popularity by releasing songs with substandard lyrics or senseless videos, perhaps because creativity has taken over from information. This paper thus discusses the various platforms of convergence journalism from Pakistan’s perspective. The study concludes by showing how Pakistani truck art pop culture coexists with all the platforms of convergence journalism. The changing media landscape thus challenges the basic rules of journalism. The slapstick humor and ‘jhatka’ in Pakistani talk shows have evolved from Pakistani truck art poetry. Mobile journalism has taken over all the other mediums of journalism; however, Pakistani culture coexists with the converging landscape.
Keywords: convergence journalism in Pakistan, data visualization, interactive narrative in Pakistani news, mobile journalism, Pakistan's truck art culture
Procedia PDF Downloads 284
10218 Method of Complex Estimation of Text Perusal and Indicators of Reading Quality in Different Types of Commercials
Authors: Victor N. Anisimov, Lyubov A. Boyko, Yazgul R. Almukhametova, Natalia V. Galkina, Alexander V. Latanov
Abstract:
Modern commercials presented on billboards, TV and on the Internet contain a lot of information about the product or service in text form. However, this information cannot always be perceived and understood by consumers. Typical sociological focus group studies often cannot reveal important features of how the information read in text messages is interpreted and understood. In addition, there is no reliable method to determine the degree of understanding of the information contained in a text. The mere fact of viewing a text does not mean that the consumer has perceived and understood its meaning. At the same time, tools based on marketing analysis allow the process of reading and understanding a text to be estimated only indirectly. Therefore, the aim of this work is to develop a valid method of recording objective indicators in real time for assessing the fact of reading and the degree of text comprehension. Psychophysiological parameters recorded during text reading can form the basis for this objective method. We studied the relationship between multimodal psychophysiological parameters and the process of text comprehension during reading using correlation analysis. We used eye-tracking technology to record eye movement parameters to estimate visual attention, electroencephalography (EEG) to assess cognitive load, and polygraphic indicators (skin-galvanic reaction, SGR) that reflect the emotional state of the respondent during text reading. We revealed reliable interrelations between perceiving the information and the dynamics of the psychophysiological parameters while reading the text in commercials. Eye movement parameters reflected the difficulties arising in respondents while perceiving ambiguous parts of the text. EEG dynamics in the alpha band were related to the cumulative effect of cognitive load. SGR dynamics were related to the emotional state of the respondent and to the meaning of the text and the type of commercial. EEG and polygraph parameters together also reflected the mental difficulties of respondents in understanding the text and showed significant differences between cases of low and high text comprehension. We also revealed differences in psychophysiological parameters for different types of commercials (static vs. video; financial vs. cinema vs. pharmaceutics vs. mobile communication, etc.). Conclusions: Our methodology allows us to perform a multimodal evaluation of text perusal and the quality of text reading in commercials. In general, our results indicate the possibility of designing an integral model to estimate the comprehension of a commercial text on a percentage scale based on all the observed markers.
Keywords: reading, commercials, eye movements, EEG, polygraphic indicators
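As a hedged illustration of the correlation-analysis step only (the study's real recordings are not reproduced), the sketch below correlates a hypothetical comprehension score with two synthetic psychophysiological measures using Pearson's r:
```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-respondent measures; the study's actual variables (fixation durations,
# alpha-band EEG power, SGR amplitude, comprehension scores) would replace these.
rng = np.random.default_rng(0)
comprehension = rng.normal(0.7, 0.1, size=30)                              # comprehension test score
fixation_time = 400 - 150 * comprehension + rng.normal(0, 20, size=30)     # ms per word (synthetic)
alpha_power   = 5 + 3 * comprehension + rng.normal(0, 0.5, size=30)        # EEG alpha band (synthetic)

for name, signal in [("fixation_time", fixation_time), ("alpha_power", alpha_power)]:
    r, p = pearsonr(comprehension, signal)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```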
Procedia PDF Downloads 166
10217 Modular Data and Calculation Framework for a Technology-based Mapping of the Manufacturing Process According to the Value Stream Management Approach
Authors: Tim Wollert, Fabian Behrendt
Abstract:
Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company’s perspective. Whereas the design principles, e.g., pull, value-adding and customer orientation, remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activities. The digitalization of processes in the context of Industry 4.0 creates new opportunities to reduce these manual efforts and make the VSM approach more agile. The paper at hand aims at providing a modular data and calculation framework that utilizes the available business data provided by information and communication technologies for automating the value stream mapping process, with a focus on the manufacturing process.
Keywords: lean management 4.0, value stream management (VSM) 4.0, dynamic value stream mapping, enterprise resource planning (ERP)
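To make the calculation side of such a framework concrete, the sketch below derives basic value stream figures (processing time, lead time, flow efficiency) from ERP-like step records; the step names, times and the data structure are assumptions for illustration, not part of the framework described here.
```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    cycle_time_s: float      # value-adding processing time per part
    waiting_time_s: float    # inventory/waiting time in front of the step (from ERP bookings)

# Hypothetical steps derived from ERP transaction data rather than a manual shop-floor walk.
steps = [
    ProcessStep("cutting",  cycle_time_s=45,  waiting_time_s=4 * 3600),
    ProcessStep("welding",  cycle_time_s=120, waiting_time_s=8 * 3600),
    ProcessStep("assembly", cycle_time_s=300, waiting_time_s=2 * 3600),
]

processing_time = sum(s.cycle_time_s for s in steps)
lead_time = processing_time + sum(s.waiting_time_s for s in steps)
print(f"total processing time: {processing_time} s")
print(f"production lead time:  {lead_time} s")
print(f"flow efficiency:       {processing_time / lead_time:.2%}")
```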
Procedia PDF Downloads 150
10216 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies
Authors: Mark Andrew
Abstract:
Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for ‘systems of systems’ innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems’ development stages, namely: technology domain categorisation, scanning results examining novel systems’ signals and signs, potential system-of-systems implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. The importance of forecasting accuracy and reliability is considered a vital element in guiding technology development to afford stronger contingencies as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.
Keywords: forecasting, technology futures, uncertainty, complexity
Procedia PDF Downloads 114
10215 Internet-Based Architecture for Machine-to-Machine Communication of a Public Security Network
Authors: Ogwueleka Francisca Nonyelum, Jiya Muhammad
Abstract:
Poor communication between the victims of burglaries, road and fire accidents and the responding agencies, and the lack of a quick emergency response by the agencies, can be addressed through Machine-to-Machine (M2M) communication. A distress caller is expected to make a call through a network to the respective agency for an emergency response, but due to some challenges, this often becomes arduous and futile. This research puts forth an Internet-based architecture for Machine-to-Machine (M2M) communication to enhance information dissemination in the National Public Security Communication System (NPSCS) network. M2M enables the flow of data between machines, and ultimately between machines and people, with information flowing from a machine over a network and then through a gateway to a system where it is reviewed and acted on. The research findings showed that an Internet-based architecture for M2M communication is most suitable for the deployment of a public security network, which will allow machines to use the Internet to talk to each other.
Keywords: machine-to-machine (M2M), internet-based architecture, network, gateway
Procedia PDF Downloads 484
10214 In Response to Worldwide Disaster: Academic Libraries’ Functioning During COVID-19 Pandemic Without a Policy
Authors: Dalal Albudaiwi, Mike Allen, Talal Alhaji, Shahnaz Khadimehzadah
Abstract:
As a pandemic, COVID-19 has impacted the whole world since November 2019: every organization, industry, and institution has been negatively affected by the coronavirus, and uncertainty about how long the pandemic would last caused chaos at all levels. Like other institutions, public libraries were affected and shifted to online services and resources. Internationally, it has been observed that only some public libraries were well prepared for a disaster such as the pandemic, and collections, users, services, technologies, staff, and budgets were all affected. Public libraries’ policies did not mention any plan for such a pandemic; instead, their guidelines contain several rules about disasters in general, such as natural disasters. In this pandemic situation, libraries have faced difficult circumstances. However, public libraries have always been aware of the role they play in serving their communities in both ordinary and critical times. This role dwells on the traditional function public libraries fulfil in providing information services and sources to satisfy their communities’ information needs, remarkably increasing people’s awareness of the importance of informational enrichment and enhancing society’s skills in dealing with information and information sources. Under critical circumstances, libraries play a different role: it goes beyond the traditional part of information provider to the untraditional role of a social institution that serves the community with whatever capabilities it has. This study takes two significant directions. The first focuses on investigating how libraries have responded to COVID-19 and how they manage disasters within their organization. The second focuses on how libraries help their communities to act during disasters and to recover from the consequences. The study examines how libraries prepare for disasters and the role of public libraries during disasters. We also propose “measures” as a model that libraries can use to evaluate the effectiveness of their response to disasters, with a focus on how libraries responded to this new disaster. The study therefore aims to develop a comprehensive policy that includes responding to a crisis such as COVID-19. An analytical lens inside the library as an organization and outside its walls will be documented based on an analysis of disaster-related literature published in LIS publications. The study employs content analysis (CA), a methodology widely used in library and information science. The critical contribution of this work lies in the solutions it provides to libraries and planners for preparing crisis management plans and policies, specifically to face a new global disaster such as the COVID-19 pandemic. Moreover, the study will help library directors to evaluate their strategies and improve them properly. Its significance lies in guiding library directors to advance their libraries’ goals on crucial issues such as saving time, avoiding losses, saving budget, acting quickly during a crisis, maintaining the libraries’ role during pandemics, finding the best response to disasters, and creating a plan/policy as a sample for all libraries.
Keywords: Covid-19, policy, preparedness, public libraries
Procedia PDF Downloads 80
10213 The Fracture Resistance of Zirconia Based Dental Crowns from Cyclic Loading: A Function of Relative Wear Depth
Authors: T. Qasim, B. El Masoud, D. Ailabouni
Abstract:
This in vitro study focused on investigating the fatigue resistance of veneered zirconia molar crowns with different veneering ceramic thicknesses, simulating the relative wear depths, under simulated cyclic loading. A mandibular first molar was prepared and then scanned using computer-aided design/computer-aided manufacturing (CAD/CAM) technology to fabricate 32 zirconia copings of uniform 0.5 mm thickness. The manufactured copings were then veneered with 1.5 mm, 1.0 mm, 0.5 mm, and 0.0 mm layers, representing 0%, 33%, 66%, and 100% relative wear of a normal ceramic thickness of 1.5 mm. All samples were thermally aged with 6000 thermo-cycles of 2 minutes in distilled water between 5 ˚C and 55 ˚C. The samples were then subjected to cyclic fatigue and fracture testing using an SD Mechatronik chewing simulator. The samples were loaded up to 1.25x10⁶ cycles or until they failed. During fatigue testing, extensive cracks were observed in samples with a 0.5 mm veneering layer thickness. The 1.5 mm and 1.0 mm veneering layer thickness groups did not differ in terms of the loads necessary to cause an initial crack or final failure. All ceramic zirconia-based crown restorations with varying occlusal veneering layer thicknesses appeared to be fatigue resistant. Fracture load measurements for all tested groups before and after fatigue loading exceeded the clinical chewing forces in the posterior region. In general, the fracture loads increased after fatigue loading and with the increase in the thickness of the occlusal layering ceramic.
Keywords: all ceramic, cyclic loading, chewing simulator, dental crowns, relative wear, thermal ageing
Procedia PDF Downloads 142
10212 A Review of Intelligent Fire Management Systems to Reduce Wildfires
Authors: Nomfundo Ngombane, Topside E. Mathonsi
Abstract:
Remote sensing and satellite imaging have been widely used to detect wildfires; nevertheless, the technologies present some limitations in terms of early wildfire detection as the technologies are greatly influenced by weather conditions and can miss small fires. The fires need to have spread a few kilometers for the technologies to provide accurate detection. The South African Advanced Fire Information System uses MODIS (Moderate Resolution Imaging Spectroradiometer) as satellite imaging. MODIS has limitations as it can exclude small fires and can fall short in validating fire vulnerability. Thus, in the future, a Machine Learning algorithm will be designed and implemented for the early detection of wildfires. A simulator will be used to evaluate the effectiveness of the proposed solution, and the results of the simulation will be presented.
Keywords: moderate resolution imaging spectroradiometer, advanced fire information system, machine learning algorithm, detection of wildfires
Procedia PDF Downloads 78
10211 A Comparative Study of Environment Risk Assessment Guidelines of Developing and Developed Countries Including Bangladesh
Authors: Syeda Fahria Hoque Mimmi, Aparna Islam
Abstract:
Genetically engineered (GE) plants are the need of the hour to meet the increased demand for food. A complete set of regulations needs to be followed from the development of a GE plant to its release into the environment. The whole regulatory system is categorized into separate stages to maintain proper biosafety. Environmental risk assessment (ERA) is one of the crucial stages in the whole process. ERA identifies potential risks and their impacts through science-based evaluation conducted on a case-by-case basis. All the countries that deal with GE plants follow specific guidelines to conduct a successful ERA. In this study, the ERA guidelines of four developing and four developed countries, including Bangladesh, were compared. The ERA guidelines of India, Canada, Australia, the European Union, Argentina, Brazil, and the US were considered as a model against which to conduct the comparison with Bangladesh. Initially, ten parameters were identified to compare the required data and information among all the guidelines. Surprisingly, a substantial share of the data and information requirements (e.g., whether the intended modification/new traits of interest have been achieved, the growth habit of GE plants, the consequences of any potential gene flow to sexually compatible plant species upon the cultivation of GE plants, potential adverse effects on human health, etc.) matched across all the countries. However, a few differences in data requirements (e.g., agronomic conventions of non-transformed plants, whether applicants should clearly describe the experimental procedures followed, etc.) were also observed in the study. Moreover, it was found that only a few countries provide instructions on the quality of the data used for ERA. If these similarities are recognized in a more structured manner, then the approval pathway of GE plants can be shared.
Keywords: GE plants, ERA, harmonization, ERA guidelines, information and data requirements
Procedia PDF Downloads 187
10210 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data
Authors: S. Jurado, E. Pazmino
Abstract:
Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, oil extraction, etc. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community comprising researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions in order to determine porosity. Subsequently, the algorithm identifies the layer of void voxels next to the solid boundaries. An iterative process removes or 'burns' void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize the execution time and the use of computer memory, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn layer data during the iterative process. The medial axis determination was conducted by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and utilized to determine the pore-throat size distribution. A graphic user interface software was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software allows input of HRXMT data to calculate porosity, medial axis, and pore-throat size distribution and provides output in tabular and graphical formats. Preliminary tests of the software developed during this study achieved medial axis, pore-throat size distribution and porosity determination of 100³, 320³ and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data in the academic community.
Keywords: medial axis, pore-throat distribution, porosity, porous media
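A minimal sketch of the burn idea on a binarized voxel domain is given below, assuming that iterative binary erosion is an acceptable stand-in for the layer-by-layer removal described above; the random test domain, its size and the library calls are illustrative only, and the final medial-axis refinement and pore-throat extraction steps are not shown.
```python
import numpy as np
from scipy import ndimage

def burn_numbers(void):
    """Assign each void voxel the iteration at which it is 'burnt' when the outermost
    void layer (adjacent to solid or to already-burnt voxels) is removed each pass."""
    burn = np.zeros(void.shape, dtype=int)
    remaining = void.copy()
    layer = 0
    while remaining.any():
        layer += 1
        eroded = ndimage.binary_erosion(remaining)   # strips one layer of void voxels
        burn[remaining & ~eroded] = layer
        remaining = eroded
    return burn

# Hypothetical binarized domain: True = void (pore space), False = solid grain.
rng = np.random.default_rng(1)
domain = ndimage.binary_opening(rng.random((60, 60, 60)) > 0.55)

porosity = domain.mean()
burn = burn_numbers(domain)
print(f"porosity: {porosity:.3f}")
print(f"deepest burn layer (roughly the largest pore radius in voxels): {burn.max()}")
```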
Procedia PDF Downloads 115
10209 Factors Influencing Resolution of Anaphora with Collective Nominals in Russian
Authors: Anna Moskaleva
Abstract:
A prolific body of research in theoretical and experimental linguistics claims that the preference for conceptual or grammatical information in the process of agreement greatly depends on the type of agreement dependency. According to the agreement hierarchy, anaphoric agreement is more sensitive to the semantic or conceptual rather than the grammatical information of an antecedent. Furthermore, a higher linear distance between a pronoun and its antecedent is assumed to trigger semantic agreement, yet hierarchical distance is hardly examined in the research field, and the contribution of each distance factor is unclear. Apart from that, working memory volume is deemed to play a role in maintaining grammatical information during language comprehension. The aim of this study is to observe distance and working memory effects in the resolution of anaphora with collective nominals (e.g., team) and to take a closer look at the interaction of the factors. Collective nominals in many languages can have a holistic or distributive meaning and can be referred to by a singular or a plural pronoun, respectively. We investigated the linguistic factors of linear and rhetorical (hierarchical) distance and the more general factor of working memory volume in their ability to facilitate the interpretation of the number feature of a collective noun in Russian. An eye-tracking reading experiment on comprehension was conducted in which university students were presented with constructed texts including collective nouns and personal pronouns alluding to them. Different eye-tracking measures were calculated using statistical methods. The results showed a significant increase in reading time for singular pronouns when both distances were high, and no such effect when just one of the distances was high. For plural pronouns, reading time decreased with distance. No working memory effect was revealed in the experiment. The interplay of the distance factors indicates that not only the linear distance but also the hierarchical distance is of great importance in interpreting pronouns. The experimental findings also suggest that, apart from the agreement hierarchy, the preference for conceptual or grammatical information correlates with the distance between a pronoun and its antecedent.
Keywords: collective nouns, agreement hierarchy, anaphora resolution, eye-tracking, language comprehension
Procedia PDF Downloads 38
10208 The Image of Uganda in Germany: Assessing the Perceptions of Germans about Uganda as a Tourist Destination
Authors: K. V. Nabichu
Abstract:
The rationale of this research was to review how Germans perceive Uganda as a tourism destination, given that German visitor arrivals to Uganda remain few compared to other destinations such as Kenya. It was assumed that Uganda suffers from a negative image in Germany due to negative media influence. The study findings indicate that Uganda is not a popular travel destination in Germany and that there is a general lack of travel information about the country. Although the respondents had heard about Uganda and its beautiful attractions, good climate and friendly people, they also think Uganda is unsafe for travel. The findings further show that Uganda is a potential travel destination for Germans due to its beautiful landscape, rich culture, wildlife, primates and the Nile; however, political unrest, insecurity, fear of disease and poor hygiene hinder Germans from travelling to Uganda. The media, the Internet, and friends and relatives were the major primary sources of information on Uganda, while others knew about Uganda through their school lessons and sports. Uganda is not well advertised and promoted in Germany.
Keywords: destination Uganda and Germany, image, perception, negative media influence
Procedia PDF Downloads 340
10207 Determine the Optimal Path of Content Adaptation Services with Max Heap Tree
Authors: Shilan Rahmani Azr, Siavash Emtiyaz
Abstract:
Recent developments in computing and communication technologies have made mobile access to information much easier. Users can access information in different places using various devices with a wide variety of capabilities. Meanwhile, the format and details of electronic documents are changing each day. In these cases, a mismatch is created between the content and the client’s capabilities. Recently, service-oriented content adaptation has been developed, in which the adaptation tasks are delegated to a set of extended services. In this method, the main problem is to choose the most appropriate service among the accessible, distributed services. In this paper, a method for determining the optimal path to the best services, based on quality of service parameters and user preferences, is proposed using a max heap tree. The efficiency of this method, in contrast to previous content adaptation methods, lies in determining the optimal path to the best services as measured by these parameters. The results show the advantages and improvements of this method in comparison with the others.
Keywords: service-oriented content adaptation, QoS, max heap tree, web services
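As a hedged sketch of the selection step (not the paper's algorithm), the example below scores candidate services from assumed QoS values and user-preference weights and keeps them in a max heap, emulated here by negating scores in Python's min-heap heapq module:
```python
import heapq

# Hypothetical QoS records for candidate adaptation services.
services = [
    {"name": "image-transcoder-A", "latency_ms": 120, "cost": 0.02, "quality": 0.90},
    {"name": "image-transcoder-B", "latency_ms": 60,  "cost": 0.05, "quality": 0.80},
    {"name": "text-summarizer-C",  "latency_ms": 200, "cost": 0.01, "quality": 0.95},
]

# User preferences expressed as weights over (crudely normalised) QoS criteria.
weights = {"latency": 0.5, "cost": 0.2, "quality": 0.3}

def score(s):
    return (weights["latency"] * (1 - s["latency_ms"] / 1000)   # 1000 ms taken as a cap (assumption)
            + weights["cost"] * (1 - s["cost"])
            + weights["quality"] * s["quality"])

# heapq is a min-heap, so scores are negated to obtain max-heap behaviour.
heap = [(-score(s), s["name"]) for s in services]
heapq.heapify(heap)

best_score, best_service = heap[0]
print(f"best service: {best_service} (score {-best_score:.3f})")
```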
Procedia PDF Downloads 259
10206 A Graph Library Development Based on the Service-Oriented Architecture: Used for Representation of the Biological Systems in the Computer Algorithms
Authors: Mehrshad Khosraviani, Sepehr Najjarpour
Abstract:
Considering the usage of graph-based approaches in systems and synthetic biology, and the various types of graphs employed by them, a comprehensive graph library based on the three-tier architecture (3TA) was previously introduced for full representation of biological systems. Although a 3TA-based graph library had been proposed, the following three reasons motivated us to redesign the graph library based on the service-oriented architecture (SOA): (1) Maintaining the accuracy of the data related to an input graph (including its edges, its vertices, its topology, etc.) without involving the end user: since, in the case of 3TA, the library files are available to the end users, they may be utilized incorrectly, and consequently, invalid graph data will be provided to the computer algorithms. With the SOA, however, the graph registration operation is specified as a service through encapsulation of the library files. In other words, all control operations needed for the registration of valid data become the responsibility of the services. (2) Partitioning of the library product into different parts: with 3TA, the library product was provided as a whole, while here the product can be divided into smaller ones, such as an AND/OR graph drawing service, and each one can be provided individually. As a result, the end user is able to select any part of the library product, instead of all features, to add to a project. (3) Reduction of complexity: while 3TA requires several other libraries to be added for connecting to the database, in the SOA-based graph library the responsibility for providing the needed library resources is entrusted to the services themselves. Therefore, the end user who wants to use the graph library is not involved with its complexity. In the end, in order to make the library easier to control in the system and to restrict the end user from accessing the files, it was preferred to use the service-oriented architecture (SOA) over the three-tier architecture (3TA) and to redevelop the previously proposed graph library based on it.
Keywords: bio-design automation, biological system, graph library, service-oriented architecture, systems and synthetic biology
Procedia PDF Downloads 311
10205 Predictions of Dynamic Behaviors for Gas Foil Bearings Operating at Steady-State Based on Multi-Physics Coupling Computer Aided Engineering Simulations
Authors: Tai Yuan Yu, Pei-Jen Wang
Abstract:
A simulation scheme of rotational motions for predicting the behavior of bump-type gas foil bearings operating at steady state is proposed. The scheme is based on multi-physics coupling of computer-aided engineering packages, modularized with a computational fluid dynamics model and a structural elasticity model, to numerically solve the dynamic equations of motion of a hydrodynamically loaded shaft supported by an elastic bump foil. The bump foil is modelled as an infinite number of Hookean springs mounted on a stiff wall; hence, the top foil stiffness is constant along the periphery of the bearing housing. The hydrodynamic pressure generated by the air film lubrication is transferred to the top foil and induces an elastic deformation that is solved by a finite element method program, whereas the pressure profile applied to the top foil is solved by a finite element method program based on the Reynolds equation of lubrication theory. As a result, the equations of motion for the bearing shaft are solved iteratively by coupling the two finite element method programs simultaneously. In conclusion, the two-dimensional center trajectory of the shaft plus the deformation map of the top foil at constant rotational speed are calculated for comparison with the experimental results.
Keywords: computational fluid dynamics, fluid structure interaction multi-physics simulations, gas foil bearing, load capacity
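The fixed-point character of such a coupling can be sketched as below, where both the Reynolds-equation CFD module and the FEM foil model are replaced by toy surrogates; only the Hookean foil compliance and the under-relaxed iteration between film pressure and foil deflection are retained, and all numerical values (clearance, eccentricity, foil stiffness) are invented.
```python
import numpy as np

n = 180
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
c, ecc = 30e-6, 0.6            # radial clearance (m) and eccentricity ratio (assumed)
k_foil = 5e9                   # foil stiffness per unit area, N/m^3 (assumed)
p_ambient = 0.0                # gauge ambient pressure

h_rigid = c * (1 + ecc * np.cos(theta))   # rigid-bearing film thickness around the periphery
deflection = np.zeros(n)

for _ in range(200):
    film = h_rigid + deflection
    # Toy pressure surrogate: pressure rises where the film is thin (stands in for the Reynolds eq.)
    p = np.maximum(1e5 * (c / film - 1.0), p_ambient)
    new_deflection = p / k_foil                            # Hookean bump-foil compliance
    if np.max(np.abs(new_deflection - deflection)) < 1e-9:
        break                                              # converged film/foil state
    deflection = 0.5 * deflection + 0.5 * new_deflection   # under-relaxed update

print(f"max film pressure: {p.max():.0f} Pa, max foil deflection: {deflection.max() * 1e6:.2f} um")
```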
Procedia PDF Downloads 161
10204 Measuring Systems Interoperability: A Focal Point for Standardized Assessment of Regional Disaster Resilience
Authors: Joel Thomas, Alexa Squirini
Abstract:
The key argument of this research is that every element of systems interoperability is an enabler of regional disaster resilience, and arguably should become a focal point for standardized measurement of communities’ ability to work together. Few resilience research efforts have focused on the development and application of solutions that measurably improve communities’ ability to work together at a regional level, yet a majority of the most devastating and disruptive disasters are those that have had a regional impact. The key findings of the research include a unique theoretical, mathematical, and operational approach to tangibly and defensibly measure and assess systems interoperability required to support crisis information management activities performed by governments, the private sector, and humanitarian organizations. A most effective way for communities to measurably improve regional disaster resilience is through deliberately executed disaster preparedness activities. Developing interoperable crisis information management capabilities is a crosscutting preparedness activity that greatly affects a community’s readiness and ability to work together in times of crisis. Thus, improving communities’ human and technical posture to work together in advance of a crisis, with the ultimate goal of enabling information sharing to support coordination and the careful management of available resources, is a primary means by which communities may improve regional disaster resilience. This model describes how systems interoperability can be qualitatively and quantitatively assessed when characterized as five forms of capital: governance; standard operating procedures; technology; training and exercises; and usage. The unique measurement framework presented defines the relationships between systems interoperability, information sharing and safeguarding, operational coordination, community preparedness and regional disaster resilience, and offers a means by which to implement real-world solutions and measure progress over the course of a multi-year program. The model is being developed and piloted in partnership with the U.S. Department of Homeland Security (DHS) Science and Technology Directorate (S&T) and the North Atlantic Treaty Organization (NATO) Advanced Regional Civil Emergency Coordination Pilot (ARCECP) with twenty-three organizations in Bosnia and Herzegovina, Croatia, Macedonia, and Montenegro. The intended effect of the model implementation is to enable communities to answer two key questions: 'Have we measurably improved crisis information management capabilities as a result of this effort?' and, 'As a result, are we more resilient?'
Keywords: disaster, interoperability, measurement, resilience
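Purely as an illustration of how the five forms of capital could be rolled into a single figure (the model's actual weights, scales and scoring rules are not given here, so all numbers below are assumptions), a minimal weighted-index sketch:
```python
# Hypothetical maturity scores (0-5) and weights for the five forms of interoperability capital.
capitals = {
    "governance":                    {"weight": 0.25, "score": 3},
    "standard_operating_procedures": {"weight": 0.20, "score": 2},
    "technology":                    {"weight": 0.20, "score": 4},
    "training_and_exercises":        {"weight": 0.20, "score": 3},
    "usage":                         {"weight": 0.15, "score": 2},
}

max_score = 5
index = sum(c["weight"] * c["score"] for c in capitals.values()) / max_score
print(f"illustrative regional interoperability index: {index:.2%}")
```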
Procedia PDF Downloads 143
10203 Web-Based Learning in Nursing: The Sample of Delivery Lesson Program
Authors: Merve Kadioğlu, Nevin H. Şahin
Abstract:
Purpose: This research was organized to determine the influence of a web-based learning program. The program was developed to provide information about the normal delivery skill, one of the topics studied by nursing students who take the Woman Health and Illness course. Material and Methods: The study used a pre-test/post-test single-group quasi-experimental design. The pilot study consisted of a study group of 28 nursing students who agreed to participate. The findings were gathered via web-based instruments: a student information form, knowledge evaluation tests, the Web-Based Training Material Evaluation Scale and a web-based learning environment feedback form. In the analysis of the data, percentages, frequencies and the Wilcoxon signed-ranks test were used. The web-based instruction program was developed in the light of the full learning model, Mayer's research-based multimedia development principles and Gagne's instructional activities model. Findings: The average scores on the Web-Based Training Material Evaluation Scale were as follows: 'Instructional Suitability' 4.45, 'Suitability to Educational Program' 4.48, 'Visual Adequacy' 4.53, 'Programming Eligibility/Technical Adequacy' 4.00. The participants also mentioned that the program is successful and useful. A significant difference was found between the pre-test and post-test results of the seven modules (p < 0.05). Results: According to the pilot study data, the program was rated 'very good' by the study group. It was also found to be effective in increasing knowledge about normal labor.
Keywords: normal delivery, web-based learning, nursing students, e-learning
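For readers unfamiliar with the test used, the sketch below runs a Wilcoxon signed-ranks comparison on invented pre-test/post-test scores; the data are not the study's and serve only to show the pre/post analysis pattern.
```python
from scipy.stats import wilcoxon

# Hypothetical pre-test and post-test knowledge scores (out of 100) for a small pilot group.
pre  = [55, 60, 48, 70, 65, 52, 58, 63, 50, 67, 61, 56]
post = [78, 82, 70, 85, 80, 75, 79, 84, 72, 88, 77, 74]

statistic, p_value = wilcoxon(pre, post)   # paired, non-parametric comparison
print(f"Wilcoxon W = {statistic:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("post-test scores differ significantly from pre-test scores")
```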
Procedia PDF Downloads 178
10202 Query Task Modulator: A Computerized Experimentation System to Study Media-Multitasking Behavior
Authors: Premjit K. Sanjram, Gagan Jakhotiya, Apoorv Goyal, Shanu Shukla
Abstract:
In psychological research, laboratory experiments often face a trade-off between experimental control and mundane realism. With the advent of Immersive Virtual Environment Technology (IVET), this issue seems to be at bay. However, there is a growing challenge within IVET itself to design and develop systems or software that capture the psychological phenomena of everyday life. One such phenomenon of growing interest is ‘media-multitasking’. To aid laboratory research on media-multitasking, this paper introduces the Query Task Modulator (QTM), a computerized experimentation system for studying media-multitasking behavior in a controlled laboratory environment. The system provides a computerized platform on which experimenters can conduct experiments on media-multitasking in which participants are involved in a query task. The system has Instant Messaging, E-mail, and Voice Call features. The answers to the queries are provided in the information panel on the left-hand side, where participants have to search for them and feed the information into the respective communication media blocks as fast as possible. On the whole, the system collects multitasking behavioral data. To analyze performance, there is a separate output table that records the reaction times and responses of the participants individually. The information panel and all the media blocks appear in a single window in order to ensure multi-modality in media-multitasking and equal emphasis on all the tasks (thus avoiding the prioritization of a particular task). The paper discusses the development of QTM in the light of current techniques for studying media-multitasking.
Keywords: experimentation system, human performance, media-multitasking, query-task
Procedia PDF Downloads 557
10201 Convergence with IFRS: Evidence from Financial Statements
Authors: M. S. Turan, Dimple
Abstract:
Due to the implementation of IFRS by several developed and developing countries, India has no option other than to converge its accounting standards with IFRS. There are over 10,000 listed companies required to implement IFRS in India. IFRS-based financial information presented by a company is different from the same information provided under Indian GAAP. In this study, we have brought out and analyzed the effect of IFRS reporting on the financial statements of selected companies. The results reveal that convergence with IFRS brought prominent positive variations in the values of the quick ratio, debt/equity ratio, proprietary ratio and net profit ratio, while negative variations were brought in the values of the current ratio, debt to total assets ratio, operating profit ratio, return on capital employed and return on shareholders’ equity. It also reveals significant changes in the values of items of the balance sheet, profit and loss account and cash flow statement.
Keywords: IFRS, reporting standards, convergence process, results
Procedia PDF Downloads 334
10200 Polarization of Lithuanian Society on Issues Related to Language Politics
Authors: Eglė Žurauskaitė, Eglė Gudavičienė
Abstract:
The goal of this paper is to reveal how polarization is constructed through the use of impoliteness strategies. In general, the media helps to spread various ideas very fast, which means that processes of polarization are best revealed in computer-mediated communication (CMC) contexts. For this reason, the data for the research were collected from online texts about a current, highly divisive topic in Lithuania - Lithuanian language policy and regulations - because this topic is causing a lot of tension in Lithuanian society. Computer-mediated communication allows users to edit their message before they send it, which means that addressers carefully select verbal expressions to convey their message. In other words, each impoliteness strategy and its verbal expression were created intentionally. Impoliteness strategies in this research are understood as various ways to reach a communicative goal: to belittle the other. To reach this goal, the public opinions of various Lithuanian public figures (e.g., cultural figures, politicians, officials) were collected from news portals in 2019-2023 and analyzed using both quantitative and qualitative approaches. First, the problematic aspects of the language policy about which public figures complain were identified. Then, instances in which public figures take a defensive position were analyzed: how they express this position and what it reveals about Lithuanian culture. The findings of this research demonstrate how concepts of impoliteness theory can be applied in analyzing the process of polarization in Lithuanian society on issues related to the State language policy. Also, to reveal how polarization is constructed, these tasks were set: a) determine which impoliteness strategies are used throughout the process of creating polarization, b) analyze how they were expressed verbally (e.g., as advice, an offer, etc.).
Keywords: impoliteness, Lithuanian language policy, polarization, impoliteness strategies
Procedia PDF Downloads 57
10199 Development of Building Information Modeling in Property Industry: Beginning with Building Information Modeling Construction
Authors: B. Godefroy, D. Beladjine, K. Beddiar
Abstract:
In France, construction BIM actors commonly point to the gains BIM offers for building operation by integrating the whole life cycle of a building. Standardization at level 7 of development would achieve this stage of the digital model. Building owners include local public authorities, social landlords, public institutions (health and education), enterprises and facilities management companies; they have a dual role as owner and manager of their housing complexes. In a context of financial constraint, operational BIM aims to control costs, support long-term investment choices, renew the portfolio and enable environmental standards to be met. It assumes knowledge of the existing building stock, which is marked by its size and complexity. The information sought must be synthetic and structured, and it generally concerns an entire real estate complex. We conducted a study with professionals about their concerns and the ways they use BIM, to see how owners could benefit from this development. To obtain results, keeping in mind the recurring questions from project management about the needs of operators, we tested the following stages: 1) instil a minimal BIM culture in the operator's multidisciplinary teams and then in each business line; 2) learn, through BIM tools, how each trade adapts to operations; 3) understand the place and creation of a graphic and technical database management system and determine the components of its library according to their needs; 4) identify the cross-functional interventions of its managers by business line (operations, technical, information systems, purchasing and legal aspects); 5) set an internal protocol and define the impact of BIM on their digital strategy. In addition, continuity of management through the integration of construction models into the operation phase raises the question of interoperability: controlling the production of IFC files in the operator's proprietary format and the export and import processes, a solution rivalled by the traditional method of vectorizing paper plans. Companies that digitize housing complexes, and those in facilities management, produce IFC files directly according to their needs, without recourse to the construction model; they produce business models for operation and standardize the components and equipment that are useful for coding. We observed the consequences of the use of BIM in the property industry and made the following observations: a) the value of data prevails over graphics, and 3D is little used; b) the owner must, through his organization, promote the feedback of technical management information during the design phase; c) the operator's reflection on outsourcing concerns the acquisition of its information system and related services, weighing the risks and costs of internal or external development. This study allows us to highlight: i) the need for operators to organize internally before responding to construction project management; ii) the evolution towards automated methods for creating models dedicated to operation, for which a specialization would be required; iii) a review of project management communication: management continuity does not revolve around the building model alone; it must take into account the operator's environment and reflect on its scope of action.
Keywords: information system, interoperability, models for exploitation, property industry
Procedia PDF Downloads 144
10198 The Study of Internship Performances: Comparison of Information Technology Interns towards Students’ Types and Background Profiles
Authors: Shutchapol Chopvitayakun
Abstract:
The internship program is a compulsory course in many undergraduate programs in Thailand. It gives many senior students, as interns, the opportunity to practice their working skills in real organizations and also gives interns the chance to face real-world working problems. Interns also learn how to solve those problems through direct and indirect experience. In many schools, this program is a well-structured course with a contract or agreement made with real business organizations. Moreover, the program also offers interns opportunities to get jobs, after completing it, in the organizations where the internship took place. Interns also learn how to work as a team and how to interact with colleagues, trainers, and superiors of each organization in terms of social hierarchy, self-responsibility, and self-discipline. This research focuses on senior students of Suan Sunandha Rajabhat University, Thailand, majoring in the information technology program. They practiced their working skills, i.e., took internship programs, in the real business sector or in operating organizations in 2015-2016. Interns are categorized into two types: normal program and special program. In the special program, students study on weekday evenings from Monday to Friday or at weekends, and most of them work full-time or part-time jobs. In the normal program, students study during weekday working hours, and most of them do not work. The differences between these characteristics and the outcomes of internship performance were studied and analyzed in this research. This work applied statistical analysis to find out whether the internship performance of each intern type differs statistically or not.
Keywords: internship, intern, senior student, information technology program
Procedia PDF Downloads 263
10197 Integrating a Security Operations Centre with an Organization’s Existing Procedures, Policies and Information Technology Systems
Authors: M. Mutemwa
Abstract:
A Cybersecurity Operations Centre (SOC) is a centralized hub for network event monitoring and incident response. SOCs are critical when determining an organization’s cybersecurity posture because they can be used to detect, analyze and report on various malicious activities. For most organizations, a SOC is not part of the initial design and implementation of the Information Technology (IT) environment but rather an afterthought. As a result, it is not natively a plug-and-play component; therefore, there are integration challenges when a SOC is introduced into an organization. A SOC is an independent hub that needs to be integrated with the existing procedures, policies and IT systems of an organization, such as the service desk, ticket logging system, reporting, etc. This paper discusses the challenges of integrating a newly developed SOC into an organization’s existing IT environment. Firstly, the paper looks at which data sources should be incorporated into the Security Information and Event Management (SIEM) system for security posture monitoring, such as host machines, servers, network endpoints, software, applications, web servers, etc., that is, which systems need to be monitored first and in what order the rest of the systems should follow. Secondly, the paper describes how to integrate the organization’s ticket logging system with the SOC SIEM, that is, how cybersecurity-related incidents should be logged by both analysts and non-technical employees of an organization, as well as the priority matrix for incident types and incident notifications. Thirdly, the paper looks at how to communicate awareness campaigns from the SOC and also how to report on incidents that are found inside the SOC. Lastly, the paper looks at how to show value for the large investments that are poured into designing, building and running a SOC.
Keywords: cybersecurity operation centre, incident response, priority matrix, procedures and policies
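As a hedged sketch of what a priority matrix lookup behind such a ticket logging integration might look like (the incident categories, asset tiers and SLA targets below are assumptions, not the paper's matrix):
```python
# Illustrative priority matrix mapping (incident type, asset tier) to a priority and SLA.
PRIORITY_MATRIX = {
    ("malware",          "critical"): ("P1", "respond within 1 hour"),
    ("malware",          "standard"): ("P2", "respond within 4 hours"),
    ("phishing",         "critical"): ("P2", "respond within 4 hours"),
    ("phishing",         "standard"): ("P3", "respond within 1 business day"),
    ("policy_violation", "standard"): ("P4", "respond within 3 business days"),
}

def log_incident(incident_type: str, asset_tier: str, reporter: str) -> dict:
    # Unknown combinations fall back to a default priority (an assumption of this sketch).
    priority, sla = PRIORITY_MATRIX.get((incident_type, asset_tier),
                                        ("P3", "respond within 1 business day"))
    ticket = {"type": incident_type, "asset_tier": asset_tier,
              "reporter": reporter, "priority": priority, "sla": sla}
    print(f"ticket opened: {ticket}")
    return ticket

# A non-technical employee reports a phishing e-mail targeting a critical system.
log_incident("phishing", "critical", reporter="service-desk")
```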
Procedia PDF Downloads 153