Search results for: customer information process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23822

22052 Passengers’ Behavior Analysis under the Public Transport Disruption: An Agent-Based Simulation

Authors: M. Rahimi, F. Corman

Abstract:

This paper studies the travel behavior of passengers during a public transport disruption under information provision strategies. We develop a within-day, multi-agent simulation approach to evaluate the behavior of the agents under comprehensive scenarios covering various levels of information exposure as well as equilibrium and non-equilibrium conditions. In particular, we quantify the effects of information strategies during a disruption on passengers’ satisfaction, the number of involved agents, and the resulting delay. An agent-based micro-simulation model (MATSim) is applied to the city of Zürich, Switzerland, for activity-based simulation in a multimodal network. Statistical outcomes are analysed for all agents who may be involved in the disruption. Agents’ movements in the public transport network illustrate their adaptations to the available information about the disruption. Agents’ delays and utilities reveal that information significantly affects satisfaction and delay during a public transport disruption. Moreover, while earlier availability of the information reduces the consequent delay for the involved agents, it also increases the number of affected agents.

Keywords: agent-based simulation, disruption management, passengers’ behavior simulation, public transport

Procedia PDF Downloads 124
22051 Deep Learning Based Unsupervised Sport Scene Recognition and Highlights Generation

Authors: Ksenia Meshkova

Abstract:

With the increasing amount of multimedia data, it is very important to automate and speed up the process of obtaining metadata. This process means not just recognition of an object or its movement, but recognition of the entire scene rather than separate frames, with timeline segmentation as the final result. Labeling datasets is time consuming, and attributing characteristics to particular scenes is inherently difficult. In this article, we consider the application of autoencoders to unsupervised scene recognition and clustering based on interpretable features, focusing on the particular types of autoencoders relevant to our study. We examine the specificity of deep learning in relation to information theory and rate-distortion theory and describe solutions that address the poor interpretability of deep learning in media content processing. In conclusion, we present the results of a custom framework, based on autoencoders, that is capable of the scene recognition studied above, with highlight generation produced from this recognition. We do not describe the mathematics of neural networks in detail but clarify the necessary concepts and pay attention to important nuances.

Keywords: neural networks, computer vision, representation learning, autoencoders

Procedia PDF Downloads 109
22050 The Study of Security Techniques on Information System for Decision Making

Authors: Tejinder Singh

Abstract:

An information system (IS) is the flow of data across different levels and in different directions for decision making and data operations. Data can be compromised in various ways, such as manual or technical errors, data tampering, or loss of integrity, and the security layer of an IS, such as its firewall, is affected by such violations. The flow of data among the various levels of an information system is carried over a network, in the form of packets or frames. To protect these packets from unauthorized access and virus attacks, and to maintain their integrity, network security is an important factor, and various security techniques are used to protect data from piracy. This paper presents these security techniques and characterizes different harmful attacks with the help of detailed data analysis. It will be beneficial for organizations seeking to make their systems more secure, effective, and useful for future decision making.

Keywords: information systems, data integrity, TCP/IP network, vulnerability, decision, data

Procedia PDF Downloads 286
22049 Demand of Media and Information for the Public Relation Media for Local Learning Resource Salaya, Nakhon Pathom

Authors: Patsara Sirikamonsin, Sathapath Kilaso

Abstract:

This research aims to study the demand for media and information for public relations in Salaya, Nakhon Pathom. The research objectives are: 1. to examine communication problems and seek solutions and improvements for media information in Salaya, Nakhon Pathom; 2. to study opinions and the demand for media information in order to improve communication among the people of Salaya, Nakhon Pathom; 3. to explore the factors related to relationships and behavior in obtaining media information for public relations in Salaya, Nakhon Pathom. The research was conducted by questionnaire and analysed statistically using frequency, percentage, mean, and standard deviation. The results demonstrate: 1. The communication problems in Salaya, Nakhon Pathom are a lack of equipment, technological knowledge, and public relations. 2. Most people want media improvements so that public relations can be broadcast widely and nourish social values. This research therefore aims to create infographic media that are easily accessible, uncomplicated, and currently popular.

Keywords: media and information, the public relation printed media, local learning resource

Procedia PDF Downloads 143
22048 Review of Assessment of Integrated Information System (IIS) in Organisation

Authors: Mariya Salihu Ingawa, Sani Suleiman Isah

Abstract:

The assessment of an Integrated Information System (IIS) in an organisation is an important initiative that enables Information System (IS) managers, as well as top management, to understand the success status of their investment in IS integration efforts. Without a proper assessment, an organisation will not know its IIS status, which may affect its judgment on what action should be taken next. Current research on IIS assessment is lacking, and the related literature focuses mainly on assessing the technical aspect of IIS. It is argued that assessing the technical aspect alone is inadequate, since the organisational and strategic aspects of IIS should also be considered. The methods, techniques, and tools currently used by vendors for IIS assessment also lack comprehensive measures to fully assess an Integrated Information System in terms of the technical, organisational, and strategic domains. The purpose of this study is to establish critical success factors for measuring the success of an Integrated Information System. These factors are used as the basis for constructing an approach to comprehensively assess an IIS in an organisation. A comprehensive list of success factors for IIS assessment, established from the literature, was initially compiled, and expert surveys using both manual and online methods were conducted to verify the factors. Based on these factors, an instrument for IIS assessment was constructed. The results from a case study indicate that through this comprehensive assessment approach, not only is the level of success known, but the contributing factors are also revealed. This research contributes to the field of Information Systems, specifically in the area of Integrated Information System assessment.

Keywords: integrated information system, expert surveys, organisation, assessment

Procedia PDF Downloads 374
22047 Techno-Economic Assessment of Aluminum Waste Management

Authors: Hamad Almohamadi, Abdulrahman AlKassem, Majed Alamoudi

Abstract:

Dumping aluminum (Al) waste into landfills causes several health and environmental problems; the pyrolysis process could instead treat Al waste to produce AlCl₃ and H₂. A techno-economic feasibility assessment of Al waste pyrolysis was performed using the Aspen Plus software. The Aspen Plus simulation was employed to estimate the mass and energy balance of a plant assumed to process 100 dry metric tons of Al waste per day. This study examined two cases of Al waste treatment. The first case, without recycling, produces 355 tons of AlCl₃ and 9 tons of H₂ per day. In case 1, the conversion rate must be greater than 50% to make a profit, and the minimum selling price (MSP) for AlCl₃ is $768/ton; the plant would generate $25 million annually if the AlCl₃ were sold at $1000 per ton. In case 2, with recycling, the conversion has less impact on the plant's profitability than in case 1. Moreover, compared to case 1, the MSP of AlCl₃ has no significant influence on process profitability. In this scenario, if AlCl₃ were sold at $1000/ton, the process profit would be $58 million annually. Case 2 is better than case 1 because recycling Al generates a higher yield than converting it to AlCl₃ and H₂.
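The scale of the case 1 figures can be sanity-checked with simple arithmetic. A minimal sketch, in which the throughput and scenario price come from the abstract but the 330 on-stream days per year are a hypothetical assumption (the abstract's $25 million annual figure appears to be profit after costs, so a gross revenue well above it is consistent):

```python
# Back-of-the-envelope gross AlCl3 revenue for case 1 (no recycling).
# Throughput (355 t/day) and the $1000/ton price are from the abstract;
# the 330 operating days per year are an assumption, not stated there.
ALCL3_TONS_PER_DAY = 355
ALCL3_PRICE_PER_TON = 1_000       # USD, scenario price
OPERATING_DAYS_PER_YEAR = 330     # assumed on-stream days

annual_revenue = ALCL3_TONS_PER_DAY * ALCL3_PRICE_PER_TON * OPERATING_DAYS_PER_YEAR
print(annual_revenue)  # 117150000, i.e. roughly $117M gross per year
```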

Keywords: aluminum waste, aspen plus, process modelling, fast pyrolysis, techno-economic assessment

Procedia PDF Downloads 73
22046 Artificial Intelligence in Management Simulators

Authors: Nuno Biga

Abstract:

Artificial Intelligence (AI) allows machines to interpret information and learn from context analysis, giving them the ability to make predictions adjusted to each specific situation. In addition to learning by performing deterministic and probabilistic calculations, the 'artificial brain' also learns through information and data provided by those who train it, namely its users. The "Assisted-BIGAMES" version of the Accident & Emergency (A&E) simulator introduces the concept of a "Virtual Assistant" (VA) that provides users with useful suggestions, namely for the following operations: a) relocating workstations in order to shorten travelled distances and minimize the stress of those involved; b) identifying in real time the bottleneck(s) in the operations system so that they can be acted upon quickly; c) identifying resources that should be polyvalent so that the system can be more efficient; d) identifying the specific processes in which it may be advantageous to establish partnerships with other teams; and e) assessing possible solutions based on the suggested KPIs, allowing action monitoring to guide the (re)definition of future strategies. This paper is built on the BIGAMES© simulator and presents the conceptual AI model developed in a pilot project. Each Virtual Assisted BIGAME is a management simulator developed by the author that guides operational and strategic decision making, providing users with useful information in the form of management recommendations that make it possible to predict the actual outcome of different alternative strategic management actions. The pilot project incorporates results from 12 editions of the BIGAME A&E that took place between 2017 and 2022 at AESE Business School, based on the compilation of data that allows causal relationships to be established between decisions taken and results obtained.
The systemic analysis and interpretation of this information is materialised in Assisted-BIGAMES through a computer application called the "BIGAMES Virtual Assistant" that players can use during the game. Each participant in the Virtual Assisted-BIGAMES continually asks themselves which decisions they should make during the game in order to win the competition. To this end, the role of each team's VA consists of guiding the players to be more effective in their decision making by presenting recommendations based on AI methods. It is important to note that the VA's suggestions for action can be accepted or rejected by the managers of each team, and as the participants gain a better understanding of the game, they will more easily dispense with the VA's recommendations and rely more on their own experience, capability, and knowledge to support their own decisions. Preliminary results show that the introduction of the VA leads to faster learning of the decision-making process. The facilitator (Serious Game Controller) is responsible for supporting the players with further analysis, and the recommended action may or may not be aligned with the previous recommendations of the VA. All the information should be jointly analysed and assessed by each player, who is expected to add "emotional intelligence", a component absent from the machine learning process.

Keywords: artificial intelligence (AI), gamification, key performance indicators (KPI), machine learning, management simulators, serious games, virtual assistant

Procedia PDF Downloads 87
22045 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations

Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li

Abstract:

The research follows a systematic approach to optimizing the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as the test specimen, and the material chosen for machining is aluminum alloy 6061 due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling, and a HAAS ST-20 CNC machine is used for turning. Taguchi analysis is used to optimize the surface roughness of the machined parts. An L9 orthogonal array is designed for four controllable factors with three levels each, giving nine experimental runs per process (18 runs across the two processes). Signal-to-noise (S/N) ratios are calculated to achieve the target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow, and finish cut; for the milling process, they are feed rate, spindle speed, step-over, and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the uncontrollable factors on surface roughness. The optimal parameter settings were identified from the Taguchi analysis: the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10, respectively, for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70, respectively, for the milling process. Surface roughness moved closer to the target, from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process.
The purpose of this study is to utilize Taguchi design analysis efficiently to improve surface roughness.
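The Cp and Cpk values quoted above follow from the standard process-capability formulas. A minimal sketch, using the 75 ± 15 µin specification and the reported post-optimization turning mean of 68.50 µin, with a hypothetical process standard deviation of 1.35 µin chosen so that the indices reproduce the reported turning values:

```python
def cp(lsl, usl, sigma):
    """Process capability: specification width over the six-sigma spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(lsl, usl, mu, sigma):
    """Capability index: penalises a mean that is off-centre in the spec."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Surface-roughness spec from the abstract: 75 +/- 15 uin
LSL, USL = 60.0, 90.0
MU = 68.50       # reported post-optimization turning roughness (uin)
SIGMA = 1.35     # hypothetical sigma, chosen to match the reported indices

print(round(cp(LSL, USL, SIGMA), 2))       # 3.7  (reported Cp  = 3.70)
print(round(cpk(LSL, USL, MU, SIGMA), 2))  # 2.1  (reported Cpk = 2.10)
```

Note that Cpk is well below Cp here because the mean (68.50 µin) sits closer to the lower specification limit than to the target of 75 µin.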

Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling

Procedia PDF Downloads 140
22044 Computer Self-Efficacy, Study Behaviour and Use of Electronic Information Resources in Selected Polytechnics in Ogun State, Nigeria

Authors: Fredrick Olatunji Ajegbomogun, Bello Modinat Morenikeji, Okorie Nancy Chituru

Abstract:

Electronic information resources are highly relevant to students' academic and research needs but are grossly underutilized, despite the institutional commitment to making them available. This under-utilisation could be attributed to a low level of study behaviour coupled with a low level of computer self-efficacy. This study assessed computer self-efficacy, study behaviour, and the use of electronic information resources by students in selected polytechnics in Ogun State. A simple random sampling technique using Krejcie and Morgan's (1970) table was used to select 370 respondents, and a structured questionnaire was used to collect data. Data were analysed using frequency counts, percentages, means, standard deviations, Pearson Product Moment Correlation (PPMC), and multiple regression analysis. Results reveal that the internet (mean = 1.94), YouTube (mean = 1.74), and search engines (mean = 1.72) were the common information resources available to the students, while the internet (mean = 4.22) was the most utilized resource. The major reasons for using electronic information resources were to source materials and information (mean = 3.30), for research (mean = 3.25), and to augment class notes (mean = 2.90). The majority (91.0%) of the respondents have a high level of computer self-efficacy in the use of electronic information resources, reflected in selecting from screen menus (mean = 3.12), using data files (mean = 3.10), and efficient use of computers (mean = 3.06). Good preparation for tests (mean = 3.27) and examinations (mean = 3.26) and organization of tutorials (mean = 3.11) are the common study behaviours of the respondents; overall, 93.8% have good study behaviour. Inadequate computer facilities for accessing information (mean = 3.23) and poor internet access (mean = 2.87) were the major challenges confronting students' use of electronic information resources.
According to the PPMC results, study behaviour (r = 0.280) and computer self-efficacy (r = 0.304) have significant (p < 0.05) relationships with the use of electronic information resources. Regression results reveal that self-efficacy (β = 0.214) and study behaviour (β = 0.122) positively (p < 0.05) influenced students' use of electronic information resources. The study concluded that students' use of electronic information resources depends on the purpose, their computer self-efficacy, and their study behaviour. The study therefore recommended that management encourage students to improve their study habits and computer skills, as this will enhance their continuous and more effective utilization of electronic information resources.
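The PPMC coefficient reported above is the standard Pearson r. A minimal sketch of its computation, using hypothetical self-efficacy and e-resource-use scores (the real survey data are not given in the abstract):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 5-point scores for five students: self-efficacy vs. usage
eff = [3.1, 2.4, 4.0, 3.6, 2.0]
use = [3.0, 2.6, 3.9, 3.3, 2.2]
r = pearson_r(eff, use)  # r is approximately 0.98 for this toy sample
```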

Keywords: computer self-efficacy, study behaviour, electronic information resources, polytechnics, Nigeria

Procedia PDF Downloads 102
22043 Challenges of Implementing Zero Trust Security Based on NIST SP 800-207

Authors: Mazhar Hamayun

Abstract:

Organizations need to take a holistic approach to their Zero Trust strategic and tactical security needs. This includes using a framework-agnostic model that ensures all enterprise resources are accessed securely, regardless of their location. This can be achieved by implementing a security posture, monitoring that posture, and adjusting it through the Identify, Protect, Detect, Respond, and Recover functions. The target audience of this document includes those involved in the management and operational functions of risk, information security, and information technology: the chief information security officer, chief information officer, chief technology officer, and those leading digital transformation initiatives where Zero Trust methods can help protect an organization’s data assets.

Keywords: ZTNA, zero trust architecture, microsegmentation, NIST SP 800-207

Procedia PDF Downloads 65
22042 Exploring Open Process Innovation: Insights from a Systematic Review and Framework Development

Authors: Saeed Nayeri

Abstract:

This paper explores the feasibility of openness within firms' boundaries during process innovation and identifies the key determinants of open process innovation (OPI). Through a systematic review of 78 research studies published between 2001 and 2024, the author synthesized diverse findings into a comprehensive framework detailing OPI attributes and pillars. The identified OPI attributes encompass themes such as technology intensity, significance, magnitude, and locus of exploitation, while the OPI pillars include mechanisms, partners, achievements, and antecedents. Additionally, the author critically analysed gaps in the literature, proposing future research directions that advocate for a broader methodological approach, increased emphasis on theory development and testing, and more cross-national and cross-sectoral studies to advance understanding in this field.

Keywords: open innovation, process innovation, OPI attributes, systematic literature review, organizational openness

Procedia PDF Downloads 43
22041 [Keynote Speech]: Risk Management during the Rendition Process: Use of Screen-Voice Recordings in Translator Training

Authors: Maggie Hui

Abstract:

Risk management is not a new concept; however, it is an uncharted area as applied to the translation process and translator training. Serving as one of the self-discovery activities in their practicum course, a two-cycle experiment was carried out with a class of 13 MA translation students in an attempt to explore their risk management while translating in a simulated setting involving translator-client relations. To test the effects of the main variable, the translators’ interaction with the simulated clients, the researcher employed a control group of translators and two experiment groups (Group A acting as translators in Cycle 1 and clients in Cycle 2, and Group B as clients in Cycle 1 and translators in Cycle 2). Cycle 1 explores whether there is any behavioral difference in risk management between translators who interact with the simulated clients, i.e. experiment group A, and their counterparts without such interaction, i.e. the control group. Cycle 2 varies the order in which the translator and client roles are played and provides information for comparing the behavior of translators in the two experiment groups. Since this is process-oriented research, it is necessary to hypothesize what was happening in the translators’ minds. The researcher used user-friendly screen-voice recording freeware to record subjects’ screen activities, including every word the translator typed and every change made to the rendition, the websites browsed and the reference tools used, in addition to the verbalization of their thoughts throughout the process. The research observes the translation procedures subjects considered and finally adopted, and looks into the justifications for their procedures, in order to interpret their risk management.
The qualitative and quantitative results of this study have some implications for translator training: (a) the experience of being a client seems to reinforce the translator’s risk aversion; (b) the use of role-playing simulation can empower students’ learning by enhancing their attitudinal or psycho-physiological competence, interpersonal competence and strategic competence; and (c) the screen-voice recordings serve as a helpful tool for learners to reflect on their rendition processes, i.e. what they performed satisfactorily and unsatisfactorily while translating and what they could do for improvement in future translation tasks.

Keywords: risk management, screen-voice recordings, simulated translator-client relations, translation pedagogy, translation process-oriented research

Procedia PDF Downloads 254
22040 D6tions: A Serious Game to Learn Software Engineering Process and Design

Authors: Hector G. Perez-Gonzalez, Miriam Vazquez-Escalante, Sandra E. Nava-Muñoz, Francisco E. Martinez-Perez, Alberto S. Nunez-Varela

Abstract:

The software engineering teaching process has been the subject of many studies. To improve this process, researchers have proposed, on the one hand, merely illustrative techniques in the classroom, such as topic presentations and dynamics between students, and on the other, attempts to involve students in real projects with companies and institutions to expose them to real software development problems. Simulators and serious games have been used as auxiliary tools to introduce students to topics that are too abstract when presented in the traditional way, but most of these tools cover a limited area of the huge software engineering scope. To address this problem, we have developed D6tions, an educational serious game that simulates the software engineering process and is designed to let students experience the different stages a software engineer (playing the role of project leader, developer, or designer) goes through while participating in a software project. We describe previous approaches to this problem, how D6tions was designed, its rules and directions, and the results we obtained from the use of this game with undergraduate students.

Keywords: serious games, software engineering, software engineering education, software engineering teaching process

Procedia PDF Downloads 476
22039 Prioritizing the Most Important Information from Contractors’ BIM Handover for Firefighters’ Responsibilities

Authors: Akram Mahdaviparsa, Tamera McCuen, Vahideh Karimimansoob

Abstract:

The fire service is responsible for protecting life, assets, and natural resources from fire and other hazardous incidents, and search and rescue in unfamiliar buildings is a vital part of firefighters’ responsibilities. Providing firefighters with precise building information in an easy-to-understand format is a potential solution for mitigating the negative consequences of fire hazards: insufficient knowledge of a building’s indoor environment impedes firefighters’ capabilities and leads to lost property. A data-rich building information model (BIM) is a potentially useful source of three-dimensional (3D) visualization and data/information storage for fire emergency response. Therefore, the purpose of this research is to prioritize the information firefighters require, from the most important to the least important. A survey was carried out with firefighters working in the Norman Fire Department to obtain the importance of each building information item. The results show that “the location of exit doors, windows, corridors, elevators, and stairs”, “material of building elements”, and “building data” are the three most important items specified by firefighters. The results also imply that 2D models of the architectural, structural, and wayfinding information are easier to understand than 3D models, while a 3D model of the MEP system conveys more information than a 2D model. Furthermore, color in visualization helps firefighters understand building information more easily and quickly. Sufficient internal consistency across all responses was demonstrated by a Pearson correlation matrix and a Cronbach’s alpha of 0.916; the results of this study are therefore reliable and can be generalized to the population.
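The Cronbach’s alpha used above to establish internal consistency is computed from the item variances and the variance of the total scores. A minimal sketch with hypothetical 5-point ratings (the actual survey responses are not given in the abstract):

```python
import statistics as st

def cronbach_alpha(items):
    """Cronbach's alpha; each inner list holds one item's scores
    across all respondents (columns of the survey matrix)."""
    k = len(items)
    item_var_sum = sum(st.pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - item_var_sum / st.pvariance(totals))

# Hypothetical 3-item importance survey answered by five firefighters
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 3, 4, 1],
]
alpha = cronbach_alpha(items)  # about 0.93 for this toy data,
                               # comparable to the reported 0.916
```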

Keywords: BIM, building fire response, ranking, visualization

Procedia PDF Downloads 120
22038 Transfer of Information Heritage between Algerian Veterinarians and Breeders: Assessment of Information and Communication Technology Using Mobile Phone

Authors: R. Bernaoui, P. Ohly

Abstract:

Our research shows that use of the mobile phone consolidates the relationship between veterinarians, and that between breeders and veterinarians; it also asserts that the tool in question is a means of economic development. The results of our survey reveal a positive return for the veterinary community, showing that the mobile phone has become an effective means of sustainable development through the rapid and timely transfer of an information heritage via social networks, including many Internet applications. Our results show that almost all veterinarians use the mobile phone for interprofessional communication. We therefore believe that the use of the mobile phone by livestock operators has greatly improved working conditions, and that this tool contributes to better farm management, since it limits travel and saves time. These results show that we are witnessing a growth in the use of mobile telephony technologies whose impact extends to sustainable development. By allowing access to information, especially technical information, the mobile phone, and Information and Communication Technology (ICT) in general, gives livestock sector players not only security, by limiting losses, but also an efficiency that allows them better production and productivity.

Keywords: algeria, breeder-veterinarian, digital heritage, networking

Procedia PDF Downloads 104
22037 U-Turn on the Bridge to Freedom: An Interaction Process Analysis of Task and Relational Messages in Totalistic Organization Exit Conversations on Online Discussion Boards

Authors: Nancy Di Tunnariello, Jenna L. Currie-Mueller

Abstract:

Totalistic organizations play a prominent role in the lives of their members by embedding values and practices. The Church of Scientology (CoS) is an example of a religious totalistic organization and has recently garnered attention because of the questionable treatment of members by those with authority, particularly when members try to leave the Church. The purpose of this study was to analyze exit communication and evaluate the task and relational messages discussed on online discussion boards by individuals with a previous or current connection to the totalistic CoS. Using organizational exit phases and interaction process analysis (IPA), researchers coded 30 boards consisting of 14,179 thought units from the Exscn.net website. Findings report that all stages of exit were present, with post-exit surfacing most often. Posts contained more task than relational messages, with individuals mainly providing orientation/information. After a discussion of the study’s contributions, limitations and directions for future research are explained.

Keywords: Bales' IPA, organizational exit, relational messages, scientology, task messages, totalistic organizations

Procedia PDF Downloads 117
22036 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models

Authors: R. Hellmuth

Abstract:

The method of factory planning has changed considerably, especially with regard to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory, and regular restructuring is becoming more important for maintaining the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (volatility, uncertainty, complexity, and ambiguity), lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool; digital building models are also increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: what kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for use cases are analysed. The scope of the investigation covers point cloud models, building information models, photogrammetry models, and versions of these enriched with sensor data; it is examined which digital models allow simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to these application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information; among other purposes, the model is used for indoor navigation, information provision, and display of sensor data.
In summary, the paper presents a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented; thus, the systematic selection of digital factory models with the corresponding application cases is evaluated.

Keywords: building information modeling, digital factory model, factory planning, maintenance

Procedia PDF Downloads 96
22035 On the Optimality Assessment of Nano-Particle Size Spectrometry and Its Association to the Entropy Concept

Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani

Abstract:

Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nano-particles under the influence of the electric field in an electrical mobility spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by flow conditions, geometry, electric field and the particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with non-uniform charge distribution, was introduced to the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain particle trajectories in the device and thereby calculate the signal reported by each electrometer. According to the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer information about the size distribution of the injected particles, we proposed a benchmark for assessing the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, according to Shannon, is the ''average amount of information contained in an event, sample or character extracted from a data stream''.
Evaluating the responses (signals) which were obtained via various configurations of detecting rings, the best configuration which gave the best predictions about the size distributions of injected particles, was the modified configuration. It was also the one that had the maximum amount of entropy. A reasonable consistency was also observed between the accuracy of the predictions and the entropy content of each configuration. In this method, entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced to the simulations and predicted size distributions were compared to the exact size distributions.
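The entropy benchmark described above can be illustrated with a short sketch. Assuming the transfer matrix maps particle size classes (columns) to electrometer channels (rows), one hypothetical reading of the Shannon measure is to normalize each column into a probability distribution and average the column entropies; the function name and toy matrices below are illustrative, not taken from the paper:

```python
import numpy as np

def transfer_matrix_entropy(T):
    """Average Shannon entropy (bits) of the columns of a non-negative
    transfer matrix, after normalizing each column to a probability
    distribution. A hypothetical reading of the benchmark, not the
    paper's exact formulation."""
    T = np.asarray(T, dtype=float)
    P = T / T.sum(axis=0, keepdims=True)   # each column -> distribution
    safe = np.where(P > 0, P, 1.0)         # log2(1) = 0 for empty channels
    return float(-(P * np.log2(safe)).sum(axis=0).mean())

# A configuration that spreads signal across detecting rings carries more
# information about the size distribution than one that lumps each size
# class into a single ring.
spread = [[0.4, 0.1], [0.3, 0.5], [0.3, 0.4]]
lumped = [[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]
```

Under this reading, the configuration with the higher average entropy is the one whose signals discriminate best between size classes, consistent with the consistency between prediction accuracy and entropy content reported above.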

Keywords: aerosol nano-particle, CFD, electrical mobility spectrometer, von Neumann entropy

Procedia PDF Downloads 327
22034 Implementing Internet of Things through Building Information Modelling in Order to Assist with the Maintenance Stage of Commercial Buildings

Authors: Ushir Daya, Zenadene Lazarus, Dimelle Moodley, Ehsan Saghatforoush

Abstract:

A review of the literature found a lack of implementation of the Internet of Things (IoT) incorporated into Building Information Modelling (BIM) in South Africa. The research aims to determine whether integrating IoT into BIM will make BIM more useful during the maintenance stage of buildings and assist facility managers in doing their job. The research examines the existing problematic areas in building information modelling, specifically BIM 7D. This paper considers the capabilities of IoT, the issues IoT will be able to resolve in BIM software, how integrating IoT into BIM will assist facility managers, and whether such an implementation will make a facility manager's job more efficient.

Keywords: internet of things, building information modeling, facilities management, structural health monitoring

Procedia PDF Downloads 190
22033 Colonialism and Modernism in Architecture, the Case of a Blank Page Opportunity in Casablanca

Authors: Nezha Alaoui

Abstract:

The early 1950s French colonial context in Morocco provided an opportunity for architects to question the modernist established order by building dwellings for the local population. The dwellings were originally designed to encourage Muslims to adopt an urban lifestyle based on local customs. However, the inhabitants transformed their dwelling into a hybrid habitation. This paper aims to prove the relevance of the design process in accordance with the local colonial context by analyzing the dwellers' appropriation process and the modification of their habitat.

Keywords: colonial heritage, appropriation process, islamic spatial habit, housing experiment, modernist mass housing

Procedia PDF Downloads 119
22032 A Collaborative Teaching and Learning Model between Academy and Industry for Multidisciplinary Engineering Education

Authors: Moon-Soo Kim

Abstract:

In order to cope with the increasing demand for multidisciplinary learning between academy and industry, a collaborative teaching and learning model and related operational tools enabling application to engineering education are essential. This study proposes a web-based collaborative framework for interactive teaching and learning between academy and industry as an initial step towards the development of a web- and mobile-based integrated system for both engineering students and industrial practitioners. The proposed framework defines several entities, such as learner, solver, and supporter or sponsor of industrial problems, and has a systematic architecture for building an information system with diverse functions that enable effective interaction among the defined entities regardless of time and place. Furthermore, the framework, which includes a knowledge and information self-reinforcing mechanism, focuses on previous problem-solving records as well as their creative reuse by subsequent learners in the process of solving new problems.

Keywords: collaborative teaching and learning model, academy and industry, web-based collaborative framework, self-reinforcing mechanism

Procedia PDF Downloads 309
22031 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program

Authors: Ming Wen, Nasim Nezamoddini

Abstract:

Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework for designing a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering feedback received from experts and program users.
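As an illustration of the ranking step, the TOPSIS closeness coefficient can be sketched in a few lines. The criteria, weights, and scores below are hypothetical placeholders, not the paper's actual evaluation data:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS. Rows of `matrix` are alternatives,
    columns are criteria; `benefit[j]` is True when higher is better.
    Returns closeness coefficients in [0, 1] (higher = closer to ideal)."""
    X = np.asarray(matrix, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)          # vector normalization
    V = norm * np.asarray(weights, dtype=float)   # weighted normalized matrix
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)     # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

# Three hypothetical program designs scored on usability, speed, and
# development cost (cost is a non-benefit criterion).
scores = topsis([[7, 9, 3], [8, 7, 4], [5, 6, 2]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
best = int(np.argmax(scores))
```

In the framework described above, the weights themselves would come from the AHP pairwise-comparison step based on expert and user feedback, rather than being set directly as here.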

Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM

Procedia PDF Downloads 99
22030 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches

Authors: Wuttigrai Ngamsirijit

Abstract:

Talent management in today’s modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human capital-related measures and metrics; limited capability for strategic data modeling; and the time consumed in aggregating numbers to make decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics for strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, gaps in managing talent and the organization, and ways to develop optimized talent strategies.

Keywords: decision making, human capital analytics, talent management, talent value chain

Procedia PDF Downloads 160
22029 Development of a Psychometric Testing Instrument Using Algorithms and Combinatorics to Yield Coupled Parameters and Multiple Geometric Arrays in Large Information Grids

Authors: Laith F. Gulli, Nicole M. Mallory

Abstract:

The undertaking to develop a psychometric instrument is monumental. Understanding the relationship between variables and events is important in the structural and exploratory design of psychometric instruments. Considering this, we describe a method used to group, pair and combine multiple Philosophical Assumption statements that assisted in the development of a 13-item psychometric screening instrument. We abbreviated our Philosophical Assumptions (PA)s and added parameters, which were then condensed and mathematically modeled in a specific process. This model produced clusters of combinatorics, which were utilized in design and development for 1) information retrieval and categorization, 2) item development, and 3) estimation of interactions among variables and likelihood of events. The psychometric screening instrument measured Knowledge, Assessment (education) and Beliefs (KAB) of New Addictions Research (NAR), which we called KABNAR. We obtained an overall internal consistency for the seven Likert belief items, as measured by a Cronbach’s α of .81, in the final study of 40 clinicians, calculated with SPSS 14.0.1 for Windows. We constructed the instrument to begin with demographic items (degree/addictions certifications) for the identification of target populations practicing within Outpatient Substance Abuse Counseling (OSAC) settings. We then devised education items, belief items (seven items) and a modifiable “barrier from learning” item that consisted of six “choose any” choices. We also conceptualized a close relationship between identifying the various degrees and certifications held by Outpatient Substance Abuse Therapists (OSAT) (the demographics domain) and all aspects of their education related to EB-NAR (past and present education and desired future training). We placed a descriptive (PA)1tx in both the demographic and education domains to trace relationships of therapist education within these two domains.
The two perception domains B1/b1 and B2/b2 represented different but interrelated perceptions from the therapist perspective. The belief items measured therapist perceptions concerning EB-NAR and therapist perceptions when using EB-NAR at the beginning of outpatient addictions counseling. The (PA)s were written in simple words and were descriptively accurate and concise. We then devised a list of parameters, appropriately matched them to each PA, and devised descriptive parametric (PA)s in a domain-categorized information grid. Descriptive parametric (PA)s were reduced to simple mathematical symbols. This made it easy to utilize parametric (PA)s in algorithms, combinatorics and clusters to develop larger information grids. Using matching combinatorics, we took the paired demographic and education domains with a subscript of 1 and matched them to the column of each B domain with subscript 1. Our algorithmic matching formed larger information grids with organized clusters in columns and rows. We repeated the process using different demographic, education and belief domains and devised multiple information grids with different parametric clusters and geometric arrays. We found benefit in combining clusters by different geometric arrays, which enabled us to trace parametric variables and concepts. We were able to understand potential differences between dependent and independent variables and trace relationships of maximum likelihoods.
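The internal-consistency statistic reported above (Cronbach’s α of .81 for the seven Likert belief items) can be reproduced on toy data with a short sketch; the formula is the standard one, and the response matrix below is made up for illustration, not the study’s data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) response matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    Standard formula; the data fed to it here are illustrative."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_variances = X.var(axis=0, ddof=1).sum()   # sample variance per item
    total_variance = X.sum(axis=1).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Perfectly consistent toy responses (every item moves together across
# respondents) yield alpha = 1.0.
consistent = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
alpha = cronbach_alpha(consistent)
```

In practice the seven belief-item columns of the KABNAR data would be passed in place of the toy matrix, yielding the α reported by SPSS.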

Keywords: psychometric, parametric, domains, grids, therapists

Procedia PDF Downloads 261
22028 The Hubs of Transformation Dictated by the Innovation Wave: Boston as a Case Study. Exploring How Design is Emerging as an Essential Feature in the Process of Laboratorisation of Cities

Authors: Luana Parisi, Sohrab Donyavi

Abstract:

Cities have become the nodes of global networks, standing at the intersection points of flows of capital, goods, workers, businesses and travellers, making them the spots where innovation, progress and economic development occur. The primary challenge for them is to create the most fertile ecosystems for triggering innovation activities. Design emerges as an essential feature in this process of laboratorisation of cities. This paper aims at exploring the spatial hubs of transformation within the knowledge economy, providing an overview of the current models of innovation spaces before focusing on the innovation district of one of the cities riding the innovation wave, namely Boston, USA. Useful lessons are drawn from the case study of the innovation district in Boston, allowing us to define tools for policymakers in the form of a range of factors that make up a broad strategy able to implement the model successfully. A mixed methodology is implemented, including information from observations, exploratory interviews with key stakeholders, and desk-based data.

Keywords: innovation district, innovation ecosystem, economic development, urban regeneration

Procedia PDF Downloads 97
22027 Importance of Ethics in Cloud Security

Authors: Pallavi Malhotra

Abstract:

This paper examines the importance of ethics in cloud computing. In modern society, cloud computing offers individuals and businesses virtually unlimited space for storing and processing data or information. Much of the data and information stored in the cloud by various users such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions, among others, requires a high level of confidentiality and safeguarding. Cloud computing offers centralized storage and processing of data, and this has immensely contributed to the growth of businesses and improved the sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients’ information and possible manipulation of the data by third parties. This document suggests approaches that various stakeholders should take to address the ethical issues involved in cloud-computing services. Ethical education and training are key for all stakeholders involved in the handling of data and information stored or processed in the cloud.

Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education

Procedia PDF Downloads 311
22026 Effect of Cooking Process on the Antioxidant Activity of Different Variants of Tomato-Based Sofrito

Authors: Ana Beltran Sanahuja, A. Valdés García, Saray Lopez De Pablo Gallego, Maria Soledad Prats Moya

Abstract:

Tomato consumption has greatly increased worldwide in the last few years, mostly due to a growing demand for products like sofrito. In this sense, regular consumption of tomato-based products has been consistently associated with a reduction in the incidence of chronic degenerative diseases. Sofrito is a homemade tomato sauce, typical of the Mediterranean area, which contains tomato, onion, garlic and olive oil as its main ingredients. There are also variations of sofrito in which other spices are added; these contribute not only color, flavor and aroma but also medicinal properties, owing to their antioxidant power. This protective effect has mainly been attributed to the predominant bioactive compounds present in sofrito, such as lycopene and other carotenoids, as well as more than 40 different polyphenols. Regarding the cooking process, it is known that cooking can modify the properties and the availability of nutrients in sofrito; however, there is not enough information on this issue. For this reason, the aim of the present work is to evaluate the effect of cooking on the antioxidant capacity of different variants of tomato-based sofrito combined with other spices, through the analysis of total phenol content (TPC) and through evaluation of the antioxidant capacity using the free radical 2,2-diphenyl-1-picrylhydrazyl (DPPH) method. Based on the results obtained, it can be confirmed that the basic sofrito, composed of tomato, onion, garlic and olive oil, and the sofrito with 1 g of rosemary added are the variants with the highest phenol content, presenting greater antioxidant power than an industrial sofrito and than the other variants with added thyme or higher amounts of garlic.
Moreover, it has been observed that in the elaboration of tomato-based sofrito it is possible to cook for up to 60 minutes, since the cooking process increases the bioavailability of the carotenoids by breaking the cell walls, which weakens the binding forces between the carotenoids and increases the levels of antioxidants present, as confirmed with both the TPC and DPPH methods. It can be concluded that the cooking process of different variants of tomato-based sofrito, including spices, can improve the antioxidant capacity. The synergistic effects of different antioxidants may have a greater protective effect, while also increasing the digestibility of proteins. In addition, antioxidants help deactivate the free radicals implicated in conditions such as atherosclerosis, aging, immune suppression, cancer, and diabetes.
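The DPPH result discussed above rests on a percent-inhibition calculation from absorbance readings. The formula below is the commonly used form (an assumption on our part, since the abstract does not state it), and the absorbance values are purely illustrative:

```python
def dpph_inhibition(a_control, a_sample):
    """Percent DPPH radical scavenging from absorbance readings:
    (A_control - A_sample) / A_control * 100. This is the standard
    formulation; the values passed below are illustrative only."""
    return (a_control - a_sample) / a_control * 100.0

# An extract that drops the DPPH absorbance from 0.80 to 0.32 scavenges
# roughly 60% of the radical; no change in absorbance means 0% activity.
activity = dpph_inhibition(0.80, 0.32)
```

A higher percentage indicates stronger radical-scavenging (antioxidant) capacity, which is how the sofrito variants above would be compared.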

Keywords: antioxidants, cooking process, phenols, sofrito

Procedia PDF Downloads 126
22025 Depyritization of US Coal Using Iron-Oxidizing Bacteria: Batch Stirred Reactor Study

Authors: Ashish Pathak, Dong-Jin Kim, Haragobinda Srichandan, Byoung-Gon Kim

Abstract:

Microbial depyritization of coal using chemoautotrophic bacteria is gaining acceptance as an efficient and eco-friendly technique. The process uses the metabolic activity of chemoautotrophic bacteria to remove sulfur and pyrite from the coal. The aim of the present study was to investigate the potential of Acidithiobacillus ferrooxidans in removing the pyritic sulfur and iron from a US coal with high iron and sulfur content. The experiment was undertaken in an 8 L bench-scale stirred tank reactor with a 1% (w/v) pulp density of coal. The reactor was operated at 35ºC, and aerobic conditions were maintained by sparging air into the reactor. It was found that at the end of the bio-depyritization process, about 90% of the pyrite and 67% of the pyritic sulfur had been removed from the coal. The results indicate that bio-depyritization is an efficient process for treating coal with high pyrite and sulfur content.

Keywords: At. ferrooxidans, batch reactor, coal desulfurization, pyrite

Procedia PDF Downloads 257
22024 A Systematic Review on Challenges in Big Data Environment

Authors: Rimmy Yadav, Anmol Preet Kaur

Abstract:

Big Data has demonstrated vast potential for streamlining operations, supporting decision making, and spotting business trends in different fields, for example, manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its processes, tools, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and data representation. Beyond this, the resulting challenges and the opportunities available in the big data platform are outlined.

Keywords: big data, privacy, data management, network and energy consumption

Procedia PDF Downloads 289
22023 A Time-Varying and Non-Stationary Convolution Spectral Mixture Kernel for Gaussian Process

Authors: Kai Chen, Shuguang Cui, Feng Yin

Abstract:

A Gaussian process (GP) with a spectral mixture (SM) kernel demonstrates flexible non-parametric Bayesian learning ability in modeling unknown functions. In this work, a novel time-varying and non-stationary convolution spectral mixture (TN-CSM) kernel with significantly enhanced interpretability, obtained through process convolution, is introduced. A way of decomposing the SM component into an auto-convolution of a base SM component and parameterizing it to be input dependent is outlined. Performing a convolution between two base SM components then yields a novel non-stationary SM component structure with a much more general expression and interpretation. The TN-CSM retains full compatibility with the stationary SM kernel in terms of kernel form and spectral basis, aspects ignored or confused by previous non-stationary kernels. On synthetic and real-world datasets, experiments show the time-varying characteristics of the hyper-parameters in TN-CSM and compare the learning performance of TN-CSM with popular and representative non-stationary GP kernels.
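For reference, the stationary SM kernel with which the TN-CSM remains compatible is a weighted sum of Gaussian-envelope cosines, k(τ) = Σ_q w_q exp(−2π²τ²σ_q²) cos(2πτμ_q). The sketch below implements only this stationary baseline for 1-D inputs; the input-dependent convolution construction of TN-CSM itself is not reproduced here, and the hyper-parameter values are arbitrary:

```python
import numpy as np

def sm_kernel(x1, x2, weights, means, stds):
    """Stationary spectral mixture kernel for 1-D inputs: each component q
    contributes w_q * exp(-2 pi^2 tau^2 sigma_q^2) * cos(2 pi tau mu_q),
    where tau = x1 - x2. The TN-CSM reduces to this form when its
    parameters are held constant over the input space."""
    tau = np.subtract.outer(np.ravel(x1), np.ravel(x2))
    k = np.zeros_like(tau, dtype=float)
    for w, mu, sigma in zip(weights, means, stds):
        envelope = np.exp(-2.0 * np.pi**2 * tau**2 * sigma**2)
        k += w * envelope * np.cos(2.0 * np.pi * tau * mu)
    return k

# A two-component mixture evaluated on a small grid: the Gram matrix is
# symmetric, positive semi-definite, and its diagonal equals the sum of
# the component weights.
x = np.linspace(0.0, 1.0, 5)
K = sm_kernel(x, x, weights=[1.0, 0.5], means=[0.0, 2.0], stds=[0.3, 0.1])
```

Each (w_q, μ_q, σ_q) triple corresponds to one Gaussian in the spectral density; making these quantities functions of the input location is, in essence, what the non-stationary extension above parameterizes.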

Keywords: Gaussian process, spectral mixture, non-stationary, convolution

Procedia PDF Downloads 179