Search results for: algorithmic personalization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 189

39 Data Analytics in Hospitality Industry

Authors: Tammy Wee, Detlev Remy, Arif Perdana

Abstract:

In recent years, data analytics has become a buzzword in the hospitality industry. The hospitality industry is a data-rich industry that has not yet fully benefited from the insights of data analytics. Effective use of data analytics can change how hotels operate, market, and position themselves competitively. However, at the moment, the data obtained by individual hotels remain under-utilized. This research is a preliminary study of data analytics in the hospitality industry, using an in-depth face-to-face interview with one hotel as the start of a multi-level research project. The main case study of this research, hotel A, is an international chain hotel brand that has been systematically gathering and collecting data on its own customers for the past five years. Data collection begins from the moment a guest books a room until the guest leaves the hotel premises, and includes room reservation, spa booking, and catering. Although hotel A has been gathering data intelligence on its customers for some time, it has yet to utilize the data to its fullest potential, and it is aware of its limitations as well as the potential of data analytics. Currently, the utilization of data analytics in hotel A is limited to the area of customer service improvement, namely enhancing the personalization of service for each individual customer. Hotel A is able to utilize the data to improve and enhance its service, which in turn encourages repeat customers. According to hotel A, 50% of its guests returned to the hotel, and 70% extended nights because of the personalized service. Apart from using data analytics to enhance customer service, hotel A also uses the data in marketing. Hotel A uses data analytics to predict or forecast changes in consumer behavior and demand by tracking its guests' booking preferences, payment preferences, and demand shifts between properties. However, hotel A admitted that the data it has been collecting is not fully utilized, due to two challenges. The first challenge is that the data are not clean: at the moment, the data in one guest profile are meaningful for one department of the hotel but meaningless for another. Cleaning up the data and getting standards right for use by different departments are among the main concerns of hotel A. The second challenge is the lack of an integrated internal system: at the moment, the internal systems used by hotel A do not integrate well with each other, limiting the ability to collect data systematically. Hotel A is considering another system to replace the current one for more comprehensive data collection. As reported in this research, hotel proprietors recognize the potential of data analytics; however, the current challenges of implementing a system to collect data come with a cost. This research has identified the current utilization of data analytics and the challenges faced when implementing data analytics.

Keywords: data analytics, hospitality industry, customer relationship management, hotel marketing

Procedia PDF Downloads 142
38 The Use of Gender-Fair Language in CS National Exams

Authors: Moshe Leiba, Doron Zohar

Abstract:

Computer Science (CS) and programming are still considered a boys' club and a male-dominated profession. This is also the case in high schools and higher education. In Israel, no different from the rest of the world, fewer than 35% of the CS students who take the matriculation exams are female. The Israeli matriculation exams are written in masculine-form language. Gender-fair language (GFL) aims at reducing gender stereotyping and discrimination. Several strategies can be employed to make languages gender-fair and to treat women and men symmetrically (especially in languages with grammatical gender), among them neutralization and use of the plural form. This research aims at exploring computer science teachers' beliefs regarding the use of gender-fair language in exams. An exploratory quantitative research methodology was employed to collect the data. A questionnaire was administered to 353 computer science teachers, 58% female and 42% male. 86% had been teaching for at least 3 years, and 59% of them had at least 7 years of teaching experience. 71% of the teachers teach in high school, and 82% of them prepare students for the matriculation exam in computer science. The questionnaire contained 2 matriculation exam questions from previous years and open-ended questions. Teachers were asked which form they think is best suited: (a) the existing (masculine) form, (b) both full gender forms (e.g., he/she), (c) both short gender forms, (d) the plural form, (e) the neutral form, and (f) the female form. 84% of the teachers recognized the need to change the existing masculine form in the matriculation exams. About 50% of them thought that the plural form was the best-suited option. When comparing the teachers who are pro-change with those who are against it, no differences in gender or teaching experience were found. The teachers who are pro gender-fair language justified it as making the exams more personal and motivating for female students. Those who thought that the masculine form should remain argued that the female students do not complain and that a change in form will not influence or affect female students' choice to study computer science. Some even argued that the change will not affect the students but can only improve their sense of identity or feeling toward the profession (which seems like a misconception). This research suggests that the teachers are pro-change and believe that reformulating the matriculation exams is the right step towards encouraging more female students to choose computer science as their major study track and towards bridging the gap for gender equality. This may indicate a bottom-up approach, as not long after this research was conducted, the Israeli Ministry of Education decided to change the matriculation exams to gender-fair language using the plural form. In the coming years, with the transition to web-based examination, it is suggested to use personalization and adjust the language form in accordance with the student's gender.

Keywords: computer science, gender-fair language, teachers, national exams

Procedia PDF Downloads 85
37 The Role of Twitter Bots in Political Discussion on 2019 European Elections

Authors: Thomai Voulgari, Vasilis Vasilopoulos, Antonis Skamnakis

Abstract:

The aim of this study is to investigate the effect on the European election campaigns (May 23-26, 2019) achieved on Twitter with artificial intelligence tools such as troll factories and automated inauthentic accounts. Our research focuses on the last European Parliamentary elections, which took place between 23 and 26 May 2019, specifically in Italy, Greece, Germany, and France. It is difficult to estimate how many Twitter users are actually bots (Echeverría, 2017), and the detection of fake accounts is becoming even more complicated as AI bots grow more advanced. A political bot can be programmed to post comments on a Twitter account for a political candidate, target journalists with manipulated content, or engage with politicians and artificially increase their impact and popularity. We analyze variables related to (1) the scope of activity of automated bot accounts, (2) the degree of coherence, and (3) the degree of interaction, taking into account different factors such as the type of content of Twitter messages and their intentions, as well as their spread to the general public. For this purpose, we collected large volumes of tweets from the accounts of party leaders and MEP candidates between the 10th of May and the 26th of July, performing content analysis of tweets based on hashtags while using an innovative network analysis tool known as MediaWatch.io (https://mediawatch.io/). According to our findings, one of the highest percentages (64.6%) of automated “bot” accounts during the 2019 European election campaigns was found in Greece. In general terms, political bots aim at the proliferation of misinformation on social media, and targeting voters is one way this contributes to social media manipulation. We found that political parties and individual politicians create and promote purposeful content on Twitter using algorithmic tools. Based on this analysis, online political advertising plays an important role in the process of spreading misinformation during election campaigns. Overall, inauthentic accounts and social media algorithms are being used to manipulate political behavior and public opinion.

Keywords: artificial intelligence tools, human-bot interactions, political manipulation, social networking, troll factories

Procedia PDF Downloads 114
36 A Statistical-Algorithmic Approach for the Design and Evaluation of a Fresnel Solar Concentrator-Receiver System

Authors: Hassan Qandil

Abstract:

Using a statistical algorithm implemented in MATLAB, four types of non-imaging Fresnel lens are designed: spot-flat, linear-flat, dome-shaped, and semi-cylindrical. The optimization employs a statistical ray-tracing methodology for the incident light, mainly considering the effects of chromatic aberration, varying focal lengths, solar inclination and azimuth angles, lens and receiver apertures, and the optimum number of prism grooves. While adopting an equal-groove-width assumption for the poly-methyl-methacrylate (PMMA) prisms, the main target is to maximize the ray intensity on the receiver's aperture and therefore achieve higher values of heat flux. The algorithm outputs prism angles and 2D sketches. 3D drawings are then generated via AutoCAD and linked to the COMSOL Multiphysics software to simulate the lenses under solar ray conditions, which provides optical and thermal analysis at both the lens's and the receiver's apertures while setting conditions as per Dallas, TX weather data. Once the lens characterization is finalized, receivers are designed based on the optimized aperture size of each lens. Several cavity shapes, including triangular, arc-shaped, and trapezoidal, are tested while coupled with a variety of receiver materials, working fluids, heat transfer mechanisms, and enclosure designs. A vacuum-reflective enclosure is also simulated for enhanced thermal absorption efficiency. Each receiver type is simulated via COMSOL while coupled with the optimized lens. A lab-scale prototype of the optimum lens-receiver configuration is then fabricated for experimental evaluation. Application-based testing is also performed for the selected configuration, including a photovoltaic-thermal cogeneration system and a solar furnace system. Finally, some future research directions are pointed out, including the coupling of the collector-receiver system with an end-user power generator and the use of a multi-layered genetic algorithm for comparative studies.
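To make the statistical ray-tracing idea concrete, the following is a minimal Monte Carlo sketch in Python of the quantity being maximized: the fraction of incident rays landing on the receiver aperture. All parameters (lens radius, focal length, receiver radius, angular-error spread) are illustrative placeholders, not the paper's optimized values, and the small-angle landing model is a simplification of the full prism-by-prism refraction analysis.

```python
import math
import random

# Schematic Monte Carlo ray-intensity estimate for a spot-flat Fresnel lens
# focusing onto a circular receiver; all values are illustrative placeholders.
LENS_RADIUS = 0.5        # m, lens aperture radius
FOCAL_LENGTH = 1.0       # m, lens-to-receiver distance
RECEIVER_RADIUS = 0.05   # m, receiver aperture radius
SIGMA = 0.01             # rad, angular error proxy (chromatic aberration, etc.)
N_RAYS = 100_000

hits = 0
for _ in range(N_RAYS):
    r = LENS_RADIUS * math.sqrt(random.random())   # uniform point on the disk
    path = math.hypot(r, FOCAL_LENGTH)             # groove-to-focus distance
    miss = abs(path * random.gauss(0.0, SIGMA))    # small-angle landing error
    if miss <= RECEIVER_RADIUS:
        hits += 1

print(f"fraction of rays on receiver: {hits / N_RAYS:.3f}")
```

In the actual optimization, an estimate of this kind would be re-evaluated while sweeping prism angles, groove counts, and solar inclination and azimuth angles.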

Keywords: COMSOL, concentrator, energy, Fresnel, optics, renewable, solar

Procedia PDF Downloads 124
35 Evolution of Web Development Progress in Modern Information Technology

Authors: Abdul Basit Kiani

Abstract:

Web development, the art of creating and maintaining websites, has witnessed remarkable advancements. The aim is to provide an overview of some of the cutting-edge developments in the field. Firstly, the rise of responsive web design has revolutionized user experiences across devices. With the increasing prevalence of smartphones and tablets, web developers have adapted to ensure seamless browsing experiences regardless of screen size. This progress has greatly enhanced accessibility and usability, catering to the diverse needs of users worldwide. Additionally, the evolution of web frameworks and libraries has significantly streamlined the development process. Tools such as React, Angular, and Vue.js have empowered developers to build dynamic and interactive web applications with ease. These frameworks not only enhance efficiency but also bolster scalability, allowing for the creation of complex and feature-rich web solutions. Furthermore, the emergence of progressive web applications (PWAs) has bridged the gap between native mobile apps and web development. PWAs leverage modern web technologies to deliver app-like experiences, including offline functionality, push notifications, and seamless installation. This innovation has transformed the way users interact with websites, blurring the boundaries between traditional web and mobile applications. Moreover, the integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. With the increasing incidence of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as the HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. In sum, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.

Keywords: progressive web applications (PWAs), web security, machine learning (ML), web frameworks, advancement, responsive web design

Procedia PDF Downloads 24
34 Adapting Liability in the Era of Automated Decision-Making: A South African Labour Law Perspective

Authors: Aisha Adam

Abstract:

This study critically examines the transformative impact of automated decision-making (ADM) and artificial intelligence (AI) systems on South African labour law. As AI technologies increasingly infiltrate workplaces, existing liability frameworks face challenges in addressing the unique complexities presented by these innovations. This article explores the necessity of redefining liability to accommodate the nuanced landscape of ADM and AI within South African labour law. It emphasises the importance of ensuring responsible deployment and safeguarding the rights of workers amid evolving technological dynamics. This research investigates the central concern of fairness, bias, and discrimination in ADM and AI decision-making. Focusing on algorithmic bias and discriminatory outcomes, the paper advocates for the integration of mechanisms within the South African legal framework, particularly under the Promotion of Equality and Prevention of Unfair Discrimination Act (PEPUDA) and the Employment Equity Act (EEA). The study scrutinises the shifting dynamics of the employment relationship, calling for clear guidelines on the responsibilities and liabilities of employers, employees, and technology providers. Furthermore, the article analyses legal and policy responses to ADM and AI within South African labour law, exploring potential amendments to legislation, guidelines, and codes of practice. It assesses the role of regulatory bodies, specifically the Commission for Conciliation, Mediation, and Arbitration (CCMA), in overseeing and enforcing responsible practices in the workplace. Lastly, the research evaluates the impact of ADM and AI on human and social rights in the South African context. Emphasising the protection of constitutional rights, including fair labour practices, privacy, and equality, the study proposes remedies and safeguards. It advocates for a multidisciplinary approach involving legal, technological, and ethical considerations to redefine liability in South African labour law effectively. The article contends that a shift from accountability to responsibility is crucial for promoting fairness, antidiscrimination, and the protection of human and social rights in the age of automated decision-making. It calls for collaborative efforts among stakeholders to shape responsible practices and redefine liability in this evolving technological landscape.

Keywords: automated decision-making, artificial intelligence, labour law, vicarious liability

Procedia PDF Downloads 46
33 The Internet of Things in Luxury Hotels: Generating Customized Multisensory Guest Experiences

Authors: Jean-Eric Pelet, Erhard Lick, Basma Taieb

Abstract:

Purpose: This research bridges the gap between sensory marketing and the use of the Internet of Things (IoT) in luxury hotels. We investigated how stimulating guests' senses through IoT devices influenced their emotions, affective experiences, eudaimonism (well-being), and, ultimately, guest behavior. We also examined potential moderating effects of gender. Design/methodology/approach: We adopted a mixed-method approach, combining qualitative research (semi-structured interviews) to explore hotel managers' perspectives on the potential use of IoT in luxury hotels with quantitative research (a survey of hotel guests; n=357). Findings: The results showed that while the senses of smell, hearing, and sight had an impact on guests' emotions, the senses of touch, hearing, and sight impacted guests' affective experiences. The senses of smell and taste influenced guests' eudaimonism. The sense of smell had a greater effect on eudaimonism and behavioral intentions among women than among men. Originality: IoT can be applied to create customized multi-sensory hotel experiences. For example, hotels may offer unique and diverse ambiences in their rooms and suites to improve guest experiences. Research limitations/implications: This study concentrated on luxury hotels located in Europe. Further research may explore the generalizability of the findings (e.g., in other cultures, or comparing high-end and low-end hotels). Practical implications: Context awareness and hyper-personalization, through intensive and continuous data collection (hyper-connectivity) and real-time processing, are key trends in the service industry. Therefore, big data plays a crucial role in the collection of information, since it allows hoteliers to retrieve, analyze, and visualize data to provide personalized services in real time. Together with their guests, hotels may co-create customized sensory experiences. For instance, if the hotel knows the guest's music preferences (e.g., from social media) as well as their age and gender, and considers the temperature and the size of the guest room (standard, suite, etc.), this may determine the playlist of the concierge tablet made available in the guest room. Furthermore, one may record the guest's voice for voice-command purposes once the guest arrives at the hotel. Based on our finding that the sense of smell has a greater impact on eudaimonism and behavioral intentions among women than among men, hotels may deploy subtler scents with lower intensities, or even different scents, for female guests in comparison to male guests.
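As a toy illustration of the playlist example above, a rule-based sketch in Python follows; the guest attributes, genres, and thresholds are invented for illustration and do not come from the study.

```python
from dataclasses import dataclass, field

@dataclass
class GuestProfile:
    age: int
    liked_genres: list = field(default_factory=list)  # e.g., mined from social media, with consent
    room_type: str = "standard"                       # "standard" or "suite"
    room_temperature_c: float = 21.0                  # from an in-room IoT sensor

def concierge_playlist(guest: GuestProfile) -> str:
    base = guest.liked_genres[0] if guest.liked_genres else (
        "classical" if guest.age >= 50 else "pop")    # fallback rule by age
    mood = "lounge" if guest.room_type == "suite" else "upbeat"
    tempo = "slow" if guest.room_temperature_c >= 24 else "medium"
    return f"{base}-{mood}-{tempo}"

print(concierge_playlist(GuestProfile(34, ["jazz"], "suite", 25.0)))  # jazz-lounge-slow
```

A production system would replace these hand-written rules with a learned recommendation model fed by the hyper-connected data sources the abstract describes.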

Keywords: affective experience, emotional value, eudaimonism, hospitality industry, Internet of Things, sensory marketing

Procedia PDF Downloads 37
32 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

At present, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge: delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists for the first time to gain a profound understanding of the deepest biological functions. Solving biological problems may require High-Performance Computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates the indispensability of HPC in meeting the scientific and engineering challenges of the twenty-first century, and how problems such as protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. The article also indicates solutions to optimization problems, the benefits of Big Data for computational biology, and the current state of the art and future generations of HPC computing with Big Data in biology.
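The all-to-all comparisons mentioned above are a good example of a polynomial-time algorithm that still demands HPC: the number of pairs grows quadratically with the dataset. Below is a minimal Python sketch of parallelizing such a comparison across CPU cores; the toy sequences and the Hamming-style similarity are placeholders for real genomic reads and alignment scores.

```python
from itertools import combinations
from multiprocessing import Pool

# Toy equal-length sequences; real datasets hold millions of far longer reads.
SEQUENCES = ["ACGTACGT", "ACGTTCGT", "TTGCACGA", "ACGAACGT"]

def similarity(pair):
    """Fraction of matching positions between two equal-length sequences."""
    a, b = pair
    return a, b, sum(x == y for x, y in zip(a, b)) / len(a)

if __name__ == "__main__":
    pairs = list(combinations(SEQUENCES, 2))  # O(n^2) pairs: the bottleneck
    with Pool() as pool:                      # spread the pairs across cores
        for a, b, score in pool.map(similarity, pairs):
            print(f"{a} vs {b}: {score:.2f}")
```

With n sequences there are n(n-1)/2 pairs, so doubling the data roughly quadruples the work, which is why clusters rather than single workstations are needed at genomic scale.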

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 337
31 The Affordances and Challenges of Online Learning and Teaching for Secondary School Students

Authors: Hahido Samaras

Abstract:

In many cases, especially with the pandemic playing a major role in fast-tracking the growth of the digital industry, online learning has become a necessity or even a standard educational model, reliably overcoming barriers such as location, time, and cost, and frequently being combined with a face-to-face format (e.g., in blended learning). This being the case, it is evident that students in many parts of the world, as well as their parents, will increasingly need to become aware of the pros and cons of online versus traditional courses. This fast-growing mode of learning, accelerated during the years of the pandemic, presents an abundance of exciting options, especially suited to the many secondary school students in remote parts of the world where access to stimulating educational settings and opportunities for a variety of learning alternatives are scarce, adding advantages such as flexibility, affordability, engagement, flow, and personalization of the learning experience. However, online learning can also present several challenges, such as a lack of student motivation, reduced social interaction in natural settings, demands on digital literacy, and technical issues, to name a few. Therefore, educational researchers will need to conduct further studies focusing on the benefits and weaknesses of online versus traditional learning, while instructional designers propose ways of enhancing student motivation and engagement in virtual environments. Similarly, teachers will be required to become more and more technology-capable, at the same time developing their knowledge of their students' particular characteristics and needs so as to match them with the affordances the technology offers. And, of course, schools, education programs, and policymakers will have to invest in powerful tools and advanced courses for online instruction. By developing digital courses that incorporate intentional opportunities for community-building and interaction in the learning environment, and by taking care to include built-in design principles and strategies that align learning outcomes with learning assignments, activities, and assessment practices, rewarding academic experiences can result for all students. This paper raises various issues regarding the effectiveness of online learning for students by reviewing a large number of research studies related to the usefulness and impact of online learning following the COVID-19-induced digital education shift. It also discusses what students, teachers, decision-makers, and parents have reported about this mode of learning to date. Best practices are proposed for parties involved in the development of online learning materials, particularly for secondary school students, as educators and developers need to be increasingly concerned about the impact of virtual learning environments on student learning and wellbeing.

Keywords: blended learning, online learning, secondary schools, virtual environments

Procedia PDF Downloads 63
30 Goal-Setting in a Peer Leader HIV Prevention Intervention to Improve Preexposure Prophylaxis Access among Black Men Who Have Sex with Men

Authors: Tim J. Walsh, Lindsay E. Young, John A. Schneider

Abstract:

Background: The disproportionate rate of HIV infection among Black men who have sex with men (BMSM) in the United States suggests the importance of Preexposure Prophylaxis (PrEP) interventions for this population. As such, there is an urgent need for innovative outreach strategies that extend beyond the traditional patient-provider relationship to reach at-risk populations. Training members of the BMSM community as peer change agents (PCAs) is one such strategy. An important piece of this training is goal-setting. Goal-setting not only encourages PCAs to define the parameters of the intervention according to their lived experience but also helps them plan courses of action. Therefore, the aims of this mixed-methods study are to: (1) characterize the goals that BMSM set at the end of their PrEP training, and (2) assess the relationship between goal types and PCA engagement. Methods: Between March 2016 and July 2016, preliminary data were collected from 68 BMSM, ages 18-33, in Chicago as part of an ongoing PrEP intervention. Once enrolled, PCAs participate in a half-day training in which they learn about PrEP, practice initiating conversations about PrEP, and identify strategies for supporting at-risk peers through the PrEP adoption process. Training culminates with a goal-setting exercise, whereby participants establish a goal related to their role as a PCA. Goals were coded for features that either emerged from the data itself or existed in the extant goal-setting literature. The main outcomes were (1) the number of PrEP conversations PCAs self-report during booster conversations two weeks following the intervention and (2) the number of peers PCAs recruit into the study who completed the PrEP workshop. Results: PCA goals (N=68) were characterized in terms of four features: specificity, target population, personalization, and defined purpose. To date, PCAs report a collective 52 PrEP conversations; 56%, 25%, and 6% of PrEP conversations occurred with friends, family, and sexual partners, respectively. PCAs with specific goals had more PrEP conversations with at-risk peers compared to those with vague goals (58% vs. 42%); PCAs with personalized goals had more PrEP conversations compared to those with de-personalized goals (60% vs. 53%); and PCAs with goals that defined a purpose had more PrEP conversations compared to those who did not define a purpose (75% vs. 52%). 100% of PCAs with goals that defined a purpose recruited peers into the study, compared to 45% of PCAs with goals that did not define a purpose. Conclusion: Our preliminary analysis demonstrates that BMSM are motivated to set and work toward a diverse set of goals to support peers in PrEP adoption. PCAs with goals involving a clearly defined purpose had more PrEP conversations and greater peer recruitment than those with goals lacking a defined purpose. This may indicate that PCAs who define their purpose at the outset of their participation will be more engaged in the study than those who do not. Goal-setting may be considered as a component of future HIV prevention interventions, both to advance intervention goals and as an indicator of PCAs' understanding of the intervention.

Keywords: HIV prevention, MSM, peer change agent, preexposure prophylaxis

Procedia PDF Downloads 176
29 Data Clustering Algorithm Based on Multi-Objective Periodic Bacterial Foraging Optimization with Two Learning Archives

Authors: Chen Guo, Heng Tang, Ben Niu

Abstract:

Clustering splits objects into different groups based on similarity, making the objects have higher similarity within the same group and lower similarity across different groups. Thus, clustering can be treated as an optimization problem that maximizes intra-cluster similarity or inter-cluster dissimilarity. In real-world applications, datasets often have complex characteristics: sparsity, overlap, high dimensionality, etc. When facing such datasets, simultaneously optimizing two or more objectives can obtain better clustering results than optimizing a single objective. However, apart from objective-weighting methods, traditional clustering approaches have difficulty solving multi-objective data clustering problems. For this reason, evolutionary multi-objective optimization algorithms have been investigated by researchers to optimize multiple clustering objectives. In this paper, a Data Clustering algorithm based on Multi-objective Periodic Bacterial Foraging Optimization with two Learning Archives (DC-MPBFOLA) is proposed. Specifically, first, to reduce the high computational complexity of the original BFO, periodic BFO is employed as the basic algorithmic framework and then extended to a multi-objective form. Second, two learning strategies are proposed based on the two learning archives to guide the bacterial swarm to move in a better direction: on the one hand, the global best is selected from the global learning archive according to a convergence index and a diversity index; on the other hand, the personal best is selected from the personal learning archive according to the sum of weighted objectives. Based on these learning strategies, a chemotaxis operation is designed. Third, an elite learning strategy is designed to provide fresh impetus to the objects in the two learning archives: when the objects in these two archives do not change for two consecutive iterations, randomly re-initializing one dimension of the objects can prevent the proposed algorithm from falling into local optima. Fourth, to validate the performance of the proposed algorithm, DC-MPBFOLA is compared with four state-of-the-art evolutionary multi-objective optimization algorithms and one classical clustering algorithm on the evaluation indexes of several datasets. To further verify the effectiveness and feasibility of the designed strategies in DC-MPBFOLA, variants of DC-MPBFOLA are also proposed. Experimental results demonstrate that DC-MPBFOLA outperforms its competitors on all evaluation indexes and clustering partitions. These results also indicate that the designed strategies positively influence the performance of the original BFO.
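The following is a schematic Python sketch of the archive-guided chemotaxis idea, not the authors' implementation: the two objective functions, the convergence proxy used to fill the global archive, and all step sizes are placeholders.

```python
import random

def objectives(x):
    """Two stand-in clustering objectives to be minimized jointly."""
    return (sum(v ** 2 for v in x),              # stand-in intra-cluster term
            sum((v - 1.0) ** 2 for v in x))      # stand-in inter-cluster term

def weighted_sum(x, w=(0.5, 0.5)):
    f1, f2 = objectives(x)
    return w[0] * f1 + w[1] * f2

def chemotaxis_step(bacterium, global_best, personal_best, step=0.1):
    """Move toward a blend of the two archive-selected guides (schematic)."""
    return [b + step * ((g - b) + (p - b)) + random.uniform(-0.01, 0.01)
            for b, g, p in zip(bacterium, global_best, personal_best)]

random.seed(1)
swarm = [[random.uniform(-1.0, 2.0) for _ in range(3)] for _ in range(10)]
global_archive = sorted(swarm, key=weighted_sum)[:3]   # convergence proxy
personal_archive = [b[:] for b in swarm]               # one best per bacterium

for i, b in enumerate(swarm):
    g = random.choice(global_archive)   # pick from the global learning archive
    p = personal_archive[i]             # pick from the personal learning archive
    swarm[i] = chemotaxis_step(b, g, p)
```

In the full DC-MPBFOLA, the global pick would balance the convergence and diversity indexes, the personal pick would use the weighted objective sum, and stagnating archive members would be partially re-initialized by the elite learning strategy.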

Keywords: data clustering, multi-objective optimization, bacterial foraging optimization, learning archives

Procedia PDF Downloads 112
28 Synthetic Classicism: A Machine Learning Approach to the Recognition and Design of Circular Pavilions

Authors: Federico Garrido, Mostafa El Hayani, Ahmed Shams

Abstract:

The exploration of the potential of artificial intelligence (AI) in architecture is still embryonic; however, its latent capacity to change design disciplines is significant. 'Synthetic Classicism' is a research project that questions the underlying aspects of classically organized architecture not just in aesthetic terms but also from a geometrical and morphological point of view, intending to generate new architectural information using historical examples as source material. The main aim of this paper is to explore the uses of artificial intelligence and machine learning algorithms in architectural design while creating a coherent narrative to be contained within a design process. The purpose is twofold: on the one hand, to develop and train machine learning algorithms to produce architectural information about small pavilions, and on the other, to synthesize new information from previous architectural drawings. These algorithms are intended to 'interpret' the graphical information of each pavilion and then generate new information from it. Once the algorithms are trained, the procedure is the following: starting from a line profile, a synthetic 'front view' of a pavilion is generated; using it as source material, an isometric view is created from it; and finally, a top view is produced. Thanks to GAN algorithms, it is also possible to generate front and isometric views without any graphical input at all. The final intention of the research is to produce isometric views out of historical information, such as the pavilions of Sebastiano Serlio, James Gibbs, or John Soane. The idea is to create and interpret new information not just in terms of historical reconstruction but also to explore AI as a novel tool in the narrative of a creative design process. This research also challenges the association of algorithmic design with efficiency or fitness, embracing instead the possibility of a creative collaboration between artificial intelligence and a human designer. Hence the double feature of this research, both analytical and creative: first synthesizing images based on a given dataset and then generating new architectural information from historical references. We find that the possibility of creatively understanding and manipulating historical (and synthetic) information will be a key feature in future innovative design processes. Finally, the main question that we propose is whether an AI could be used not just to create an original and innovative group of simple buildings but also to explore the possibility of fostering a novel architectural sensibility grounded in the specificities of the architectural dataset, whether historical, human-made, or synthetic.
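A minimal sketch of the chained view-to-view generation pipeline, assuming pix2pix-style image-to-image generators; the networks below are tiny untrained stand-ins written in PyTorch, so the point is the data flow, not the image quality.

```python
import torch
import torch.nn as nn

def make_generator():
    """Tiny stand-in for a trained image-to-image generator (e.g., pix2pix)."""
    return nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
        nn.Conv2d(8, 1, 3, padding=1), nn.Tanh(),
    )

# One generator per translation stage described in the abstract.
profile_to_front = make_generator()
front_to_isometric = make_generator()
isometric_to_top = make_generator()

line_profile = torch.randn(1, 1, 64, 64)     # placeholder input line drawing
front = profile_to_front(line_profile)       # synthetic front view
isometric = front_to_isometric(front)        # isometric view from front view
top = isometric_to_top(isometric)            # top view from isometric view
print(front.shape, isometric.shape, top.shape)
```

Trained on paired drawings of historical pavilions (Serlio, Gibbs, Soane), each stage would learn its own translation, and a GAN variant could also sample front or isometric views with no graphical input at all, as the abstract notes.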

Keywords: architecture, central pavilions, classicism, machine learning

Procedia PDF Downloads 115
27 Pareto Optimal Material Allocation Mechanism

Authors: Peter Egri, Tamas Kis

Abstract:

Scheduling problems have been studied in algorithmic mechanism design research from the beginning. This paper focuses on a practically important but theoretically rather neglected field: the project scheduling problem in which jobs connected by precedence constraints compete for various nonrenewable resources, such as materials. Although the centralized problem can be solved in polynomial time by applying the algorithm of Carlier and Rinnooy Kan from the Eighties, obtaining materials in a decentralized environment is usually far from optimal. It can be observed in practical production scheduling situations that project managers tend to cache the required materials as soon as possible in order to avoid later delays due to material shortages. This greedy practice usually leads both to excess stocks for some projects and materials and, simultaneously, to shortages for others. The aim of this study is to develop a model for the material allocation problem of a production plant, where a central decision maker, the inventory, should assign the resources arriving at different points in time to the jobs. Since the actual due dates are not known to the inventory, the mechanism design approach is applied, with the projects as the self-interested agents. The goal of the mechanism is to elicit the required information and allocate the available materials such that the maximal tardiness among the projects is minimized. It is assumed that, except for the due dates, the inventory is familiar with every other parameter of the problem. A further requirement is that, due to practical considerations, monetary transfer is not allowed. Therefore, a mechanism without money is sought, which excludes some widely applied solutions such as the Vickrey–Clarke–Groves scheme. In this work, a type of Serial Dictatorship Mechanism (SDM) is presented for the studied problem, including a polynomial-time algorithm for computing the material allocation. The resulting mechanism is both truthful and Pareto optimal; thus, randomization over the possible priority orderings of the projects yields a universally truthful and Pareto optimal randomized mechanism. However, it is shown that, in contrast to problems like the many-to-many matching market, not every Pareto optimal solution can be generated with an SDM. In addition, no performance guarantee can be given compared to the optimal solution, so this approximation characteristic is investigated with an experimental study. All in all, the current work studies a practically relevant scheduling problem and presents a novel truthful material allocation mechanism that eliminates the potential benefit of the greedy behavior that negatively influences the outcome. The resulting allocation is also shown to be Pareto optimal, the most widely used criterion describing a necessary condition for a reasonable solution.
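A minimal sketch of the Serial Dictatorship idea on a toy instance (the arrival times, demands, and due dates below are invented for illustration): a priority order over projects is drawn at random, and each project in turn takes the earliest remaining material arrivals, which minimizes its own completion time and hence its own tardiness.

```python
import random

arrivals = [1, 2, 3, 5]                      # times at which material units arrive
projects = {"P1": {"need": 2, "due": 3},
            "P2": {"need": 1, "due": 1},
            "P3": {"need": 1, "due": 4}}

def serial_dictatorship(projects, arrivals, rng):
    order = list(projects)
    rng.shuffle(order)                       # random priority ordering
    free = sorted(arrivals)
    allocation, tardiness = {}, {}
    for name in order:                       # each dictator picks greedily
        need = projects[name]["need"]
        take, free = free[:need], free[need:]
        allocation[name] = take
        tardiness[name] = max(0, max(take) - projects[name]["due"])
    return allocation, max(tardiness.values())

alloc, max_tard = serial_dictatorship(projects, arrivals, random.Random(42))
print(alloc, "max tardiness:", max_tard)
```

In this toy version the reported due dates do not affect what a dictator receives, so misreporting cannot help; the paper's mechanism establishes truthfulness in the richer precedence-constrained setting, and randomizing over priority orders gives the universally truthful randomized mechanism discussed above.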

Keywords: material allocation, mechanism without money, polynomial-time mechanism, project scheduling

Procedia PDF Downloads 300
26 Customer Focus in Digital Economy: Case of Russian Companies

Authors: Maria Evnevich

Abstract:

In modern conditions, price competition is becoming less effective in most markets. On the one hand, there is a gradual decrease in the level of marginality in the main traditional sectors of the economy, so further price reduction becomes too 'expensive' for the company. On the other hand, the effect of price reduction is leveled off, and the reason for this phenomenon is likely to be informational. As a result, even if the company reduces prices, making its products more accessible to the buyer, there is a high probability that this will not lead to an increase in sales unless additional large-scale advertising and information campaigns are conducted. Indeed, a large-scale information and advertising campaign has a much greater effect by itself than price reductions do. At the same time, the cost of mass informing is growing every year, especially when the main information channels are used. The article presents a generalization, systematization, and development of theoretical approaches and best practices in the field of the customer-focused approach to business management and in the field of relationship marketing in the modern digital economy. The research methodology is based on the synthesis and content analysis of sociological and marketing research and on the study of the systems for handling consumer appeals and the loyalty programs of the 50 largest client-oriented companies in Russia. In addition, the analysis of internal documentation on customers' purchases in one of the largest retail companies in Russia allowed us to identify whether buyers prefer to make complex purchases in the one retail store with the best price image for them. The cost of attracting a new client is now quite high and continues to grow, so it becomes more important to retain clients and increase their involvement through marketing tools. A huge role is played by modern digital technologies, used both in advertising (e-mailing, SEO, contextual advertising, banner advertising, SMM, etc.) and in service. To implement the client-oriented omnichannel service described above, it is necessary to identify the client and work with the personal data provided when filling in the loyalty program application form. The analysis of the loyalty programs of 50 companies identified the following types of cards: discount cards, bonus cards, mixed cards, coalition loyalty cards, bank loyalty programs, aviation loyalty programs, hybrid loyalty cards, and situational loyalty cards. The use of loyalty cards makes it possible not only to stimulate untargeted purchases but also to provide individualized offers and to produce more targeted information. The development of digital technologies and modern means of communication has significantly changed not only the sphere of marketing and promotion but also the economic landscape as a whole. The factors of competitiveness are the digital capabilities of companies in the field of customer orientation: personalization of service, customization of advertising offers, optimization of marketing activity, and improvement of logistics.

Keywords: customer focus, digital economy, loyalty program, relationship marketing

Procedia PDF Downloads 139
23 Governance in the Age of Artificial Intelligence and E-Government

Authors: Mernoosh Abouzari, Shahrokh Sahraei

Abstract:

Electronic government is a way for governments to use new technology that provides people with convenient access to government information and services, improves the quality of services, and offers broad opportunities to participate in democratic processes and institutions. It makes it possible to use information technology easily in order to deliver government services to the customer around the clock, which increases people's satisfaction and participation in political and economic activities. The expansion of e-government services and their movement towards intelligentization can re-establish the relationship between the government and citizens and among the elements and components of the government. Electronic government is the result of the use of information and communication technology (ICT); implementing it at the government level creates tremendous changes in the efficiency and effectiveness of government systems and in the way services are provided, which brings widespread public satisfaction. Today, the main level of electronic government services has become tangible with the presence of artificial intelligence systems, and recent advances in artificial intelligence represent a revolution in the use of machines to support predictive decision-making and the classification of data. With the use of deep learning tools, artificial intelligence can mean a significant improvement in the delivery of services to citizens and can uplift the work of public service professionals while also inspiring a new generation of technocrats to enter government. This smart revolution may set aside some functions of the government and change its components, and concepts such as governance, policymaking, and democracy will change in the face of artificial intelligence technology; the top-down position in governance may face serious changes. If governments delay in using artificial intelligence, the balance of power will change and private companies, as pioneers in this field, will monopolize everything; the world order will then depend on rich multinational companies, and algorithmic systems will in effect become the ruling systems of the world. It can be said that, currently, the revolution in information technology and biotechnology has been started by engineers, large economic companies, and scientists who are rarely aware of the political complexities of their decisions and certainly do not represent anyone. Therefore, it seems that if liberalism, nationalism, or any other religion wants to organize the world of 2050, it should not only rationalize the concepts of artificial intelligence and complex data algorithms but also weave them into a new and meaningful narrative. The changes caused by artificial intelligence in the political and economic order will therefore lead to a major change in the way all countries deal with the phenomenon of digital globalization. In this paper, while debating the role and performance of e-government, we discuss the efficiency and application of artificial intelligence in e-government and consider the resulting developments in the new world and in the concepts of governance.

Keywords: electronic government, artificial intelligence, information and communication technology, system

Procedia PDF Downloads 51
24 Description of Decision Inconsistency in Intertemporal Choices and Representation of Impatience as a Reflection of Irrationality: Consequences in the Field of Personalized Behavioral Finance

Authors: Roberta Martino, Viviana Ventre

Abstract:

Empirical evidence has, over time, confirmed that the behavior of individuals is inconsistent with the descriptions provided by the Discounted Utility Model, the essential reference for calculating the utility of intertemporal prospects. The model assumes that individuals calculate the utility of intertemporal prospects by adding up the values of all outcomes, each obtained by multiplying the cardinal utility of the outcome by the discount function evaluated at the time the outcome is received. The shape of the discount function is crucial for the preferences of the decision maker because it represents the perception of the future, and its shape produces temporally consistent or temporally inconsistent preferences. In particular, because different formulations of the discount function lead to different conclusions in predicting choice, the descriptive ability of models with a hyperbolic shape is greater than that of linear or exponential models. Choices that are suboptimal from any temporal point of view are the consequence of this mechanism, whose psychological factors are encapsulated in the trend of the discount rate. In addition, analyzing the decision-making process from a psychological perspective, there is an equivalence between the selection of dominated prospects and a degree of impatience that decreases over time. The first part of the paper describes and investigates the anomalies of the discounted utility model by relating the cognitive distortions of the decision maker to the emotional factors generated during the evaluation and selection of alternatives. Specifically, by studying the degree to which impatience decreases, it is possible to quantify how the psychological and emotional mechanisms of the decision maker result in a lack of decision persistence. This description also presents inconsistency as the consequence of an inconsistent attitude towards time-delayed choices. The second part of the paper presents an experimental phase in which we show the relationship between inconsistency and impatience in different contexts. Analysis of the degree to which impatience decreases confirms the influence of the decision maker's emotional impulses for each anomaly of the utility model discussed in the first part of the paper. This work provides an application in the field of personalized behavioral finance. Indeed, the numerous behavioral differences, evident even in the degrees of decrease in impatience observed in the experimental phase, support the idea that optimal strategies may not satisfy all individuals in the same way. With the aim of homogenizing the categories of investors and providing a personalized approach to advice, the results proven in the experimental phase are used in a complementary way with information from the field of behavioral finance to implement the Analytic Hierarchy Process model in intertemporal choices, which is useful for strategic personalization. In the construction of the Analytic Hierarchy Process, the degree of decrease in impatience is understood as reflecting irrationality in decision-making and is therefore used for the construction of weights between anomalies and behavioral traits.
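For reference, the model and the two discount-function families at issue can be written as follows (a standard textbook formulation, not the authors' exact notation):

```latex
U(x_0,\dots,x_T) \;=\; \sum_{t=0}^{T} D(t)\, u(x_t),
\qquad
D_{\mathrm{exp}}(t) = \delta^{t}, \; 0<\delta<1,
\qquad
D_{\mathrm{hyp}}(t) = \frac{1}{1+kt}, \; k>0 .
```

Exponential discounting implies a constant per-period discount rate and hence temporally consistent preferences; the hyperbolic form implies a discount rate, and thus a degree of impatience, that decreases with delay, which is exactly the mechanism behind the preference reversals described above.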

Keywords: analytic hierarchy process, behavioral finance, financial anomalies, impatience, time inconsistency

Procedia PDF Downloads 37
23 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches, such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational legal rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools will perhaps serve to produce very meaningful results in terms of human rights. However, the algorithms to be used should not be developed by computer experts alone; they also need the contribution of people who are familiar with the law, values, judicial decisions, and even the social and political culture of the society for which they will provide solutions. Otherwise, even if the algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field in the discipline of law. More AI systems are already being applied in the field of law, with examples such as the prediction of judicial decisions, text summarization, decision support systems, and the classification of documents. Algorithms for legal systems employing AI tools, especially in the fields of prediction of judicial decisions and decision support systems, have the capacity to create automatic decisions instead of judges. When the judge is removed from this equation, artificial-intelligence-made law, created by an intelligent algorithm on its own, emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' points of view. In some societies, the use of prediction or decision support systems may be useful for integrating international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human-rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial-intelligence-made law would be more protective than a decision "support" system. Since the values of law are directed towards "human happiness or well-being", it is required that AI algorithms always be capable of serving this purpose and be based on the rule of law, the principle of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 44
22 The Digital Transformation of Life Insurance Sales in Iran with the Emergence of Personal Financial Planning Robots: Opportunities and Challenges

Authors: Pedram Saadati, Zahra Nazari

Abstract:

Anticipating and identifying the future opportunities and challenges facing industry actors from the emergence and entry of new knowledge and technologies of personal financial planning, and providing practical solutions, is one of the goals of this research. For this purpose, a futures research tool based on receiving opinions from the main players of the insurance industry was used. The research method in this study comprised four stages: (1) a survey of the specialist life insurance salesforce in order to identify the variables; (2) the ranking of the variables by selected experts via a researcher-made questionnaire; (3) a panel of experts held with the aim of understanding the mutual effects of the variables; and (4) statistical analysis of the mutual-effects matrix in the MICMAC software. The integrated analysis of variables influencing the future was carried out with the method of structural analysis, which is one of the efficient and innovative methods of futures research. A list of opportunities and challenges was identified through a survey of best-selling life insurance representatives, who were selected by snowball sampling. In order to prioritize and identify the most important issues, all the issues raised were sent, via a researcher-made questionnaire, to experts selected through theoretical sampling. The respondents rated the importance of 36 variables through scoring, so that the prioritization of the opportunity and challenge variables could be determined. Eight of the variables identified in the first stage were removed by the selected experts, so that 28 variables remained to be examined in the third stage; to facilitate the examination, these were divided into 6 categories: organization and management (11 variables), marketing and sales (7), social and cultural (6), technological (2), rebranding (1), and insurance (1). The reliability of the researcher-made questionnaire was confirmed with a Cronbach's alpha of 0.96. In the third stage, a panel consisting of 5 insurance industry experts was formed, and the consensus of their opinions about the mutual influence of the factors and the ranking of the variables was entered into the matrix. The matrix included the interrelationships of the 28 variables, which were investigated using the structural analysis method. By analyzing the matrix data with the MICMAC software, the findings indicate that "correct training in the use of the software", "the weakness of insurance companies' technology in personalizing products", "use of the customer-equipping approach", and "honesty in declaring that the customer does not need insurance" are the most important influencing challenges, while "the salesforce-equipping approach", "product personalization based on customer needs assessment", "the customer's pleasant experience of being advised by consulting robots", "business improvement of the insurance company due to the use of these tools", "increased efficiency of the issuance process", and "optimal customer purchase" were identified as the most important influencing opportunities.
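As an illustration of the structural-analysis step, the sketch below computes MICMAC-style influence and dependence scores from a cross-impact matrix; the four variables and all ratings are hypothetical stand-ins, not the study's 28-variable matrix.

```python
import numpy as np

# Hypothetical cross-impact matrix (0 = no effect, 3 = strong effect);
# entry M[i][j] is the rated influence of variable i on variable j.
variables = ["software training", "personalization tech",
             "advisor robots", "customer trust"]
M = np.array([[0, 2, 3, 1],
              [1, 0, 2, 2],
              [2, 3, 0, 3],
              [1, 1, 2, 0]])

influence = M.sum(axis=1)    # row sums: how strongly a variable drives others
dependence = M.sum(axis=0)   # column sums: how strongly it is driven by others

for name, inf, dep in zip(variables, influence, dependence):
    print(f"{name:22s} influence={inf} dependence={dep}")
```

Plotting each variable by (dependence, influence) yields the familiar MICMAC quadrants of driving, linkage, dependent, and autonomous variables, from which the key challenges and opportunities are read off.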

Keywords: personal financial planning, wealth management, advisor robots, life insurance, digital transformation

Procedia PDF Downloads 18
21 Ethical Artificial Intelligence: An Exploratory Study of Guidelines

Authors: Ahmad Haidar

Abstract:

The rapid adoption of Artificial Intelligence (AI) technology carries unforeseen risks like privacy violation, unemployment, and algorithmic bias, triggering research institutions, governments, and companies to develop principles of AI ethics. The extensive and diverse literature on AI lacks an analysis of the evolution of the principles developed in recent years. This paper has two fundamental purposes. The first is to provide insights into how the principles of AI ethics have changed recently, including concepts like risk management and public participation; in doing so, a NOISE (Needs, Opportunities, Improvements, Strengths, & Exceptions) analysis is presented. The second is to offer a framework for building ethical AI linked to sustainability. This research adopts an explorative approach, more specifically, an inductive approach, to address the theoretical gap. Consequently, this paper tracks the different efforts towards "trustworthy AI" and "ethical AI", compiling a list of 12 documents released from 2017 to 2022. The analysis of this list unifies the different approaches toward trustworthy AI in two steps: first, splitting the principles into two categories, technical and net benefit, and second, testing the frequency of each principle, providing the different technical principles that may be useful for stakeholders considering the lifecycle of AI, or what is known as sustainable AI. Sustainable AI is the third wave of AI ethics and a movement to drive change throughout the entire lifecycle of AI products (i.e., idea generation, training, re-tuning, implementation, and governance) in the direction of greater ecological integrity and social fairness. In this vein, the results suggest transparency, privacy, fairness, safety, autonomy, and accountability as recommended technical principles to include in the lifecycle of AI. Another contribution is to capture the different bases that aid the process of AI for sustainability (e.g., towards the sustainable development goals). The results indicate data governance, do no harm, human well-being, and risk management as crucial AI-for-sustainability principles. This study's last contribution is to clarify how the principles evolved. To illustrate, in 2018, the Montreal Declaration mentioned eight principles: well-being, autonomy, privacy, solidarity, democratic participation, equity, and diversity. In 2021, notions emerged from the European Commission proposal, including public trust, public participation, scientific integrity, risk assessment, flexibility, benefit and cost, and interagency coordination. The study design strengthens the validity of previous studies. Yet, we advance knowledge in trustworthy AI by considering recent documents, linking principles with sustainable AI and AI for sustainability, and shedding light on the evolution of guidelines over time.

Keywords: artificial intelligence, AI for sustainability, declarations, framework, regulations, risks, sustainable AI

Procedia PDF Downloads 66
20 Behavioral Analysis of Anomalies in Intertemporal Choices Through the Concept of Impatience and Customized Strategies for Four Behavioral Investor Profiles With an Application of the Analytic Hierarchy Process: A Case Study

Authors: Roberta Martino, Viviana Ventre

Abstract:

The Discounted Utility Model is the essential reference for calculating the utility of intertemporal prospects. According to this model, the value assigned to an outcome decreases as the distance grows between the moment the choice is made and the instant the outcome is received. This diminution determines the intertemporal preferences of the individual, whose psychological significance is encapsulated in the discount rate. The classic model provides a discount rate of linear or exponential form, which is necessary for temporally consistent preferences. Empirical evidence, however, has shown that individuals apply discount rates of a hyperbolic nature, generating the phenomenon of intertemporal inconsistency. What this means is that individuals have difficulty managing their money and their future. Behavioral finance, which analyzes the investor's attitude through cognitive psychology, has made it possible to understand that beyond individual financial competence, there are factors that condition choices because they alter the decision-making process: behavioral biases. Since such cognitive biases are inevitable, to improve the quality of choices, research has focused on a personalized approach to strategies that combines behavioral finance with personality theory. From these considerations emerges the need for a procedure to construct personalized strategies that consider the personal characteristics of the client, such as age or gender, and his or her personality. The work is developed in three parts. The first part discusses and investigates the weight of the degree of impatience and of the decrease in impatience in the anomalies of the discounted utility model. Specifically, the degree of decrease in impatience quantifies the impact that emotional factors generated by haste and financial market agitation have on decision making. The second part considers the relationship between decision making and personality theory. Specifically, four behavioral categories associated with four categories of behavioral investors are considered. This association allows us to interpret intertemporal choice as a combination of bias and temperament. The third part of the paper presents a method for constructing personalized strategies using the Analytic Hierarchy Process. Briefly: the first level of the analytic hierarchy process considers the goal of the strategic plan; the second level considers the four temperaments; the third level compares the temperaments with the anomalies of the discounted utility model; and the fourth level contains the different possible alternatives to be selected. The weights of the hierarchy between level 2 and level 3 are constructed from the degrees of decrease in impatience derived for each temperament in an experimental phase. The results obtained confirm the relationship between temperaments and anomalies through the degree of decrease in impatience and highlight the actual impact of emotions on decision making. Moreover, the paper proposes an original and useful way to improve financial advice. Inclusion of additional levels in the Analytic Hierarchy Process can further improve strategic personalization.
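
The inconsistency at the heart of the paper can be made concrete with the two standard discount functions. Under exponential discounting D(t) = d^t, the ranking of a smaller-sooner versus a larger-later reward never changes with the horizon, while under hyperbolic discounting D(t) = 1/(1 + kt) it can reverse as the rewards draw near. A minimal Python sketch with illustrative parameters, not the paper's experimental values:

# Exponential vs hyperbolic discounting and the resulting preference reversal.
def exponential(t, delta=0.9):
    return delta ** t

def hyperbolic(t, k=0.8):
    return 1.0 / (1.0 + k * t)

small, large = 100, 120            # smaller-sooner vs larger-later, one period apart

for shift in (0, 10):              # choose now, then 10 periods in advance
    for name, D in (("exponential", exponential), ("hyperbolic", hyperbolic)):
        v_small = small * D(shift)
        v_large = large * D(shift + 1)
        choice = "small-sooner" if v_small > v_large else "large-later"
        print(f"delay={shift:2d} {name:11s} -> {choice}")

# The exponential discounter picks the same option at every horizon, while the
# hyperbolic discounter flips to the smaller-sooner reward as it gets close:
# the preference reversal behind intertemporal inconsistency.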

Keywords: analytic hierarchy process, behavioral finance anomalies, intertemporal choice, personalized strategies

Procedia PDF Downloads 71
19 A Comprehensive Finite Element Model for Incremental Launching of Bridges: Optimizing Construction and Design

Authors: Mohammad Bagher Anvari, Arman Shojaei

Abstract:

Incremental launching, a widely adopted bridge erection technique, offers numerous advantages for bridge designers. However, accurately simulating and modeling the dynamic behavior of the bridge during each step of the launching process proves to be tedious and time-consuming. The perpetual variation of internal forces within the deck during construction stages adds complexity, exacerbated further by other load cases, such as support settlements and temperature effects. As a result, there is an urgent need for a reliable, simple, economical, and fast algorithmic solution to model bridge construction stages effectively. This paper presents a novel Finite Element (FE) model that focuses on studying the static behavior of bridges during the launching process. Additionally, a simple method is introduced to normalize all quantities in the problem. The new FE model overcomes the limitations of previous models, enabling the simulation of all stages of launching, which conventional models fail to achieve due to their underlying assumptions. By leveraging the results obtained from the new FE model, this study proposes solutions to improve the accuracy of conventional models, particularly for the initial stages of bridge construction that have been neglected in previous research. The research highlights the critical role played by the first span of the bridge during the initial stages, a factor often overlooked in existing studies. Furthermore, a new and simplified model, termed the "semi-infinite beam" model, is developed to address this oversight. By utilizing this model alongside a simple optimization approach, optimal values for launching nose specifications are derived. The practical applications of this study extend to optimizing the nose-deck system of incrementally launched bridges, providing valuable insights for practical usage. In conclusion, this paper introduces a comprehensive Finite Element model for studying the static behavior of bridges during incremental launching. The proposed model addresses limitations found in previous approaches and offers practical solutions to enhance accuracy. The study emphasizes the importance of considering the initial stages and introduces the "semi-infinite beam" model. Through the developed model and optimization approach, optimal specifications for launching nose configurations are determined. This research holds significant practical implications and contributes to the optimization of incrementally launched bridges, benefiting both the construction industry and bridge designers.
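
The nose-deck trade-off that the optimization targets can be illustrated with elementary statics, independently of the finite element or semi-infinite beam formulations: during launching, the cantilever ahead of the last pier consists of a heavy deck segment plus a lighter steel nose, and a longer nose shortens the deck's lever arm but adds its own weight. A minimal Python sketch with hypothetical loads, not the paper's model:

# Pier-support moment of the launching cantilever vs nose length (all values assumed).
q_deck = 200.0   # deck self-weight, kN/m
span = 50.0      # cantilever length at the critical launching stage, m

def q_nose(L):
    """Assumed nose self-weight per meter: a longer nose must be built heavier."""
    return 20.0 + 0.8 * L

def support_moment(L):
    """Pier moment of the cantilever: rear deck block plus front nose block."""
    deck_len = span - L
    m_deck = q_deck * deck_len ** 2 / 2.0                     # deck block
    m_nose = q_nose(L) * L * (deck_len + L / 2.0)             # nose block lever arm
    return m_deck + m_nose

best_m, best_L = min((support_moment(L), L) for L in range(0, 51))
print(f"minimum pier moment {best_m:.0f} kN*m at nose length {best_L} m")

With these numbers the sweep finds an interior optimum: past a certain length, the nose's own weight outweighs the saving in deck cantilever, which is exactly the balance the nose-deck optimization resolves.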

Keywords: incremental launching, bridge construction, finite element model, optimization

Procedia PDF Downloads 68
18 An Adaptive Oversampling Technique for Imbalanced Datasets

Authors: Shaukat Ali Shahee, Usha Ananthakumar

Abstract:

A data set exhibits the class imbalance problem when one class has very few examples compared to the other class; this is also referred to as between-class imbalance. Traditional classifiers fail to classify the minority class examples correctly due to their bias towards the majority class. Apart from between-class imbalance, within-class imbalance, where classes are composed of different numbers of sub-clusters and these sub-clusters contain different numbers of examples, also deteriorates the performance of the classifier. Previously, many methods have been proposed for handling the imbalanced dataset problem. These methods can be classified into four categories: data preprocessing, algorithm-based methods, cost-based methods, and classifier ensembles. Data preprocessing techniques have shown great potential, as they attempt to improve the data distribution rather than the classifier. A data preprocessing technique handles class imbalance either by increasing the minority class examples or by decreasing the majority class examples. Decreasing the majority class examples leads to loss of information, and when the minority class is absolutely rare, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address both between-class imbalance and within-class imbalance simultaneously. In this paper, we propose a method that handles between-class imbalance and within-class imbalance simultaneously for binary classification problems. Removing both imbalances simultaneously eliminates the bias of the classifier towards bigger sub-clusters by minimizing the domination of bigger sub-clusters in the total error. The proposed method uses model-based clustering to find the presence of sub-clusters or sub-concepts in the dataset. The number of examples oversampled among the sub-clusters is determined based on the complexity of the sub-clusters. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Lowner-John ellipsoid to increase the accuracy of the classifier. In this study, a neural network is used, as this is one classifier in which the total error is minimized, and removing the between-class and within-class imbalance simultaneously helps the classifier give equal weight to all the sub-clusters irrespective of their classes. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods in terms of various accuracy measures. Thus the proposed method can serve as a good alternative for handling various problem domains like credit scoring, customer churn prediction, financial distress, etc., that typically involve imbalanced data sets.
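
The core of the proposal, oversampling sub-cluster by sub-cluster rather than class by class, can be sketched compactly. The sketch below uses a Gaussian mixture for the model-based clustering and a naive equal quota per sub-cluster in place of the paper's complexity-based allocation; the Lowner-John ellipsoid step is omitted:

# Cluster-aware oversampling sketch: find minority sub-clusters, then grow
# each one by SMOTE-style interpolation until its quota is met.
import numpy as np
from sklearn.mixture import GaussianMixture

def oversample_minority(X_min, n_target, n_components=3, seed=0):
    """Grow the minority class toward n_target examples, sub-cluster by sub-cluster."""
    rng = np.random.default_rng(seed)
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(X_min)
    labels = gmm.predict(X_min)
    new_points = []
    quota = n_target // n_components          # naive equal allocation per sub-cluster
    for c in range(n_components):
        cluster = X_min[labels == c]
        if len(cluster) < 2:
            continue                          # too small to interpolate within
        for _ in range(max(quota - len(cluster), 0)):
            a, b = cluster[rng.integers(len(cluster), size=2)]
            new_points.append(a + rng.random() * (b - a))   # interpolate between members
    return np.vstack([X_min, *new_points]) if new_points else X_min

# toy usage: a minority class of 40 points in two sub-clusters of unequal size
rng = np.random.default_rng(1)
X_min = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(6, 1, (10, 2))])
X_grown = oversample_minority(X_min, n_target=300)
print(X_min.shape, "->", X_grown.shape)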

Keywords: classification, imbalanced dataset, Lowner-John ellipsoid, model based clustering, oversampling

Procedia PDF Downloads 390
17 Flexible Design Solutions for Complex Free form Geometries Aimed to Optimize Performances and Resources Consumption

Authors: Vlad Andrei Raducanu, Mariana Lucia Angelescu, Ion Cinca, Vasile Danut Cojocaru, Doina Raducanu

Abstract:

By using smart digital tools, such as generative design (GD) and digital fabrication (DF), highly topical problems concerning resource optimization (materials, energy, time) can be solved, and free-form applications or products can be created. In the new digital technology, materials are active, designed in response to a set of performance requirements, which imposes a total rethinking of old material practices. The article presents the key steps of the design procedure for a free-form architectural object, a column-type object with connections forming an adaptive 3D surface, using the parametric design methodology and exploiting the properties of conventional metallic materials. In parametric design, the form of the created object or space is shaped by varying parameter values, and the relationships between forms are described by mathematical equations. Digital parametric design is based on specific procedures, such as shape grammars, Lindenmayer systems (L-systems), cellular automata, genetic algorithms, or swarm intelligence, each of which has limitations that make it applicable only in certain cases. The paper presents the stages of the design process and the shape-grammar-type algorithm. The generative design process relies on two basic principles: the modeling principle and the generative principle. The generative method is based on a form-finding process that creates many 3D spatial forms, using an algorithm conceived to apply its generating logic onto different input geometry. Once the algorithm is realized, it can be applied repeatedly to generate the geometry for a number of different input surfaces. The generated configurations are then analyzed through a technical or aesthetic selection criterion, and finally the optimal solution is selected. The endless generative capacity of the codes and algorithms used in digital design offers various conceptual possibilities and optimal solutions for the increasing technical and environmental demands of the building industry and architecture. Constructions or spaces generated by parametric design can be specifically tuned to meet certain technical or aesthetic requirements. The proposed approach has direct applicability in sustainable architecture, offering important potential economic advantages, a flexible design (which can be changed until the end of the design process), and unique geometric models of high performance.
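
Among the generative procedures listed, Lindenmayer systems are the easiest to show in a few lines: a rewriting rule applied repeatedly to an axiom yields a family of candidate forms from one compact algorithm. The sketch below uses the classic Koch-curve grammar purely as an illustration; it is not the shape-grammar algorithm developed in the paper:

# Minimal L-system: rewrite the axiom with the rules a fixed number of times.
def l_system(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

koch = {"F": "F+F-F-F+F"}        # F: draw forward; + / -: turn 90 degrees
for n in range(3):
    print(n, l_system("F", koch, n))

# Each generation is a new candidate geometry; a technical or aesthetic
# selection criterion then picks the configuration to develop further.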

Keywords: parametric design, algorithmic procedures, free-form architectural object, sustainable architecture

Procedia PDF Downloads 344
16 The Design and Development of Online Infertility Prevention Education in the Frame of Mayer's Multimedia Learning Theory

Authors: B. Baran, S. N. Kaptanoglu, M. Ocal, Y. Kagnici, E. Esen, E. Siyez, D. M. Siyez

Abstract:

Infertility is the inability of couples to have children despite one year of unprotected sexual life. Infertility can be considered an important problem affecting not only the sexual life but also the social and psychological condition of couples. Learning about the preventable factors related to infertility during the university years plays an important role in preventing possible infertility at older ages. The ease of access to information via the internet has provided the opportunity to reach a broad audience across diverse learning and educational environments. Moreover, the internet has become a basic resource for 21st-century learners. Providing information about infertility over the internet will enable more people to be reached in a short time. When studies conducted abroad about infertility are examined, interactive websites and online education programs come to the fore. In Turkey, there is no comprehensive online education program for university students, and existing efforts seem aimed more at advertising doctors or hospitals. This study aimed to design and develop online infertility prevention education for university students. Mayer's Multimedia Learning Theory provided the framework for the online learning environment. The results of a needs analysis, collected from university students in Turkey selected by sampling to represent the target audience, contributed to the design phase. In this study, a four-week online infertility prevention education environment was designed and developed on the basis of the theoretical framework and the needs analysis results. In developing the online environment, different kinds of visual aids that enhance teaching were used in accordance with Mayer's principles for reducing extraneous processing (the coherence, signaling, spatial contiguity, temporal contiguity, redundancy, and expectation principles), managing essential processing (the segmenting, pre-training, and modality principles), and fostering generative processing (the multimedia, personalization, and voice principles). For example, the important points in the presentation of the reproductive systems were emphasized with visuals to draw learners' attention, and the presentation of the information was supported by a human voice. In addition, because of university students' limited knowledge of the subject, the female and male reproductive systems were taught before the preventable factors related to infertility. Furthermore, a 3D video and an augmented reality application were developed to give concrete form to the female and male reproductive systems. In conclusion, this study aims to develop an interactive online infertility prevention education in which university students can easily access reliable information and evaluate their own level of knowledge about the subject. It is believed that the study will also guide researchers who want to develop online education in this area, as it contains the design-stage decisions of an interactive online infertility prevention education for university students.

Keywords: infertility, multimedia learning theory, online education, reproductive health

Procedia PDF Downloads 139
15 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers

Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala

Abstract:

The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into users' preferences. Instead of presenting plain information, classifying different aspects of browsing, such as bookmarks, history, and downloads, into useful categories would improve and enhance the user's experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources, imposes security constraints, and may miss contextual data during classification. On-device classification solves many of these problems, but the challenge is to achieve classification accuracy under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity, and improving privacy and security. This approach provides more relevant results than current standalone solutions because it uses the content rendered by the browser, which is customized by the content provider based on the user's profile. This paper proposes a lightweight Naive Bayes based classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser's rendering engine. This DOM data is dynamic, contextual, and secure data that cannot be replicated. The proposal extracts different features of the webpage and runs them through an algorithm that classifies the page into one of multiple categories. A Naive Bayes based engine is chosen in this solution for its inherent advantages in using limited resources compared to other classification algorithms like Support Vector Machines, Neural Networks, etc. Naive Bayes classification requires a small memory footprint and little computation, suitable for the smartphone environment. The solution includes a feature to partition the model into multiple chunks, which reduces memory usage compared to loading the complete model. Classification of webpages through the integrated engine is faster, more relevant, and more energy efficient than other standalone on-device solutions. The classification engine has been tested on Samsung Z3 Tizen hardware, integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned. The cleaned dataset has 227.5K webpages, divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with a standalone solution. The solution used 70% of the dataset for training the data model and the remaining 30% for testing. An average accuracy of ~96.3% is achieved across the above-mentioned 8 categories. The engine can be further extended for suggesting dynamic tags and for using the classification in different use cases to enhance the browsing experience.
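
The classification core of such an engine can be sketched in a few lines. The sketch below trains a multinomial Naive Bayes model on token counts; the tiny corpus and the bag-of-words features are illustrative stand-ins for the DOM-derived features and the 227.5K-page dmoztools.net dataset described above:

# Minimal Naive Bayes webpage classifier on token-count features.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

pages = [
    "final score goals match league team",      # sports
    "flight hotel booking destination beach",   # travel
    "exam course university lecture homework",  # education
    "discount cart checkout order delivery",    # shopping
]
labels = ["sports", "travel", "education", "shopping"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(pages, labels)

print(clf.predict(["cheap flight and hotel deals for a beach holiday"]))

Naive Bayes only stores per-class token counts, which is what makes both the small memory footprint and the chunk-wise model partitioning described above feasible on a phone.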

Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification

Procedia PDF Downloads 136
14 Training for Search and Rescue Teams: Online Training for SAR Teams to Locate Lost Persons with Dementia Using Drones

Authors: Dalia Hanna, Alexander Ferworn

Abstract:

This research provides detailed proposed training modules for public safety teams and, specifically, SAR teams responsible for search and rescue operations aimed at finding lost persons with dementia. Finding a lost person alive is the goal of this training, and time matters if a lost person is to be found alive. Finding lost people living with dementia is quite challenging, as they are unaware they are lost and will not seek help. Even a small contribution to SAR operations could contribute to saving a life. SAR operations will always require expert professionals and human volunteers. However, we can reduce their search time, save lives, and reduce costs by providing practical training that is based on real-life scenarios. The content of the proposed training is based on the research work done by the researcher in this area. This research has demonstrated that, by utilizing drones, an algorithmic approach can support a successful search outcome. Understanding the behavior of the lost person, learning where they may be found, predicting their survivability, and automating the search are all contributions of this work, founded in theory and demonstrated in practice. In crisis management, human behavior constitutes a vital aspect of responding to the crisis; the speed and efficiency of the response are often affected by the difficulty of the operational context. Therefore, training in this area plays a significant role in preparing the crisis manager to manage the emotional aspects that shape decision-making in these critical situations. Since it is crucial to be able to make high-level strategic choices and to apply crisis management procedures, simulation exercises become central in training crisis managers to gain the skills needed to respond critically to these events. The training will enhance the responders' ability to make decisions and anticipate possible consequences of their actions through flexible and innovative reasoning in responding to the crisis efficiently and quickly. As adult learners, search and rescue teams will approach training and learning by taking responsibility for the learning process, appreciating flexible learning, and acting as contributors to the teaching and learning happening during that training. These are all characteristics described by adult learning theories: the learner self-reflects, gathers information, collaborates with others, and is self-directed. One of the learning strategies associated with adult learning is effective elaboration, which helps learners remember information in the long term and use it in situations where it might be appropriate. It is also a strategy that can be taught easily and used with learners of different ages. Designers must therefore include reflective activities to improve the learner's intrapersonal awareness.
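
The automated-search idea referred to above can be illustrated with a probability-map planner: grid cells carry a prior probability of containing the lost person (in practice informed by lost-person behavior statistics for dementia), and the drone visits cells in order of probability per unit of travel. A minimal Python sketch with hypothetical priors, not the authors' algorithm:

# Greedy probability-map search planner over a toy grid.
prior = {                      # cell -> prior probability of containment (toy values)
    (0, 0): 0.05, (0, 1): 0.10, (1, 0): 0.20,
    (1, 1): 0.35, (2, 0): 0.10, (2, 1): 0.20,
}
start = (0, 0)

def plan(prior, start):
    """Visit cells greedily, maximizing probability per unit of travel distance."""
    pos, route = start, []
    remaining = dict(prior)
    while remaining:
        def score(cell):
            dist = abs(cell[0] - pos[0]) + abs(cell[1] - pos[1]) + 1
            return remaining[cell] / dist
        nxt = max(remaining, key=score)
        route.append(nxt)
        del remaining[nxt]
        pos = nxt
    return route

print(plan(prior, start))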

Keywords: training, OER, dementia, drones, search and rescue, adult learning, UDL, instructional design

Procedia PDF Downloads 70
13 Liquid Illumination: Fabricating Images of Fashion and Architecture

Authors: Sue Hershberger Yoder, Jon Yoder

Abstract:

“The appearance does not hide the essence, it reveals it; it is the essence.” —Jean-Paul Sartre, Being and Nothingness

Three decades ago, transarchitect Marcos Novak developed an early form of algorithmic animation he called “liquid architecture.” In that project, digitally floating forms morphed seamlessly in cyberspace without claiming to evolve or improve. Change itself was seen as inevitable. And although some imagistic moments certainly stood out, none was hierarchically privileged over another. That project challenged longstanding assumptions about creativity and artistic genius by posing infinite parametric possibilities as inviting alternatives to traditional notions of stability, originality, and evolution. Through ephemeral processes of printing, milling, and projecting, the exhibition “Liquid Illumination” destabilizes the solid foundations of fashion and architecture. The installation is neither worn nor built in the conventional sense, but, like the sensual art forms of fashion and architecture, it is still radically embodied through the logics and techniques of design. Appearances are everything. Surface pattern and color are no longer understood as minor afterthoughts or vapid carriers of dubious content. Here, they become essential but ever-changing aspects of precisely fabricated images. Fourteen silk “colorways” (a term from the fashion industry) are framed selections from ongoing experiments with intricate pattern and complex color configurations. Whether these images are printed on fabric, milled in foam, or illuminated through projection, they explore and celebrate the untapped potentials of the surficial and superficial. Some components of individual prints appear to float in front of others through stereoscopic superimpositions; some figures appear to melt into others due to subtle changes in hue without corresponding changes in value; and some layers appear to vibrate via moiré effects that emerge from unexpected pattern and color combinations. The liturgical atmosphere of “Liquid Illumination” is intended to acknowledge that, like the simultaneously sacred and superficial qualities of rose windows and illuminated manuscripts, artistic and religious ideologies are also always malleable. The intellectual provocation of this paper pushes current thinking about viable applications for fashion print designs and architectural images, challenging the traditional divide between fine art and design. The opportunistic installation of digital printing, CNC milling, and video projection mapping in a gallery normally reserved for fine art exhibitions raises important questions about cultural/commercial display, mass customization, digital reproduction, and the increasing prominence of surface effects (color, texture, pattern, reflection, saturation, etc.) across a range of artistic practices and design disciplines.
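
The moiré effects mentioned above have a simple signal-processing core: overlaying two gratings of slightly different frequencies produces a slow beat fringe that neither grating contains on its own. A minimal Python sketch with illustrative frequencies:

# Moire beat pattern from two superimposed gratings.
import numpy as np

x = np.linspace(0, 1, 2000)
g1 = np.sin(2 * np.pi * 60 * x)          # first grating, 60 cycles
g2 = np.sin(2 * np.pi * 63 * x)          # second grating, slightly denser
overlay = g1 * g2                        # superimpose the two patterns

# sin(a)sin(b) contains a (63 - 60) = 3 cycle envelope: the large-scale
# fringe the eye picks out. Confirm via the spectrum's low-frequency peak.
spectrum = np.abs(np.fft.rfft(overlay))
print("dominant fringe frequency:", np.argmax(spectrum[1:10]) + 1, "cycles")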

Keywords: fashion, print design, architecture, projection mapping, image, fabrication

Procedia PDF Downloads 65
12 Nurturing Students' Creativity through Engagement in Problem Posing and Self-Assessment of Its Development

Authors: Atara Shriki, Ilana Lavy

Abstract:

In a rapidly changing technological society, creativity is considered an engine of economic and social progress. No doubt the education system has a central role in nurturing all students' creativity; however, creativity is normally not encouraged at school. The causes of this reality are related to a variety of circumstances, among them: external pressures to cover the curriculum and succeed in standardized tests that mostly require algorithmic thinking and implementation of rules; teachers' tendency to teach the way they themselves were taught as school students; the association of creativity with giftedness, which leads to avoiding the nurturing of all students' creativity; and a lack of adequate learning materials and accessible tools for following and evaluating the development of students' creativity. Since success in academic studies requires, among other things, creativity, lecturers in higher education institutions should consider appropriate ways to nurture students' creative thinking and assess its development. Obviously, creativity has a multifaceted nature, numerous definitions, various perspectives for studying its essence (e.g., process, personality, environment, and product), and several approaches aimed at evaluating and assessing creative expressions (e.g., cognitive, social-personal, and psychometric). In this framework, we suggest nurturing students' creativity by engaging them in problem-posing activities that are part of inquiry assignments. In order to assess the development of their creativity, we propose to employ a model designed for this purpose, based on the psychometric approach, which views the posed problems as the “creative product”. The model considers four measurable aspects, fluency, flexibility, originality, and organization, as well as a total creativity score that reflects the relative weights of each aspect. The scores given to learners are of two types: (1) total scores, the absolute number of posed problems with respect to each of the four aspects, together with a final creativity score; and (2) relative scores, in which each absolute number is transformed into a number that reflects the relative infrequency of the posed problems in the student's reference group. By converting the scores received over time into a graphical display, students can assess their progress both with respect to themselves and relative to their reference group. Course lecturers can get a picture of the strengths and weaknesses of each student as well as of the class as a whole, and can track changes that occur over time in response to the learning environment they have generated. Such tracking may assist lecturers in making pedagogical decisions about the emphases that should be put on one or more aspects of creativity, and about the students who should be given special attention. Our experience indicates that schoolteachers and lecturers in higher education institutes find that engaging learners in problem posing, combined with self-assessment of their progress through the graphical display of accumulating total and relative scores, has the potential to realize most learners' creative potential.
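
The scoring model lends itself to a compact sketch. Below, each student's posed problems are tagged by problem type; fluency counts problems, flexibility counts distinct types, and originality counts problems of rare types (here, types used in under 10% of the reference group's problems). The weights, data, and rarity threshold are hypothetical, and the organization aspect is omitted for brevity:

# Minimal creativity-scoring sketch over posed problems tagged by type.
from collections import Counter

class_problems = {                   # student -> types of the problems they posed
    "dana": ["reverse", "extend", "extend", "generalize"],
    "omer": ["extend", "extend"],
    "noa":  ["reverse", "what-if", "generalize", "symmetry", "extend"],
}

all_types = Counter(t for ts in class_problems.values() for t in ts)
total = sum(all_types.values())
weights = {"fluency": 0.3, "flexibility": 0.3, "originality": 0.4}  # assumed

for student, types in class_problems.items():
    fluency = len(types)                      # how many problems were posed
    flexibility = len(set(types))             # how many distinct types
    originality = sum(1 for t in types if all_types[t] / total < 0.10)
    score = (weights["fluency"] * fluency + weights["flexibility"] * flexibility
             + weights["originality"] * originality)
    print(f"{student}: fluency={fluency} flexibility={flexibility} "
          f"originality={originality} total={score:.2f}")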

Keywords: creativity, problem posing, psychometric model, self-assessment

Procedia PDF Downloads 281
11 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas

Authors: Sahithi Yarlagadda

Abstract:

The design of an antenna is constrained by mathematical and geometrical parameters. Though there are diverse antenna structures with a wide range of feeds, there are many geometries still to be tried that cannot be fitted into predefined computational methods. Antenna design and optimization qualify for an evolutionary algorithmic approach, since the antenna parameter weights depend directly on geometric characteristics. The evolutionary algorithm can be explained simply for a given quality function to be maximized: we randomly create a set of candidate solutions, elements of the function's domain, and apply the quality function as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered across iterations, but the antenna parameters and geometries are too wide-ranging to fit into a single function. Therefore, weight coefficients are obtained for all possible antenna electrical parameters and geometries, and the variation is learned by mining the data obtained for an optimized algorithm. The weight and covariance coefficients of the corresponding parameters are logged as datasets for learning and future use. This paper drafts an approach to obtaining the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as a test candidate. Antenna parameters like gain, directivity, etc., are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to obtain maxima and minima for the given frequency band. The boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation mainly aimed to study the practical computational, processing, and design complexities that arise during simulation. HFSS is chosen for simulations and results, and MATLAB is used to generate the computations and combinations, and to log the data. MATLAB is also used to apply machine learning algorithms and to plot the data for designing the algorithm. The number of combinations is too large to test manually, so the HFSS API is used to call HFSS functions from MATLAB itself, and MATLAB's parallel processing toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software like HFSS or CST, or a standalone application, to optimize pre-identified common parameters of the wide range of antennas available. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters such as the slotline characteristic impedance, stripline impedance, slotline width, flare aperture size, and dielectric constant; K-means and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data is logged for applying the evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and the machine learning approach for automated antenna optimization for the Vivaldi antenna.
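
The generation-selection loop described above can be sketched independently of the electromagnetic machinery. The sketch below, in Python for illustration (the paper's pipeline drives HFSS from MATLAB), evolves three hypothetical geometry genes; the fitness function is a placeholder standing in for an HFSS-simulated figure of merit:

# Minimal genetic algorithm over hypothetical antenna geometry parameters:
# (slotline width mm, flare aperture mm, taper rate). Bounds and fitness are
# illustrative placeholders, not design values from the paper.
import random

BOUNDS = [(0.1, 2.0), (20.0, 80.0), (0.01, 0.5)]

def fitness(genes):
    """Placeholder for the simulated gain/bandwidth score an HFSS run would return."""
    w, a, r = genes
    return -(w - 0.8) ** 2 - ((a - 55) / 30) ** 2 - (r - 0.12) ** 2

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(p1, p2):
    return [random.choice(pair) for pair in zip(p1, p2)]   # uniform recombination

def mutate(genes, rate=0.2):
    return [min(hi, max(lo, g + random.gauss(0, 0.05 * (hi - lo))))
            if random.random() < rate else g
            for g, (lo, hi) in zip(genes, BOUNDS)]

pop = [random_individual() for _ in range(30)]
for gen in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                            # truncation selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(20)]
best = max(pop, key=fitness)
print("best geometry:", [round(g, 3) for g in best])

In the full pipeline, the placeholder fitness call is where each candidate geometry would be dispatched to the simulator, which is why parallel simulation runs matter: fitness evaluation, not the evolutionary bookkeeping, dominates the cost.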

Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm

Procedia PDF Downloads 84
10 Racial Distress in the Digital Age: A Mixed-Methods Exploration of the Effects of Social Media Exposure to Police Brutality on Black Students

Authors: Amanda M. McLeroy, Tiera Tanksley

Abstract:

The 2020 movement for Black Lives, ignited by anti-Black police brutality and exemplified by the public execution of George Floyd, underscored the dual potential of social media for political activism and perilous exposure to traumatic content for Black students. This study employs Critical Race Technology Theory (CRTT) to scrutinize algorithmic anti-blackness and its impact on Black youth's lives and educational experiences. The research investigates the consequences of vicarious exposure to police brutality on social media among Black adolescents through qualitative interviews and quantitative scale data. The findings reveal an unprecedented surge in exposure to viral police killings since 2020, resulting in profound physical, socioemotional, and educational effects on Black youth. CRTT forms the theoretical basis, challenging the notion of digital technologies as post-racial and neutral, aiming to dismantle systemic biases within digital systems. Black youth, averaging over 13 hours of daily social media use, face constant exposure to graphic images of Black individuals dying. The study connects this exposure to a range of physical, socioemotional, and mental health consequences, emphasizing the urgent need for understanding and support. The research proposes questions to explore the extent of police brutality exposure and its effects on Black youth. Qualitative interviews with high school and college students and quantitative scale data from undergraduates contribute to a nuanced understanding of the impact of police brutality exposure on Black youth. Themes of unprecedented exposure to viral police killings, physical and socioemotional effects, and educational consequences emerge from the analysis. The study uncovers how vicarious experiences of negative police encounters via social media lead to mistrust, fear, and psychosomatic symptoms among Black adolescents. Implications for educators and counselors are profound, emphasizing the cultivation of empathy, provision of mental health support, integration of media literacy education, and encouragement of activism. Recognizing family and community influences is crucial for comprehensive support. Professional development opportunities in culturally responsive teaching and trauma-informed approaches are recommended for educators. In conclusion, creating a supportive educational environment that addresses the emotional impact of social media exposure to police brutality is crucial for the well-being and development of Black adolescents. Counselors, through safe spaces and collaboration, play a vital role in supporting Black youth facing the distressing effects of social media exposure to police brutality.

Keywords: black youth, mental health, police brutality, social media

Procedia PDF Downloads 26