Dissertation/Thesis

Click here to access the files directly from the UNIFEI Digital Library of Theses and Dissertations (Biblioteca Digital de Teses e Dissertações da UNIFEI)

2023
Dissertations
1
  • CLEIDIANA REIS DOS SANTOS
  • Desafio IoT: Serious Game for Immersion in Embedded Software Development in the Context of Smart Homes

  • Advisor: RODRIGO DUARTE SEABRA
  • COMMITTEE MEMBERS:
  • BRUNO GUAZZELLI BATISTA
  • LUIS HENRIQUE NUNES
  • RODRIGO DUARTE SEABRA
  • Date: 13-Feb-2023


  • Abstract
  • The technological evolution provided by the Internet of Things is advancing steadily
    and the demand for professionals increases proportionally. Parallel to
    this reality, the use of games as a learning tool is a suitable practice especially for the
    younger audience, as they present elements of fun and engagement. Serious games
    can help players acquire new experiences and complex knowledge, which are
    obtained by solving the challenges. In this context, the serious game proposed in this
    research, Desafio IoT, aims to provide an overview of some problems and solutions in
    embedded software development for smart homes. In addition to a serious purpose,
    the game seeks to awaken students' interest in the subject, spreading the idea and
    motivating them to work in the development area. The implementation of the game was
    carried out according to the Learning Mechanics – Game Mechanics (LM-GM)
    specification model. In order to investigate the educational impact provided by the
    experience of using the game, besides the questionnaires on usage and technical
    knowledge, the MEEGA+ questionnaire was used to evaluate the game. The results
    allow concluding that the game was able to introduce students to the Internet of Things
    area and motivate them to further their knowledge on the subject. The evaluation of
    the game by the students presented a positive overall result, as well as approval in
    seven of the eight dimensions used in the analysis.

2
  • JONAS HENRIQUE RIBEIRO PAULA
  • Collaborative Committee: The use of a collaborative system in the development of participatory legal instruments

  • Advisor: MELISE MARIA VEIGA DE PAULA
  • COMMITTEE MEMBERS:
  • JONICE OLIVEIRA SAMPAIO
  • ADLER DINIZ DE SOUZA
  • MELISE MARIA VEIGA DE PAULA
  • Date: 24-Feb-2023


  • Abstract
  • The Cities Statute (CS) regulates articles 182 and 183 of the Federal Constitution, establishing a set of participatory legal instruments that, combined, determine how urban policy should be conducted throughout the country. However, the application of the CS represents a challenge for municipal administrations; one of the reasons is the demand for popular participation in the management of urban policy to ensure the equity of citizens' rights. To promote this participation, representative committees are created with members who represent the different segments of society in the municipality. However, the members of the committees involved in the elaboration of the CS instruments, in most cases, have divergent demands on public choices, conflicting agendas and other characteristics that make it difficult for the work to be conducted in a harmonious and efficient way. Therefore, it is always necessary to encourage collaboration so that everyone involved can present their point of view and actively participate. In this way, this work proposes the analysis of the intervention of a collaborative system, called Collaborative Committee, in the activities conducted by committees during the elaboration of participatory legal instruments of the CS. The Collaborative Committee was implemented in real projects conducted by a research and extension group from the Federal University of Itajubá that helps municipalities in the elaboration and revision of these instruments, following the guidelines of the action research methodology. In the analysis of the results, it was possible to notice that, with the intervention of the Collaborative Committee, there was an improvement in the coordination of activities, an advance in the cooperation between those involved and better opportunities for communication between the members.

3
  • IGOR MOREIRA ALVES
  • Ideb, the federation units and the profile of public schools: an unsupervised exploratory data analysis.

  • Advisor: CARLOS HENRIQUE DA SILVEIRA
  • COMMITTEE MEMBERS:
  • CARLOS HENRIQUE DA SILVEIRA
  • MELISE MARIA VEIGA DE PAULA
  • CRISTIANE NERI NOBRE
  • Date: 24-Feb-2023


  • Abstract
  • This dissertation uses data science, through unsupervised exploratory analysis, to confirm what the literature already reports and/or to present new findings about the influence of infrastructure on Brazilian basic education. School infrastructure is not limited to the architecture of school buildings, but also encompasses the educational and administrative environment, equipment, educational resources, practices, curricula and the teaching and learning process. Data collection was carried out on open data from the 2019 School Census (Basic Education) and the Basic Education Development Index (Ideb) for the years 2005-2019. The year 2019 was chosen because it was the last year in which schools presented results before the influence of the COVID-19 pandemic. After several data treatments and the decision to consider only the initial years of fundamental education, two analysis methodologies were applied: the Correlogram and Factor Analysis (FA). For clarity in the results, new attributes referring to the federative entities were created, which allowed identifying which states and school profiles are better related to the growth and good results of the Ideb. For these correlations, the Sigma of the Gaussian Copula was chosen, which accommodates both categorical and continuous data and also generated a positive definite matrix. The Correlogram generated a square matrix that presented the attribute relationships in a heatmap dendrogram. Divided into 4 large groups, each one had specific characteristics and relationships with federative entities. The first group had a strong relationship with basic infrastructure; the second group, with the Ideb and the most sophisticated infrastructures; the third group showed few relationships between the attributes; and the last group had strong negative correlations and the greatest precariousness in infrastructure. After verifying the compatibility of the database for the application of the FA, it was estimated that 10 factors would be suitable for this study. Four factors were associated with the attributes of the Ideb, the focus of this work. Three patterns were also observed in the attributes that linked good results in the Ideb with different infrastructures, policies and/or educational proposals: the first group, led by São Paulo state, presented basic sanitation offered by the public service and quality internet for use in learning in school institutions; the second group, headed by Minas Gerais state, indicates an association with flexibility in traditional teaching, with school cycles and non-serial classrooms; the third group was marked by complementary activities and specialized care, represented by Ceará state. In contrast to these patterns, schools with the EJA modality, mainly in the Northeast, tend to have lower results in the Ideb. The other 6 factors added much relevant information, including correlations and anti-correlations between the federative entities and specific attributes. As seen, data science has much to add to the field of education. Future work is expected to add even more data, such as longitudinal studies on the Ideb, and other educational indices such as the Ioeb and the socioeconomic level of the population.
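The correlogram-plus-factor-analysis workflow described above can be illustrated with a small sketch. This is an editor's illustration on synthetic placeholder attributes: it uses Spearman correlation in place of the Gaussian-copula sigma, seaborn's clustermap for the heatmap dendrogram, and scikit-learn's FactorAnalysis with a smaller factor count than the 10 estimated in the study.

```python
# Minimal sketch of the correlogram + factor-analysis workflow (editor's
# illustration; Spearman stands in for the Gaussian-copula sigma and the
# attribute columns are hypothetical placeholders, not Census/Ideb data).
import numpy as np
import pandas as pd
import seaborn as sns
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 8)),
                  columns=[f"attr_{i}" for i in range(8)])  # placeholder data

# Correlogram rendered as a heatmap with hierarchical clustering (dendrogram).
corr = df.corr(method="spearman")
sns.clustermap(corr, cmap="vlag", center=0).savefig("correlogram.png")

# Factor analysis; 4 factors are used here for the 8 synthetic attributes.
fa = FactorAnalysis(n_components=4, random_state=0).fit(df)
loadings = pd.DataFrame(fa.components_.T, index=df.columns,
                        columns=[f"factor_{i+1}" for i in range(4)])
print(loadings.round(2))
```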

4
  • DOMINGOS SAVIO FARIA PAES
  • Predictive detection of anomalies in computer networks using machine learning.

  • Advisor: BRUNO GUAZZELLI BATISTA
  • COMMITTEE MEMBERS:
  • BRUNO GUAZZELLI BATISTA
  • CARLOS HENRIQUE VALERIO DE MORAES
  • LOURENCO ALVES PEREIRA JUNIOR
  • RAFAEL DE MAGALHAES DIAS FRINHANI
  • Date: 27-Feb-2023


  • Abstract
  • With the increasing daily dependence on technology, the concern with maintaining the
    infrastructures that support its operation, thus guaranteeing a good experience for the
    end user, is evident. Denial of service attacks are among the main causes of anomalies
    in computer networks and can cause degradation or even interruption of services. In
    this context, the application of new technologies, such as artificial intelligence and
    machine learning, becomes increasingly necessary to ensure more agility in detecting
    problems, reducing their impacts. Thus, this work presents a comparison of different
    supervised machine learning classification methods applied to data collected from
    network equipment of the switch type, in order to detect anomalies in the network
    infrastructure of a higher education institution. The machine learning methods used in
    this work were: Decision Tree, Random Forest, Extra Tree, Gradient Boosting, Extreme
    Gradient Boosting and Histogram Gradient Boosting. The models generated from these
    methods showed promise, achieving 99.88% in the Weighted F1 metric and 99.16% in
    Balanced Accuracy. Other points, such as training time, prediction time and saved file
    size, were also taken into account when ranking the best method. Given the importance
    of fault detection tools, this work contributes to the definition of the best approaches
    and thus enables the development of new and more efficient tools for this purpose.
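As a rough illustration of the comparison described above, the sketch below evaluates several of the named classifiers with the same two metrics (Weighted F1 and Balanced Accuracy) on a synthetic imbalanced dataset; Extreme Gradient Boosting is omitted to avoid the external XGBoost dependency, and all data and parameters are placeholders, not the study's.

```python
# Editor's sketch: compare several supervised classifiers on an imbalanced
# synthetic dataset using weighted F1 and balanced accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier, ExtraTreesClassifier,
                              GradientBoostingClassifier,
                              HistGradientBoostingClassifier)
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_validate

# Placeholder data shaped like anomaly detection: few positive (anomalous) samples.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)

models = {
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "RandomForest": RandomForestClassifier(random_state=0),
    "ExtraTrees": ExtraTreesClassifier(random_state=0),
    "GradientBoosting": GradientBoostingClassifier(random_state=0),
    "HistGradientBoosting": HistGradientBoostingClassifier(random_state=0),
}

for name, clf in models.items():
    scores = cross_validate(clf, X, y, cv=5,
                            scoring=["f1_weighted", "balanced_accuracy"])
    print(f"{name:22s} weighted-F1={scores['test_f1_weighted'].mean():.4f} "
          f"balanced-acc={scores['test_balanced_accuracy'].mean():.4f}")
```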

5
  • Otávio Soares Silva
  • Differential Transconductance Amplifier Characterization Based on CMOS Inverters

  • Advisor: RODRIGO APARECIDO DA SILVA BRAGA
  • COMMITTEE MEMBERS:
  • DALTON MARTINI COLOMBO
  • PAULO MARCOS PINTO
  • RODRIGO APARECIDO DA SILVA BRAGA
  • SANDRO CARVALHO IZIDORO
  • Date: 03-Apr-2023


  • Abstract
  • This dissertation presents the Fully Differential Difference Transconductance Amplifier (FDDTA) architecture based on CMOS inverters. Designed on a 130 nm CMOS process, it operates in weak inversion when supplied with 0.25 V. Furthermore, the FDDTA does not require supplemental external calibration circuitry such as bias current or voltage sources, as it relies on the distributed layout technique that intrinsically matches CMOS inverters. For analytical purposes, we performed a detailed investigation that describes all the concepts and the entire functioning of the FDDTA architecture.

6
  • NATALIA SÁNCHEZ SÁNCHEZ
  • Topological Location System using Computer Vision

  • Advisor: GIOVANI BERNARDES VITOR
  • COMMITTEE MEMBERS:
  • RUBEN DARIO HERNANDEZ BELENO
  • GIOVANI BERNARDES VITOR
  • RAFAEL FRANCISCO DOS SANTOS
  • Date: 20-Apr-2023


  • Abstract
  • This paper presents an innovative methodology that uses computer vision techniques to perform the topological localization of an autonomous vehicle. The great advantage of this technique is that it eliminates the use of GPS or any continuous time position sensor, which can significantly increase safety in regions where location sensors are limited or absent. The methodology consists of building a topological map of the region of interest, where the points of interest are defined. To do this, several images of each coordinate are collected and go through filters and processing to form a georeferenced image bank. From there, the system receives as input a video, where the images are compared with the images in the image bank, using the SURF algorithm, to define if there is a correspondence with the coordinates of interest. If a match is identified, the algorithm defines the location of the vehicle on the topological map. The results of the experiments performed show a 91.5% accuracy rate in detecting the points of interest within the topological map, indicating that this methodology can complement the navigation system of an autonomous vehicle efficiently and accurately.
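The matching step described above can be sketched as follows. This is not the thesis implementation: ORB is used as a stand-in because SURF requires the non-free opencv-contrib build, and the image-bank paths, node labels and thresholds are illustrative assumptions.

```python
# Editor's sketch: match a query frame against a georeferenced image bank to
# decide which topological node the vehicle is at (ORB instead of SURF).
import cv2

orb = cv2.ORB_create(nfeatures=1000)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def descriptors(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return orb.detectAndCompute(img, None)[1]

# Hypothetical image bank: reference image -> topological node identifier.
bank = {"node_A.jpg": "node_A", "node_B.jpg": "node_B"}
bank_desc = {node: descriptors(path) for path, node in bank.items()}

def localize(frame_path, min_good_matches=40):
    query = descriptors(frame_path)
    best_node, best_score = None, 0
    for node, ref in bank_desc.items():
        matches = bf.match(query, ref)
        good = [m for m in matches if m.distance < 50]  # empirical threshold
        if len(good) > best_score:
            best_node, best_score = node, len(good)
    return best_node if best_score >= min_good_matches else None

print(localize("current_frame.jpg"))
```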

7
  • LUCAS GOMES DE ALMEIDA
  • Driver’s Behavior Classification in Vehicular Communication Networks for Commercial Vehicles

  • Advisor: BRUNO TARDIOLE KUEHNE
  • COMMITTEE MEMBERS:
  • STEPHAN REIFF-MARGANIEC
  • BRUNO TARDIOLE KUEHNE
  • EDVARD MARTINS DE OLIVEIRA
  • OTÁVIO DE SOUZA MARTINS GOMES
  • Date: 26-May-2023


  • Abstract
  • Vehicles are becoming more intelligent and connected due to the demand for faster, efficient, and safer transportation. For this transformation, it was necessary to increase the
    amount of data transferred between electronic modules in the vehicular network since it is
    vital for an intelligent system’s decision-making process. Hundreds of messages travel all
    the time in a vehicle, creating opportunities for analysis and development of new functions
    to assist the driver’s decision. Given this scenario, the dissertation presents the results of
    research to characterize the driving styles of drivers using information available in the vehicular
    communication network. This master thesis focuses on the process of information extraction from a vehicular network, analysis of the extracted features, and driver classification based on the extracted data. The study aims to identify aggressive driving behavior using real-world data collected from five different trucks running for a period of three months. The driver scoring method used in this study dynamically identifies aggressive driving behavior during predefined time windows by calculating jerk derived from the acquired data. In addition, the K-Means clustering technique was explored to group different behaviors into data clusters. Chapter 2 provides a comprehensive overview of the theoretical framework necessary for the successful development of this thesis. Chapter 3 details the process of data extraction from real and uncontrolled environments, including the steps taken to extract and refine the data. Chapter 4 focuses on the study of features extracted from the preprocessed data, and Chapter 5 presents two methods for identifying or grouping the data into clusters. The results obtained from this study have advanced the state-of-the-art of driver behavior classification and have proven to be satisfactory. The thesis addresses the gap in the literature by using data from real and uncontrolled environments, which required preprocessing before analysis. Furthermore, the study represents one of the pioneering studies conducted on commercial vehicles in an uncontrolled environment. In conclusion, this thesis provides insights into the development of driver behavior classification models using real-world data. Future research can build upon the techniques presented in this study and further refine the classification models. The thesis also addresses the threats to validity that were mitigated and provides recommendations for future research.
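A minimal sketch of the jerk-plus-K-Means idea described above, assuming a timestamped longitudinal-acceleration signal decoded from the vehicle network; column names, window length, cluster count and the synthetic signal are illustrative assumptions, not the thesis's data or parameters.

```python
# Editor's sketch: compute jerk from acceleration, aggregate it over fixed
# time windows, and cluster the windows with K-Means.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

# Placeholder signal: 10 minutes of 10 Hz acceleration data.
t = pd.date_range("2023-01-01", periods=6000, freq="100ms")
df = pd.DataFrame({"timestamp": t,
                   "accel_ms2": np.random.default_rng(0).normal(0, 0.5, 6000)})

# Jerk = time derivative of acceleration.
dt = df["timestamp"].diff().dt.total_seconds()
df["jerk"] = df["accel_ms2"].diff() / dt

# Aggregate per 60-second window and cluster the window statistics.
windows = (df.set_index("timestamp")["jerk"]
             .resample("60s")
             .agg(["mean", "std", lambda s: s.abs().max()]))
windows.columns = ["jerk_mean", "jerk_std", "jerk_abs_max"]

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    windows.fillna(0.0))
windows["cluster"] = labels  # e.g., calm / normal / aggressive windows
print(windows.head())
```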

8
  • DANIEL PAIVA FERNANDES
  • Cost-efficient blockchain application to secure data transmission in heterogeneous FANETs

  • Advisor: JEREMIAS BARBOSA MACHADO
  • COMMITTEE MEMBERS:
  • SIDNEY NASCIMENTO GIVIGI JUNIOR
  • JEREMIAS BARBOSA MACHADO
  • RODRIGO MAXIMIANO ANTUNES DE ALMEIDA
  • SERGIO RONALDO BARROS DOS SANTOS
  • Date: 26-Jun-2023


  • Abstract
  • The development of vehicular networks has found a more fertile scenario with the advancement of ultra-reliable low latency communications (URLLC), the deployment of fifth generation (5G) networks worldwide, the empowerment of edge computing and the adoption of “Internet of Things” solutions in smart cities. To guarantee the success of these networks, it is essential to ensure that the communication process is reliable, safe from malicious actions, and that the solution has low computational complexity and energy consumption.
    Among vehicular networks that can take advantage of these new technologies are FANETs (Flying Ad-Hoc Networks), which can play a critical role in rescue missions and reconnaissance of risk areas. These networks need a solution that guarantees transparency, security and fault tolerance in a decentralised way to function correctly.
    Therefore, the present work proposes a proof-of-concept solution to ensure crash-fault tolerant communication in emulated heterogeneous Flying Ad-Hoc Networks (FANETs) using the Proof of Elapsed Time (PoET) consensus algorithm.
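A toy sketch of the Proof of Elapsed Time (PoET) idea mentioned above: each node draws a wait time and the shortest wait wins leader election. Real PoET relies on a trusted execution environment to certify that the wait actually elapsed; this didactic version only simulates it, and all names and parameters are assumptions.

```python
# Editor's toy illustration of PoET-style leader election among emulated
# FANET nodes (no trusted execution environment, no networking).
import random
import time
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    wait_s: float = 0.0

def elect_leader(nodes, max_wait_s=0.5, seed=None):
    rng = random.Random(seed)
    for node in nodes:
        node.wait_s = rng.uniform(0, max_wait_s)  # TEE-certified timer in real PoET
    winner = min(nodes, key=lambda n: n.wait_s)
    time.sleep(winner.wait_s)  # the elected node "waits out" its timer
    return winner

drones = [Node(f"uav-{i}") for i in range(5)]
leader = elect_leader(drones, seed=42)
print(f"{leader.name} proposes the next block after {leader.wait_s:.3f}s")
```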

9
  • Bianca da Rocha Bartolomei
  • A proposal to support urban planning based on the use of data generated in the elaboration of public policy instruments

  • Advisor: MELISE MARIA VEIGA DE PAULA
  • COMMITTEE MEMBERS:
  • JOSE MARIA NAZAR DAVID
  • ADLER DINIZ DE SOUZA
  • MELISE MARIA VEIGA DE PAULA
  • VANESSA CRISTINA OLIVEIRA DE SOUZA
  • Date: 30-Jun-2023


  • Abstract
  • One of the existing goals in the 2030 Agenda for Sustainable Development is to increase inclusive and sustainable urbanization. Achieving this goal is already considered a challenge, since in Brazil many cities have already gone through, and still go through, a process of expansion and urbanization. In this context, the concept of urban planning emerges, understood as one of the ways to systematize this process, since it allows a better allocation of financial and human resources, in addition to defining actions and objectives in favor of solving collective problems. For this, urban policy instruments are defined, closely related to urban planning, since urban policy instruments are the tools and mechanisms used to implement planning and achieve the goals established for the city. The objective of the research presented in this master's thesis is to investigate solutions that support decision-making in the context of urban planning. For this, data collected in projects for the elaboration and updating of urban policy instruments were analyzed so that these data could be used in the elaboration of a solution. The methodology used was the Design Science Research Methodology (DSRM); the artifact proposed and developed was a decision support system in the form of a visualization panel of information about the spatial composition of a municipality in the interior of Minas Gerais. For this, concepts of geographic data analysis and information visualization were used. The panel was evaluated by a group of potential users, and the hypothesis that the use of data generated by urban policy instruments can help municipal urban planning was corroborated by the responses obtained. With the study, it was possible to highlight the importance of the data considered and the potential of the proposed artifact, and to identify opportunities for future work.

10
  • Iago Felicio Dornelas
  • Application of Laban's basic effort actions in an interactive 2D tool to support choreographic composition

  • Advisor: RODRIGO DUARTE SEABRA
  • COMMITTEE MEMBERS:
  • LINA MARIA GARCES RODRIGUEZ
  • LUCIANA APARECIDA MARTINEZ ZAINA
  • RODRIGO DUARTE SEABRA
  • Date: 12-Jul-2023


  • Abstract
  • The planning of movement in space by choreographers is crucial in choreographic composition, requiring a complex cognitive effort to transform an abstract product into a visual representation. Different means, from symbols and notations to digital tools, have been used to record and simulate movements. However, due to the specific nature of dance and its lack of availability as a technical training in Brazil, the methods consolidated over time, such as the concepts developed by choreographer Rudolf Laban, are not widespread or accessible to professional and amateur choreographers. We thus developed the Move Note tool, which allowed the participants in this research to explore dancers' trajectories through abstract animations. The tool made it possible to apply effects to the dancers' displacements, providing an innovative approach to represent Laban's basic effort actions in a two-dimensional environment. The development of the tool was based on an extensive bibliographic review, analysis of the state of the art and a survey on potential users. In order to investigate whether the application of Laban's concepts in an interactive tool could support choreographic composition, evaluations of users' experiences were carried out, adapted from the TAM (Technology Acceptance Model) and TTF (Task-Technology Fit) models. The results indicated that the tool developed was able to provide adequate support, since the satisfaction rates obtained in the analyses, together with the positive comments from the participants, evidenced the contribution provided by the tool. This work presents contributions both in terms of discussion about the interpretation of the data collected and reflection on the practical relevance of the research theme. Additionally, it introduces to the academic community a model of representation of Laban's basic effort actions in a two-dimensional environment, thus expanding the possibilities of research and application of these concepts to the fields of dance and technology.

11
  • MOISÉS PINHEIRO SOUZA
  • ProtCool 2.0: a client/server model for a docking protocol generator and molecular dynamics simulations in protein-ligand complex.

  • Advisor: CARLOS HENRIQUE DA SILVEIRA
  • COMMITTEE MEMBERS:
  • KARINA DOS SANTOS MACHADO
  • CARLOS HENRIQUE DA SILVEIRA
  • RODRIGO APARECIDO DA SILVA BRAGA
  • Date: 18-Jul-2023


  • Abstract
  • The COVID-19 pandemic has made clear the high demand for computational systems
    that expedite the discovery of new drugs. In this regard, understanding the dynamic
    behavior of biomolecular complexes is crucial. Techniques involving molecular
    dynamics simulations of these complexes have increasingly been used to accelerate the
    identification of better drug candidates. However, the preparation of such simulations is
    highly complex, and their numerous details are not always adequately emphasized,
    compromising their reproducibility and reusability. To address this, the ProtCool tool was
    proposed—a protocol generator focused on integrating docking and molecular dynamics
    of protein-ligand complexes. In its initial version, this tool was restricted to the user's
    local environment. This work presents version 2.0 of ProtCool, developed under a client-server model with a web interface. The aim is to fill the gaps left by the previous version,
    enhancing the software in three fundamental aspects: making it multi-platform, enabling
    access to multiple users, and making the tool more intuitive. The development of a user-friendly interface allows this new version to expand its scope of use to inexperienced or
    novice researchers in computational chemistry. ProtCool 2.0 does not execute the
    dynamics or perform result analyses; it is designed to be an expert in preparation based
    on workflows, following the programmed workflow and generating all the necessary
    configuration files for reliable execution of molecular dynamics on the user's
    computational setup. With the entire process being properly recorded, this allows for
    greater reproducibility and reusability of the preparations. Many of its functionalities are
    based on the adaptation of well-known tools from the literature in the field of molecular
    dynamics simulations. ProtCool 2.0 was developed using best practices and software
    engineering processes. Its client-server architecture implemented under the web standard
    enables it to be cross-platform and multi-user, providing benefits in availability and
    performance. It features a minimalist graphical interface with interactive resources that
    ensure user safety in correctly filling out their study parameters, thus preventing errors.
    To demonstrate its use, ProtCool 2.0 underwent a case study on the simulation of the
    interaction between acetylcholinesterase and galantamine, used in the treatment of
    Alzheimer's disease, which allowed for validation through replication of a simulation
    certified by peers in an international publication. The preparation of the simulation
    successfully enabled the execution of reliable molecular dynamics, reproducing the
    expected results. It is expected that this tool will not only bring greater speed,
    reproducibility, and reusability to molecular dynamics preparations but also contribute to
    smoothing the learning curve for these simulations in computational chemistry.

12
  • THIAGO MOREIRA DE FREITAS
  • Genetic Algorithms Applied to the Vehicle Routing Problem with Multiple Depots

  • Advisor: RAFAEL FRANCISCO DOS SANTOS
  • COMMITTEE MEMBERS:
  • FERNANDO BERNARDES DE OLIVEIRA
  • RAFAEL FRANCISCO DOS SANTOS
  • SANDRO CARVALHO IZIDORO
  • Date: 30-Aug-2023


  • Abstract
  • The Vehicle Routing Problem (VRP) has wide applications in logistics and transportation, with great economic importance. The VRP is a generalization of a large number of routing problems, which consist of finding the optimal number of routes, leaving a single depot, to serve a set of customers, minimizing routing costs and meeting a set of constraints. The Multi Depot Vehicle Routing Problem (MDVRP) is an extension of the VRP in which there is more than one depot distributed in a given geographic area; the rest of the problem is identical to the VRP. There are several methods for solving the MDVRP, such as exact techniques, approximate algorithms and heuristics. Genetic Algorithms (GAs) are meta-heuristics widely used to find solutions to the MDVRP due to their stochastic characteristics and their efficiency in solving combinatorial problems, and for this reason they were selected for this work. The developed algorithm was tested on instances from the literature and compared with existing methodologies; the genetic algorithm found good results, and the work contributed a technique for selecting customers that can be exchanged between depots. The results achieved show that this algorithm can be evaluated in real projects, making it possible to improve the operation of projects that face this type of problem, reducing transportation costs, distance, delivery time and services, among other benefits.
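A minimal sketch of one possible MDVRP encoding and of the customer-exchange idea mentioned above (moving a customer between depots as a mutation). Coordinates, operators and parameters are illustrative assumptions; the thesis's actual GA design may differ.

```python
# Editor's sketch: a chromosome assigns each customer to a depot and orders
# that depot's customers; fitness is total Euclidean route length; mutation
# exchanges a customer between depots (mutation-only loop, for brevity).
import math
import random

random.seed(0)
depots = [(0.0, 0.0), (10.0, 10.0)]
customers = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(12)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def random_chromosome():
    assign = [[] for _ in depots]          # one visiting order per depot
    for c in range(len(customers)):
        assign[random.randrange(len(depots))].append(c)
    for route in assign:
        random.shuffle(route)
    return assign

def fitness(chrom):                        # lower is better
    total = 0.0
    for d, route in enumerate(chrom):
        stops = [depots[d]] + [customers[c] for c in route] + [depots[d]]
        total += sum(dist(stops[i], stops[i + 1]) for i in range(len(stops) - 1))
    return total

def mutate_exchange(chrom):                # move a customer to another depot
    new = [route[:] for route in chrom]
    src = random.choice([i for i, r in enumerate(new) if r])
    dst = random.randrange(len(new))
    new[dst].append(new[src].pop(random.randrange(len(new[src]))))
    return new

best = random_chromosome()
for _ in range(500):
    cand = mutate_exchange(best)
    if fitness(cand) < fitness(best):
        best = cand
print(f"best total route length: {fitness(best):.2f}")
```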

13
  • FERNANDO HIDEKI TAKENAKA
  • Review Summarizer Using TextRank and Topic Modeling

  • Advisor: LAERCIO AUGUSTO BALDOCHI JUNIOR
  • COMMITTEE MEMBERS:
  • ISABELA NEVES DRUMMOND
  • LAERCIO AUGUSTO BALDOCHI JUNIOR
  • RAFAEL DUARTE COELHO DOS SANTOS
  • Date: 31-Aug-2023


  • Abstract
  • Over the past decade, the Internet has changed the way people work, shop and socialize. Those changes resulted in the increase of User Generated Content (UGC) such as ratings, reviews, wikis, and videos. UGC contains relevant information for decision-making, especially with regard to the acquisition of goods and services. However, the large volume and dispersion of this content makes it difficult to obtain relevant information. Text summarization appears as a way to make this content more accessible to people.
    A summary A can be considered better than another B when A is shorter than B while maintaining the same content relevance, or when A, despite being longer, presents more relevant content. Analyzing the literature, we observed that it is possible to produce better quality summaries than those produced by algorithms that correspond to the state of the art in text summarization. We present a multilingual automatic text summarizer that combines and extends the algorithms Latent Dirichlet Allocation (LDA) and TextRank. Our approach, when compared to the state of the art, generates better text summaries in terms of size and content relevance.
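A minimal sketch of the TextRank half of the approach: sentences become graph nodes, TF-IDF cosine similarity weights the edges, and PageRank scores pick the summary sentences. The LDA topic-modeling stage and the multilingual handling described above are not reproduced here.

```python
# Editor's sketch of extractive summarization via TextRank (graph-based
# sentence ranking); the example reviews are placeholders.
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def textrank_summary(sentences, n=2):
    tfidf = TfidfVectorizer().fit_transform(sentences)
    sim = cosine_similarity(tfidf)                 # sentence similarity graph
    scores = nx.pagerank(nx.from_numpy_array(sim))
    ranked = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
    return [sentences[i] for i in sorted(ranked[:n])]   # keep original order

reviews = [
    "The battery lasts two full days of heavy use.",
    "Shipping was fast and the packaging was fine.",
    "Battery life is excellent, easily a weekend without charging.",
    "The screen scratches too easily for the price.",
]
print(" ".join(textrank_summary(reviews, n=2)))
```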

14
  • THIAGO SALES FREIRE LUZ
  • Analysis and comparison of ensemble classification algorithms for exoplanet discovery

  • Advisor: ENIO ROBERTO RIBEIRO
  • COMMITTEE MEMBERS:
  • ENIO ROBERTO RIBEIRO
  • ROBERTO SILVA NETTO
  • RODRIGO APARECIDO DA SILVA BRAGA
  • SANDRO CARVALHO IZIDORO
  • Date: 28-Sep-2023


  • Abstract
  • Exoplanets are planets discovered outside our solar system. Their discovery happens because of scientific work with telescopes such as Kepler. The data collected by Kepler are known as Kepler Objects of Interest. Machine Learning algorithms are trained to classify these data into exoplanets or non-exoplanets. An Ensemble Algorithm is a type of Machine Learning technique that combines the prediction performance of two or more algorithms to gain an improved final prediction. The current works on exoplanet identification use mostly traditional non-Ensemble algorithms. Therefore, research that uses Ensemble algorithms for exoplanet identification is scarce. This paper performs a comparison among some Ensemble algorithms on the exoplanet identification process. Each algorithm is implemented with a set of different values for its parameters and executed multiple times. All executions are performed with the cross-validation method. A confusion matrix is created for each algorithm implementation. The results of each confusion matrix provided data to evaluate the following algorithm performance metrics: accuracy, sensitivity, specificity, precision, and F1 score. The Ensemble algorithms achieved an average performance of more than 80% in all metrics. Changing the default values of the Ensemble algorithms' parameters improved their predictive performance. The algorithm with the best performance is Stacking. In summary, the Ensemble algorithms have great potential to improve exoplanet prediction. The Stacking algorithm achieved a higher performance than the other algorithms; this aspect is discussed in the text. The results of this work show that it is reasonable to increase the use of Ensemble algorithms, given their high prediction performance in exoplanet identification.
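A minimal sketch of the evaluation loop described above: a Stacking ensemble, cross-validated predictions, and the five confusion-matrix metrics. The synthetic binary dataset and the chosen base estimators are placeholders for the Kepler Object of Interest data and the algorithms compared in the thesis.

```python
# Editor's sketch: stacking ensemble + cross-validation + confusion-matrix
# metrics (accuracy, sensitivity, specificity, precision, F1).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=1500, n_features=15, random_state=0)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("svc", SVC(probability=True, random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000))

y_pred = cross_val_predict(stack, X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()

accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)          # recall for the positive class
specificity = tn / (tn + fp)
precision = tp / (tp + fp)
f1 = 2 * precision * sensitivity / (precision + sensitivity)
print(f"acc={accuracy:.3f} sens={sensitivity:.3f} spec={specificity:.3f} "
      f"prec={precision:.3f} f1={f1:.3f}")
```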

15
  • RENATO FIGUEIREDO FRADE
  • Temporal and spatial characterization of street robberies contrasting pre-pandemic and pandemic contexts

  • Advisor: CARLOS HENRIQUE DA SILVEIRA
  • COMMITTEE MEMBERS:
  • ADRIANO VELASQUE WERHLI
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • CARLOS HENRIQUE DA SILVEIRA
  • Date: 09-Dec-2023


  • Abstract
  • A multidisciplinary study conducted in Minas Gerais investigated the temporal dynamics
    of street robberies, analyzing both pre-pandemic and pandemic periods. Utilizing data
    from the Military Police, time series data were examined at various scales, including
    hourly, daily, 10-day intervals, and monthly, employing advanced statistical methods such
    as spectral frequency analysis, autocorrelations, and decomposition techniques.
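A minimal sketch of the time-series toolkit named above (decomposition, autocorrelation and spectral analysis), applied to a synthetic daily count series standing in for the occurrence data.

```python
# Editor's sketch: decomposition, autocorrelation and periodogram on a
# synthetic daily series with a weekly cycle (placeholder for robbery counts).
import numpy as np
import pandas as pd
from scipy.signal import periodogram
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)
idx = pd.date_range("2018-01-01", "2020-12-31", freq="D")
weekly = 5 * np.sin(2 * np.pi * np.arange(len(idx)) / 7)   # weekly cycle
series = pd.Series(50 + weekly + rng.normal(0, 3, len(idx)), index=idx)

decomp = seasonal_decompose(series, period=7)       # trend / seasonal / resid
autocorr = acf(series, nlags=30)
freqs, power = periodogram(series.values, fs=1.0)   # cycles per day

print("seasonal profile:", decomp.seasonal[:7].round(2).tolist())
print("lag-7 autocorrelation:", round(autocorr[7], 3))
print("dominant period (days):", round(1 / freqs[np.argmax(power[1:]) + 1], 1))
```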

16
  • BRENO DE OLIVEIRA RENÓ
  • Conf-eHealth - An Architecture for Developing eHealth Applications with Trustworthiness

  • Advisor: EDVARD MARTINS DE OLIVEIRA
  • COMMITTEE MEMBERS:
  • HENRIQUE YOSHIKAZU SHISHIDO
  • BRUNO TARDIOLE KUEHNE
  • EDVARD MARTINS DE OLIVEIRA
  • Date: 13-Dec-2023


  • Abstract
  • This work presents the Conf-eHealth architecture, which aims to be a reference model for developing eHealth applications with trustworthiness. The architecture is relevant due to the need for technologies related to patient monitoring. Initially, a systematic literature review was carried out on the state of the art of eHealth applications and the main challenges faced in their development. After the literature review, the proposed reference architecture is described. The conception of needs, the quality attributes and the methodology used to build the architecture are described, and subsequently the architecture is presented through the concept of architectural views.

    In order to guarantee the desired quality attributes, the work presents the evaluation of the proposed architecture in two parts. First, an architectural evaluation was carried out based on the Software Architecture Analysis Method, resulting in a conceptual understanding of the architecture's ability to encompass the quality attributes. Afterwards, an experiment was carried out, involving the construction of a mobile application according to the component mapped in the proposed architecture, demonstrating the possibility of implementing this component using a real database. The results show that the application is capable of handling the data received and assisting in decision making accurately. Finally, the conclusions of the work are presented, highlighting the results achieved and the importance of the Conf-eHealth architecture for the advancement of the area, and indicating future work.

17
  • ANDRE LUIZ ALVES DIAS
  • Detection of Dissatisfaction in Public Servants with Artificial Intelligence

  • Advisor: CARLOS HENRIQUE VALERIO DE MORAES
  • COMMITTEE MEMBERS:
  • AHMED ALI ABDALLA ESMIN
  • CARLOS HENRIQUE VALERIO DE MORAES
  • JOAO PAULO REUS RODRIGUES LEITE
  • Date: 15-Dec-2023


  • Abstract
  • This work presents a comprehensive investigation into the application of Artificial Intelligence (A.I.) in human resources management, with a specific focus on identifying employee dissatisfaction through machine learning approaches. The research included a review of scientific articles discussing both the implementation of A.I. in the context of human resources and the use of machine learning techniques to detect cases of turnover/attrition, along with the relationship between dissatisfaction and turnover/attrition cases.
    To assess these approaches, four validated public databases were selected. Three of them contained fictional employee data, and one contained real employee turnover data. Each database underwent a process of textual field factorization, followed by analyses to highlight the data distributions in each set.
    In conducting the research, different machine learning approaches were applied to each of the databases, aiming to verify the feasibility of identifying dissatisfaction through A.I. The techniques used included anomaly or novelty detection, classifiers, and optimized sets of classifiers. The results were quantified, revealing promising scores, with performances exceeding 90%. These results emphasize the overall effectiveness of machine learning in identifying employee dissatisfaction, demonstrating its potential for practical applications in the human resources environment.
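A minimal sketch of the anomaly-detection angle described above: an Isolation Forest flags employees whose factorized records deviate from the bulk of the workforce. The columns, values and contamination rate are illustrative assumptions, not the databases used in the study.

```python
# Editor's sketch: flag potential dissatisfaction as anomalies in factorized
# employee records (hypothetical columns and values).
import pandas as pd
from sklearn.ensemble import IsolationForest

df = pd.DataFrame({
    "satisfaction_score": [0.8, 0.7, 0.9, 0.2, 0.75, 0.1, 0.85, 0.65],
    "overtime_hours":     [2,   5,   1,   30,  4,    28,  3,    6],
    "years_at_company":   [4,   6,   3,   1,   5,    1,   7,    2],
})

model = IsolationForest(contamination=0.25, random_state=0).fit(df)
df["flagged"] = model.predict(df) == -1   # -1 marks potential dissatisfaction
print(df)
```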

18
  • CHARLY BRAGA VENTURA
  • Identification of miner and rust in coffee plants using digital image processing and convolutional neural networks

  • Advisor: SANDRO CARVALHO IZIDORO
  • COMMITTEE MEMBERS:
  • RODRIGO APARECIDO DA SILVA BRAGA
  • SANDRO CARVALHO IZIDORO
  • VALDETE MARIA GONÇALVES DE ALMEIDA
  • Date: 15-Dec-2023


  • Abstract
  • The world’s demand for coffee increases every year, reaching 178.5 million 60 kg bags in
    the period 2022-2023, an increase of 1.7% compared to the previous period 2021-2022.
    The total production of Brazil’s coffee harvest in 2022 was calculated at 50.92 million bags
    of 60 kg of processed coffee, thus making it the world’s largest producer of the product.
    With this production volume, there is a growing need to improve product quality due
    to the demands of national and international markets. However, pests such as leaf miner
    and rust cause extensive damage to coffee plantations, resulting in crop losses annually.
    Various methods and techniques have been developed and applied to assess the level of
    infestation and control of these pests. Among these techniques are the use of computer
    vision and convolutional neural networks (CNNs). Thus, the objective of this work was to
    develop computational tools to correctly identify the presence of pests, reducing evaluation
    time, evaluator error, and labor costs. The methods developed achieved accuracies
    between 97.00% and 99.67%. In addition, a tool was developed to quantify the degree of
    infestation, achieving an accuracy of 86.67%.
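A minimal sketch of a small CNN for the three-class task implied above (healthy leaf, leaf miner, rust), assuming images already organized into class subfolders; directory names, image size and the architecture are illustrative assumptions rather than the networks developed in the work.

```python
# Editor's sketch: a compact CNN for 3-class leaf classification with Keras.
# "train_dir" and "val_dir" are hypothetical folders with one subfolder per class.
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "train_dir", image_size=(128, 128), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "val_dir", image_size=(128, 128), batch_size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # healthy / miner / rust
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```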

19
  • RAFFAEL CLEISSON DE CARVALHO
  • An analysis of the application of Nudge in public consultations

  • Advisor: MELISE MARIA VEIGA DE PAULA
  • COMMITTEE MEMBERS:
  • PABLO VIEIRA FLORENTINO
  • ADLER DINIZ DE SOUZA
  • MELISE MARIA VEIGA DE PAULA
  • Date: 18-Dec-2023


  • Abstract
  • Public consultation is a direct democracy device that aims to allow the population to participate in decision-making in various areas of public policy. In Brazil, the Fiscal Responsibility Law establishes that a public consultation must precede the preparation of the public budget, which aims to regulate the state's financial revenue and expenditure activities for one year. This example of public consultation is called Participatory Budgeting. However, population engagement is still a problem for several reasons. Therefore, it is important to investigate strategies that increase citizen participation. In some scenarios, Nudge theory can be used. Nudges are small, low-investment modifications made to a person's environment to change their behavior. The application of Nudge in this research aims to encourage citizens to participate in a public consultation context. To enable the analysis, the scope of this research included proposing and developing a tool. The application was analyzed in a real scenario, which allowed us to conclude that the results of applying Nudge were adequate.

2022
Dissertations
1
  • GUILHERME XAVIER FERREIRA
  • Visual Analysis for Industrial Vehicle Fleets for Prospecting Logistics Solutions Using a Model-Oriented Approach

  • Advisor: BRUNO GUAZZELLI BATISTA
  • COMMITTEE MEMBERS:
  • ADLER DINIZ DE SOUZA
  • BRUNO GUAZZELLI BATISTA
  • MELISE MARIA VEIGA DE PAULA
  • SERGIO ASSIS RODRIGUES
  • Date: 03-Mar-2022


  • Abstract
  • Fleet planning and management is the subject of optimization research, where vehicle fleet models are created to assist managers in short, medium and long-term decision making. In steel companies, industrial vehicles are used to move heavy materials. Planning and optimizing the number of vehicles in the fleet is important to reduce costs and fulfill production plans. Logistics analysts are responsible for these activities, using systems to monitor industrial vehicle operations. However, the analysis and evaluation of fleet utilization is still challenging, mainly due to the volume and form in which data is generated by the systems. This kind of problem is generally outside the scope of fleet planning optimization models. In databases with high volume and complexity, visual analysis is an alternative for extracting insights and important information with the application of information visualization and data analysis techniques. This dissertation investigates visual analysis as a means of assisting in the planning of industrial vehicle fleets. The research was conducted through the investigation of a real context of industrial vehicle planning through the lens of Model Building Visual Analytics and Design Science Research (DSR). As a result, two artifacts were developed: a fleet measurement model, to address the lack of a fleet measurement standard; and Fleet Profile, a solution to support fleet planning based on a visual description of the fleet. Two research cycles were carried out, each producing a version of the Fleet Profile. At the end, the Fleet Profile was evaluated on two real fleet cases based on the experience of potential users with the solution. From the analysis of the qualitative data, it was found that the solution was able to support the evaluation of fleet use and other activities of the analysts from the visual description of the fleet.

2
  • LEANDRO DINIZ DE JESUS
  • Transmission line inspection using low-cost multi-drones.

  • Advisor: ALEXANDRE CARLOS BRANDAO RAMOS
  • COMMITTEE MEMBERS:
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • HILDEBRANDO FERREIRA DE CASTRO FILHO
  • LINA MARIA GARCES RODRIGUEZ
  • Date: 15-Jun-2022


  • Abstract
  • Electric power transmission lines need to be inspected periodically to avoid future
    problems or to resolve problems that have already been noticed. Companies try to carry
    out these inspections as safely and inexpensively as possible, without putting anyone at
    risk, and for this they increasingly use drones. However, these drones still need people
    nearby to operate their navigation, which often makes the inspection less efficient and
    allows problems to go unnoticed. To address the problem of detecting possible faults on
    the line, this work presents a solution using three drones that perform an autonomous
    flight over the line. The first drone travels to the first tower and, using a camera and
    a convolutional neural network, detects the tower and any problem found on it,
    recording the location of the faulty tower so that someone can perform maintenance on
    it. The second drone is responsible for checking the wires that conduct the electrical
    energy, and the third drone is responsible for detecting whether there is something
    under the line that is at risk or could cause future problems for the line.
    The solution was implemented in the AirSim simulator, using an environment with a
    high-voltage line already made available by the software for autonomous navigation
    tests. The navigation of the drones was based on the positions of the towers, whose
    latitude and longitude are already known.

3
  • DJENANE CRISTINA SILVEIRA DOS SANTOS
  • Proposal for a decision support tool: towards a smarter city

  • Advisor: ADLER DINIZ DE SOUZA
  • COMMITTEE MEMBERS:
  • ANGELA MARIA ALVES
  • ADLER DINIZ DE SOUZA
  • MELISE MARIA VEIGA DE PAULA
  • Date: 27-Jul-2022


  • Abstract
  • The strategic planning of a city should preferably be based on the Municipal Master Plan (PDM), which must be prepared from the perspective of sustainable urban development and contribute to the achievement of the 17 Sustainable Development Goals (SDGs) of the United Nations 2030 Agenda. The development of smart cities is a global phenomenon that is closely related to the 17 SDGs. Most Brazilian cities are still far from what can be considered a smart city and some do not even know which way to go to evolve in this scenario. In view of the above, the present work proposes a tool that aims to support municipalities in the quest to become smarter. At first, the tool aims to support the elaboration and review of municipal master plans and other urban development plans prepared to implement the master plan, through an analysis of the municipality's current situation in relation to the Brazilian standard NBR ISO 37122 for smart cities. In a second step, the tool aims to enable the management and balancing of the portfolio of candidate projects to be executed by the city, indicating the projects, programs and portfolios that will maximize the reach of the ISO 37122 indicators and of the SDGs. The tool will make it possible to analyze how the portfolios of candidate projects to be executed are in fact aligned with the PDM strategies. The case study and validation of the tool brought positive results, and one can see the real need for and importance of the tool developed in this work for public management. The main objective was achieved: to propose and validate a methodology and a tool that allow (i) the diagnosis of cities in relation to the indicators of ISO 37122 and the SDGs, and (ii) support for the decision making of municipal managers in selecting and balancing the project portfolio (projects, programs and portfolios) that maximizes the chances of making the city smarter.

4
  • FELIPE AUGUSTO FEICHAS
  • EVALUATION OF PERCEPTION OF USE OF A GAMIFIED PLATFORM FROM THE STUDENT PERSPECTIVE: AN APPROACH IN THE UML STUDY

  • Advisor: RODRIGO DUARTE SEABRA
  • COMMITTEE MEMBERS:
  • LUCIA MARIA MARTINS GIRAFFA
  • LINA MARIA GARCES RODRIGUEZ
  • RODRIGO DUARTE SEABRA
  • Date: 29-Jul-2022


  • Abstract
  • Software modeling is considered one of the most important topics in software engineering education. Currently, the Unified Modeling Language (UML) is the most widespread and used software modeling language in the software engineering industry. Although the UML is constantly being improved and studied, many studies show that there is difficulty in teaching and learning the subject, due to the complexity of its concepts and the students' cognitive difficulties with abstraction. Also, students face difficulties in understanding the semantics and syntax of models, as well as in structuring the information in these models. In addition, teachers have difficulty finding different pedagogical strategies for teaching modeling. In this sense, some studies search for new tools, techniques or methodologies that help teachers and motivate students regarding the study of UML. This work proposed the development of a web platform to support the study of software modeling with the UML, using gamification resources. The proposed platform allowed students to complement their UML knowledge in an environment with game elements. Aiming to investigate the impact of using the developed gamified platform, a case study was carried out to evaluate the user experience and satisfaction from the student perspective. From the results, it can be concluded that the platform obtained great acceptance and satisfaction of use. Most of the students participating in the research were satisfied with the usability of the platform, reporting that the tool contributed to the study of the content, in addition to pointing out their satisfaction with using gamification as a pedagogical strategy. As a result, the platform was effective in terms of engaging and motivating students, being a complement to the traditional teaching method.

2021
Dissertations
1
  • Caio Pinheiro Santana
  • A Meta-Analysis of Machine Learning-Based Classification Techniques Using rs-fMRI Data for the Diagnosis of Autism Spectrum Disorder

  • Advisor: GUILHERME SOUSA BASTOS
  • COMMITTEE MEMBERS:
  • RICARDO ZORZETTO NICOLIELLO VENCIO
  • ADLER DINIZ DE SOUZA
  • GUILHERME SOUSA BASTOS
  • LUCELMO LACERDA DE BRITO
  • RAFAEL DE MAGALHAES DIAS FRINHANI
  • Date: 24-Feb-2021


  • Abstract
  • The Autism Spectrum Disorder (ASD) is a complex and heterogeneous neurodevelopmental condition characterized by cognitive, behavioral, and social dysfunction. Much effort is being made to identify brain imaging biomarkers and develop tools that could facilitate its diagnosis - currently based on behavioral criteria through a lengthy and time-consuming process. In particular, the use of Machine Learning (ML) classifiers based on resting-state functional Magnetic Resonance Imaging (rs-fMRI) data is promising, but there is an ongoing need for further research on their accuracy. Therefore, we conducted a systematic review and meta-analysis to summarize and aggregate the available evidence in the literature so far. The systematic search resulted in the selection of 93 articles, whose data were extracted and analyzed through the systematic review. A bivariate random-effects meta-analytic model was implemented to investigate the sensitivity and specificity across the 55 studies (132 independent samples) that offered sufficient information for a quantitative analysis. Our results indicated overall summary sensitivity and specificity estimates of 73.8% (95% CI: 71.8-75.8%) and 74.8% (95% CI: 72.3-77.1%), respectively, and Support Vector Machine (SVM) stood out as the most used classifier, presenting summary estimates above 76%. Studies with bigger samples tended to obtain worse accuracies, except in the subgroup analysis for Artificial Neural Network (ANN) classifiers. The use of other brain imaging or phenotypic data to complement rs-fMRI information seems to be promising, achieving especially higher sensitivities (p = 0.002) when compared to rs-fMRI data alone (84.7% - 95% CI: 78.5-89.4% - versus 72.8% - 95% CI: 70.6-74.8%). Lower values of sensitivity/specificity were found when the number of Regions of Interest (ROIs) increased. We also highlight the performance of the approaches using the Automated Anatomical Labelling atlas with 116 ROIs (AAL116). Regarding the features used to train the classifiers, we found better results using the Pearson Correlation (PC) Fisher-transformed or other features in comparison to the use of the PC without modifications. Finally, our analysis showed AUC values between acceptable and excellent, but given the many limitations indicated in our study, further well-designed studies are warranted to extend the potential use of those classification algorithms to clinical settings.

2
  • IGOR DUARTE RODRIGUES
  • Identification of Brain Regions for ASD Severity Classification Using Machine Learning and fMRI

  • Advisor: GUILHERME SOUSA BASTOS
  • COMMITTEE MEMBERS:
  • GUILHERME SOUSA BASTOS
  • RAFAEL DE MAGALHAES DIAS FRINHANI
  • RICARDO ZORZETTO NICOLIELLO VENCIO
  • Date: 26-Feb-2021


  • Abstract
  • Autism Spectrum Disorder (ASD) is an age- and sex-related lifelong neurodevelopmental disorder characterized primarily by social impairments. Current ASD prevalence indicates that 1 in 59 children are diagnosed within the spectrum. The Autism Diagnostic Observation Schedule, Second Edition (ADOS-2) classifies ASD according to the disorder severity. ADOS-2 classifies as 'autism' cases that manifest more severe symptoms and as 'ASD non-autism' cases that exhibit milder symptoms. Many papers aimed to create algorithms to diagnose ASD through Machine Learning (ML) and functional Magnetic Resonance Images (fMRI). Such approaches evaluate the oxygen flow in the brain to classify the subjects as ASD or typical development. However, most of these works do not provide information regarding the disorder severity. This paper aims to use ML and fMRI to classify the disorder severity and to find brain regions potentially related to it. We used fMRI data of 202 subjects and their ADOS-2 scores available at the ABIDE consortium to determine the correct ASD sub-class for each one. Our results corroborate the initial hypothesis of functional differences within ASD, with some brain regions where the functional difference was enough to achieve a classification accuracy of 74%. This paper has limitations regarding the total number of samples. However, it shows a promising approach to ASD diagnosis.

3
  • VINICIUS DE ALMEIDA PAIVA
  • GASS-Metal: a web server for metal-binding site identification in proteins based on parallel genetic algorithms

  • Advisor: SANDRO CARVALHO IZIDORO
  • COMMITTEE MEMBERS:
  • VALDETE MARIA GONÇALVES DE ALMEIDA
  • GIOVANI BERNARDES VITOR
  • SABRINA DE AZEVEDO SILVEIRA
  • SANDRO CARVALHO IZIDORO
  • Date: 05-Mar-2021


  • Abstract
  • Metals are present in more than 30% of proteins found in nature and perform important biological functions; in addition, they act in the maintenance of protein structure. Metal ions in proteins are bound to groups of atoms, and this set is called a metal-binding site. Metal-binding sites can perform catalytic, structural, transport and electron transfer functions in a protein.

    Traditional and experimental techniques for metal-binding site prediction usually face obstacles related to time and cost of execution, making computational tools that can assist in prediction even more important. Several methods in the literature have made efforts to predict metal-binding sites and have shown great results, but they still encounter barriers related to protein size, types of ions and ligands, the ability to find inter-domain residues, and sometimes poor accuracy rates.

    The main goal of this master thesis is to adapt the GASS algorithm (Genetic Active Site Search), initially proposed for the prediction of catalytic sites, to search for metal-binding sites. The method developed, GASS-Metal, divides the residues of a protein in three-dimensional space and uses parallel genetic algorithms to find candidate sites that are close, in terms of distance, to curated templates from M-CSA and MetalPDB.

    The results of the sanity and homologous protein tests showed that GASS-Metal is a robust method, capable of finding metal-binding sites in different types of ions and does not restrict its search to a single chain. In addition, when using conservative mutations, the prediction accuracy rate improves even more, helping to find sites in situations where it was previously impossible, due to the lack of residues in certain proteins.

    In comparison to state-of-the-art predictors, GASS-Metal achieved satisfactory performance in predicting metal-binding sites of different ions. The results showed that the method was superior in the prediction of 5 of the 12 metal ions evaluated and obtained equivalent performance for another 6 metal-binding sites.


4
  • ANTONIO JOSIVALDO DANTAS FILHO
  • Digital Image Processing Algorithms for Tracking Power Transmission Lines

  • Advisor: ALEXANDRE CARLOS BRANDAO RAMOS
  • COMMITTEE MEMBERS:
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • CARLOS HENRIQUE DA SILVEIRA
  • ELCIO HIDEITI SHIGUEMORI
  • Date: 06-Dec-2021


  • Abstract
  • High-power transmission lines are of great importance for the functioning of all sectors of society. To guarantee the reliability and availability of the energy supply, regular and occasional inspections are carried out, always manually. These inspections seek to identify anomalies in the transmission network and signs of danger to structures. Unmanned aerial vehicles, popularly known as drones, have been increasingly employed in repetitive and tedious activities, due to their low cost and increased safety, and can be applied to tracking these structures. However, the variety of environments and towers, error in location measurements, and high cost are difficult obstacles to overcome for accurate and quality inspection. The main objective of this work is to study, develop and compare real-time image processing techniques to determine movements to be performed by the drone in trajectory correction, and thus perform a tracking of transmission lines with high precision and low cost. Several solutions are found in the literature, and they are located and systematically reviewed in this study. Regarding the hardware analysis, controller boards and frameworks were compared, as well as ground station and simulator software, with the main focus on image processing using edge detection, deep learning and reinforcement learning. Furthermore, in relation to the training of networks, one dataset was built for identifying lines, and another for actions to be performed. Also, in order to evaluate the solution's performance, a realistic virtual scenario based on a real route was produced. In short, a prototype system with two quadcopter drones, each equipped with only a camera and a computer, working in conjunction with a cooperative algorithm and deep-learning image processing, resulted in a brief tracking of a power transmission line, simulating an inspection in a real environment.

5
  • FLÁVIO BELIZÁRIO DA SILVA MOTA
  • Analysis of Personality Traits in Electronic Participation Environments

  • Advisor: MELISE MARIA VEIGA DE PAULA
  • COMMITTEE MEMBERS:
  • FLAVIA CRISTINA BERNARDINI
  • CARLOS HENRIQUE DA SILVEIRA
  • ISABELA NEVES DRUMMOND
  • MELISE MARIA VEIGA DE PAULA
  • Date: 13-Dec-2021


  • Abstract
  • Electronic Participation, or e-Participation, is defined as citizens’ participation in public administration decision making using Information and Communications Technologies.
    e-Participation is a multidisciplinary research field, with contributions from areas
    such as political science, sociology, administration, psychology, and economics, as well as
    contributions of a more technical nature, such as computer science. The field of psychology,
    specifically, allows the investigation of human personality and its motivations, providing
    structural representations of personality traits and making it possible to describe
    an individual’s intentions to participate in these environments. Thus, this work aims to
    study computational techniques for identifying and analyzing personality traits in Electronic
    Participation environments. To aim this, Machine Learning algorithms, specifically
    regression models, and the Five-Factor Model, also known as the Big Five, were used.
    Analyzing the results, it was found that the Random Forest model had the best performance,
    with a mean absolute error of 0.02619. In addition, in the context of the tool
    analyzed, the personality trait that stands out the most is Openness, accompanied by
    Conscientiousness and Agreeableness. Extraversion and Neuroticism traits appear with
    lower scores.
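
    As a minimal illustration of the regression setup described in this abstract (not the dissertation's pipeline), the sketch below trains a scikit-learn Random Forest regressor to predict one Big Five trait score from numeric user features and reports the mean absolute error; the synthetic data and feature semantics are assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import mean_absolute_error
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)

        # Synthetic stand-in for per-user features extracted from an
        # e-Participation platform (e.g., activity counts, text statistics).
        X = rng.normal(size=(500, 10))
        # Synthetic stand-in for a normalized Openness score in [0, 1].
        y = X[:, 0] * 0.3 + X[:, 1] * 0.1 + rng.normal(scale=0.05, size=500)
        y = (y - y.min()) / (y.max() - y.min())

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, random_state=42)

        model = RandomForestRegressor(n_estimators=200, random_state=42)
        model.fit(X_train, y_train)
        print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))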

6
  • DIEGO EUGÊNIO FERNANDES PEREIRA
  • OPEN EDUCATIONAL RESOURCE FOR THE STUDY OF ALGORITHMS AND PROGRAMMING LOGIC: AN APPROACH IN TECHNICAL EDUCATION INTEGRATED INTO HIGH SCHOOL

  • Líder : RODRIGO DUARTE SEABRA
  • MIEMBROS DE LA BANCA :
  • ANDRÉ PIMENTA FREIRE
  • MELISE MARIA VEIGA DE PAULA
  • RODRIGO DUARTE SEABRA
  • Data: 16-dic-2021


  • Resumen Espectáculo
  • The teaching and learning of programming has, in general, proven to be a challenge for students of computing and related courses, since it poses difficulties and requires complex skills, such as logical-mathematical reasoning, for its proper development. Furthermore, the traditional teaching model is often unable to motivate students and arouse their interest in the topic. There is, therefore, a need for the actors involved in education to review their way of teaching, integrating information and communication technologies into education in the same way they are present in our daily lives. In this sense, the use of an Open Educational Resource (OER) not only has the potential to contribute to the modernization of the current teaching model, but also to help improve the quality of education. The tool proposed herein, REA-LP, aims to facilitate the study and retention of content related to the programming logic discipline at the technical level by presenting its content through various types of media, such as audio, video, text, and static and dynamic images. The tool thus allows students to actively participate in the construction of their knowledge, favoring engagement and motivation, in addition to enabling the review of content considered essential for the proper development of the discipline. To assess the students' perception of use and its impact on the study of algorithms and programming logic, as well as to evaluate the resource pedagogically, an empirical study was carried out with 39 students of an informatics technical course integrated with secondary school. In the research, questionnaires were applied, and focus group interviews and a performance test were carried out. The results allow concluding that the tool was very well accepted, being effective in facilitating and assisting students in their learning, motivation, and interest in classes, mainly due to the way the content is presented in REA-LP, with emphasis on animated media and review modules, exceeding their expectations.

7
  • Gustavo Emmanuel Reis Guedes Alves
  • An exploratory study to analyze the impact of using the P-EPV framework in an electronic participation environment

  • Líder : MELISE MARIA VEIGA DE PAULA
  • MIEMBROS DE LA BANCA :
  • SERGIO ASSIS RODRIGUES
  • ADLER DINIZ DE SOUZA
  • MELISE MARIA VEIGA DE PAULA
  • Data: 17-dic-2021


  • Resumen Espectáculo
  • The dissemination of Information and Communication Technologies (ICTs) has enabled the development of initiatives related to social and political activities, including electronic participation. These initiatives help to solve society's daily problems, but a very relevant challenge remains regarding citizen engagement in these projects. In this study, an electronic participation social network called SoPa (Participatory Society) was investigated in the context of its use as an instrument of electronic participation in the construction of sectoral plans and master plans in some cities in the south of Minas Gerais. In this context, the objective of this study was to formulate and validate hypotheses for improvements that would increase user engagement in this social network. For this, digital marketing and growth hacking strategies were used, with the aid of a framework called P-EPV, which defines steps to be performed in order to organize the development and validation of these hypotheses. Based on the collected data, it was found that some of these digital marketing strategies, together with the application of the framework, can be useful in solving the challenges associated with engagement in this type of initiative. In addition to the validation of the P-EPV framework, the main contributions of this study are the analysis of the digital marketing strategies used and their results in this real context of electronic participation. These strategies proved effective in increasing engagement and can serve as a basis for other initiatives that face the same difficulties.

2020
Disertaciones
1
  • WICTOR SOUZA MARTINS
  • Optimization Model for Big Data Architectures

  • Líder : BRUNO TARDIOLE KUEHNE
  • MIEMBROS DE LA BANCA :
  • BRUNO TARDIOLE KUEHNE
  • EDMILSON MARMO MOREIRA
  • LOURENCO ALVES PEREIRA JUNIOR
  • Data: 14-feb-2020


  • Resumen Espectáculo
  • Big Data technology is increasingly present in industry and academic research by providing value through massive data analysis. Improvement of this technology is necessary due to the exponential growth of information produced by sources present in our daily lives, such as environmental sensors, cameras, smartphones and social networks. A big challenge for Big Data applications is balancing their performance against their operating costs. Thus, a Big Data application provider needs to continuously monitor the system to identify improvement points within its architecture. In this context, this work develops a reference method for performance evaluation of Big Data architectures, called Optimization Model for Big Data Architectures (OMBDA), aiming to improve performance and, consequently, raise the quality of the service provided. Its main contribution is directed at small companies and startups with limited financial resources, for which investing in ready-made market solutions is not feasible. The proposed model considers the relationship between processes within a data processing flow to find possible bottlenecks and optimization points. To this end, OMBDA collects system logs to compose functional metrics (e.g., processing time) and non-functional metrics (e.g., CPU and memory utilization, among other cloud computing infrastructure resources). These metrics are stored in an external data analysis tool that investigates the correlation of performance between processes. In this work, the model is applied to the architecture of a Big Data application that provides solutions in fleet logistics. Through the application of OMBDA it was possible to identify performance bottlenecks, enabling the reconfiguration of the architecture to increase the quality of service at the lowest possible cost.
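
    As a minimal sketch of the log-based correlation analysis described above (illustrative assumptions, not OMBDA's implementation), the snippet below loads per-process log records with pandas and correlates a functional metric (processing time) with non-functional metrics (CPU and memory utilization); the file name and column names are hypothetical.

        import pandas as pd

        # Hypothetical log export: one row per processed request/step, with
        # columns process, processing_ms, cpu_pct, mem_mb.
        logs = pd.read_csv("pipeline_logs.csv")

        # Correlate the functional metric with resource usage per process
        # to highlight candidate bottlenecks.
        for name, group in logs.groupby("process"):
            corr = group[["processing_ms", "cpu_pct", "mem_mb"]].corr()
            print(name)
            print(corr.loc["processing_ms", ["cpu_pct", "mem_mb"]])

        # Processes whose processing time dominates the end-to-end flow.
        print(logs.groupby("process")["processing_ms"].mean()
                  .sort_values(ascending=False))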

2
  • LUIZ GUSTAVO MIRANDA PINTO
  • Analysis and Deployment of an OCR - SSD Deep Learning Technique for Real-Time Active Car Tracking and Positioning on a Quadrotor

  • Líder : ALEXANDRE CARLOS BRANDAO RAMOS
  • MIEMBROS DE LA BANCA :
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • ELCIO HIDEITI SHIGUEMORI
  • ROBERTO AFFONSO DA COSTA JUNIOR
  • Data: 17-feb-2020


  • Resumen Espectáculo
  • This work presents a technique built with a deep learning algorithm for real-time object detection, the Single-Shot MultiBox Detector (SSD), used as the source of object detections in images, in combination with an open Optical Character Recognition (OCR) library for Automatic License Plate Recognition (ALPR), OpenALPR, for real-time vehicle license plate recognition, and MAVSDK, a MAVLink protocol SDK, which assists in determining the real-time positioning and tracking of an F450 quadrotor during software-assisted autonomous flight. The whole algorithm was implemented in the Python programming language as a combination of the OpenCV library for video and image processing, remote PX4 control with MAVSDK (the drone flight stack), OpenALPR for plate recognition, and a custom SSD deep learning model trained using the Caffe and TensorFlow frameworks. A mini FPV camera was used as the real-time streaming source for the SSD and OpenALPR, processed by OpenCV and coordinated by MAVSDK, which made it possible to achieve both position and tracking control. Simulated and outdoor experiments were conducted to collect positioning and tracking results for further analysis and refinement.
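
    For illustration only (not the dissertation's code), the sketch below runs a Caffe SSD model on a video frame through OpenCV's DNN module and keeps vehicle detections above a confidence threshold; the model file names, class index, threshold and the (1, 1, N, 7) output layout are assumptions, and each detected crop could then be handed to an ALPR library such as OpenALPR.

        import cv2
        import numpy as np

        # Hypothetical SSD model files (e.g., a MobileNet-SSD style detector).
        net = cv2.dnn.readNetFromCaffe("ssd_deploy.prototxt", "ssd_weights.caffemodel")
        CAR_CLASS_ID = 7        # assumed class index for "car"
        CONF_THRESHOLD = 0.5

        cap = cv2.VideoCapture("flight.mp4")    # placeholder video source
        ok, frame = cap.read()
        if ok:
            h, w = frame.shape[:2]
            # Typical SSD preprocessing: 300x300 blob with mean subtraction.
            blob = cv2.dnn.blobFromImage(frame, 0.007843, (300, 300), 127.5)
            net.setInput(blob)
            detections = net.forward()          # assumed shape: (1, 1, N, 7)

            for i in range(detections.shape[2]):
                conf = float(detections[0, 0, i, 2])
                cls = int(detections[0, 0, i, 1])
                if cls == CAR_CLASS_ID and conf >= CONF_THRESHOLD:
                    x1, y1, x2, y2 = (detections[0, 0, i, 3:7] *
                                      np.array([w, h, w, h])).astype(int)
                    car_crop = frame[max(y1, 0):y2, max(x1, 0):x2]
                    # car_crop could be passed to OpenALPR for plate reading.
                    print("car at", (x1, y1, x2, y2), "confidence", round(conf, 2))
        cap.release()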

3
  • PEDRO LUCAS DE BRITO
  • An Analysis of Use of Image Processing and Neural Networks for Window Crossing in an Autonomous Drone

  • Líder : ALEXANDRE CARLOS BRANDAO RAMOS
  • MIEMBROS DE LA BANCA :
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • ROBERTO CLAUDINO DA SILVA
  • ELCIO HIDEITI SHIGUEMORI
  • Data: 17-feb-2020


  • Resumen Espectáculo
  • This work investigates the implementation of an autonomous control system for drones capable of crossing windows during flights through indoor environments, using image processing and convolutional neural networks. The strategy adopted is object detection: from the object's location in the captured image it is possible to compute a programmable route for the drone. In this study, the object's location is established by bounding boxes, which define the quadrilateral around the detected object. The system was based on an open-source autopilot, Pixhawk, which offers a control and simulation environment capable of doing the job. Two detection techniques were studied. The first is based on image processing filters, which capture polygons that represent a passage inside a window. The other approach, intended for more realistic environments, was implemented with convolutional neural networks for object detection; with this type of network it is possible to detect a large number of windows. The system also involves the implementation of a control method that uses the data produced by the detectors to compute an appropriate movement speed. The study presents evaluations of the system and demonstrations of the tests performed for its validation.
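
    A minimal sketch, under assumed camera resolution and gains, of the kind of control step described above: converting a detected window's bounding box into lateral and vertical velocity corrections and a forward speed (not the dissertation's actual controller).

        # Proportional control from a bounding box (x, y, w, h) in pixels.
        IMG_W, IMG_H = 640, 480                 # assumed camera resolution
        K_LATERAL, K_VERTICAL = 0.004, 0.004    # assumed gains (m/s per pixel)
        MAX_FORWARD = 1.0                       # assumed max forward speed (m/s)

        def velocity_command(box):
            x, y, w, h = box
            cx, cy = x + w / 2.0, y + h / 2.0

            # Pixel offset of the window centre from the image centre.
            err_x = cx - IMG_W / 2.0
            err_y = cy - IMG_H / 2.0

            vy = K_LATERAL * err_x      # sideways correction (right is positive)
            vz = -K_VERTICAL * err_y    # climb if the window is above the centre

            # Slow down as the window fills the frame (drone getting close).
            fill = (w * h) / float(IMG_W * IMG_H)
            vx = MAX_FORWARD * max(0.0, 1.0 - 2.0 * fill)
            return vx, vy, vz

        print(velocity_command((200, 180, 160, 120)))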

4
  • PAULO RICARDO ZAMBELLI TAVEIRA
  • Non-Intrusive Load Identification by Random Forest with Fireworks Optimization

  • Líder : CARLOS HENRIQUE VALERIO DE MORAES
  • MIEMBROS DE LA BANCA :
  • CARLOS HENRIQUE VALERIO DE MORAES
  • ISABELA NEVES DRUMMOND
  • LUIZ EDUARDO DA SILVA
  • Data: 14-abr-2020


  • Resumen Espectáculo
  • Control of expenses related to electricity has grown considerably, especially in residential environments. Monitoring of the electrical loads that are turned on and off in a home is often performed using smart plugs, providing consumers with information about operation intervals and the power consumed by each device. Although this is a practical solution to control and reduce electricity costs, it is expensive due to the number of meters required. The high cost can be avoided with a non-intrusive load monitoring (NILM) approach, in which voltage and current measurements are taken at the home's service entrance, at the price of an extra processing step. In this extra step, it is necessary to calculate the powers, identify the occurrence of events and, finally, identify which piece of equipment was turned on or off. The contributions of this work are: the use of the power calculation standard proposed by the IEEE (1459-2010); the elaboration of a heuristic event detector that uses floating analysis windows to locate stability zones in the power signals after a power variation above a predetermined value is indicated; tests of the best way to arrange the event detector data in order to identify which load was added to or removed from the monitored circuit; and the optimization of the parameters of the Random Forest classifier using the fireworks optimization algorithm (FA). The proposed event detector and classifier were tested on the BLUED dataset, which contains data collected at a North American residence over a period of one week. For the classifier tests, four different forms of data entry were used, and the two forms that obtained the best performance were subsequently used in the classifier optimization process. The event detector results were compared with other publications that used different approaches and were satisfactory. The classification results were compared with one another, across the different data entry forms, and against an ideal classifier; an improvement was observed over a classifier with commonly used parameters, with the optimized classifier using a larger number of trees in each Random Forest but a limited depth for each tree. The importance of the variables involved in the classification process was also calculated.
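
    To make the event-detection idea concrete, here is a minimal sketch (an illustration under assumed window size and threshold, not the dissertation's detector) that slides two adjacent windows over an aggregate power signal and flags an event whenever their mean power differs by more than a threshold.

        import numpy as np

        def detect_events(power, window=30, threshold=60.0):
            """Return sample indices where the mean power changes by more than
            `threshold` watts between two adjacent analysis windows."""
            events = []
            for i in range(window, len(power) - window):
                before = power[i - window:i].mean()
                after = power[i:i + window].mean()
                if abs(after - before) > threshold:
                    # Keep only the first index of a contiguous run of flags.
                    if not events or i - events[-1] > window:
                        events.append(i)
            return events

        # Synthetic aggregate power: a 100 W appliance turns on near sample 300.
        rng = np.random.default_rng(0)
        signal = 200 + rng.normal(scale=5.0, size=1000)
        signal[300:] += 100
        print(detect_events(signal))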

5
  • JOSÉ WAGNER DE ANDRADE JÚNIOR
  • Retroactive Data Structures: Applications in dynamizing algorithms

  • Líder : RODRIGO DUARTE SEABRA
  • MIEMBROS DE LA BANCA :
  • MAYRON CESAR DE OLIVEIRA MOREIRA
  • PEDRO HENRIQUE DEL BIANCO HOKAMA
  • RODRIGO DUARTE SEABRA
  • Data: 09-jun-2020


  • Resumen Espectáculo
  • Retroactivity in programming is the study of modifications made to a data structure's timeline and of the effects such a modification exerts throughout its existence. In general, the analysis and implementation tend to be more costly computationally, because a modification made to these data structures in the past can generate a cascade effect through the entire timeline of the structure. The concept of retroactivity yields tools and structures that optimize solutions to these temporal problems. This type of data structure can be used in, for example, shortest path algorithms, security applications, and geometric problems. This dissertation presents the theoretical background of these data structures, detailed material about their implementation, and implementations of some problems to which retroactivity can be applied, for example the fully dynamic minimum spanning tree problem. For each data structure, we executed practical tests on the retroactive data structures and compared these solutions with other approaches. The tests showed that the retroactive implementations proposed by Demaine et al. (2007) obtained the best results from a temporal point of view. Two algorithms that use retroactivity concepts were proposed: the fully retroactive minimum spanning tree and the single-source dynamic shortest path problem in dynamic graphs. Let m be the size of the data structure's timeline, and V(G) and A(G) the sets of vertices and edges of graph G. We reached an amortized time complexity of O(√m · lg |V(G)|) per query/update operation in the fully retroactive minimum spanning tree algorithm. The algorithm to solve the single-source dynamic shortest path problem in dynamic graphs proposed by Sunita et al. [52] obtained a time complexity of O(|A(G)| · lg |V(G)|) per modification using a non-oblivious retroactive priority queue.
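
    As a small illustration of the retroactivity concept (a naive sketch, not one of the efficient structures studied in the dissertation), the class below keeps a FIFO queue's operation timeline and answers queries at any time t by replaying the operations; inserting an operation in the past automatically affects every later query.

        import bisect

        class RetroactiveQueue:
            """Naive partially retroactive FIFO queue: O(n) per query by replay."""

            def __init__(self):
                self.ops = []   # sorted list of (time, kind, value)

            def insert_enqueue(self, t, value):
                bisect.insort(self.ops, (t, "enq", value))

            def insert_dequeue(self, t):
                bisect.insort(self.ops, (t, "deq", None))

            def front(self, t):
                """Front of the queue as it would have been at time t."""
                items, head = [], 0
                for time, kind, value in self.ops:
                    if time > t:
                        break
                    if kind == "enq":
                        items.append(value)
                    elif head < len(items):
                        head += 1
                return items[head] if head < len(items) else None

        q = RetroactiveQueue()
        q.insert_enqueue(1, "a")
        q.insert_enqueue(5, "b")
        q.insert_dequeue(7)
        print(q.front(10))        # "b": "a" was dequeued at t=7
        q.insert_enqueue(0, "z")  # retroactive insertion before everything else
        print(q.front(10))        # "a": the past insertion changed the history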

6
  • Anderson Tadeu de Oliveira Vicente
  • Mapping, Conversion and Automatic Migration of Relational Databases to Graph Databases

  • Líder : EDMILSON MARMO MOREIRA
  • MIEMBROS DE LA BANCA :
  • ALEXANDRE DONIZETI ALVES
  • EDMILSON MARMO MOREIRA
  • ENZO SERAPHIM
  • VANESSA CRISTINA OLIVEIRA DE SOUZA
  • Data: 28-ago-2020


  • Resumen Espectáculo
  • Relational databases are the most used model in several applications due to the ease of their query language and their use in multi-user environments. With the great volume of information we have today, and since data items are increasingly related to one another, graph-oriented databases emerge as a way to deal with this new demand, given the difficulties the relational model faces in this new scenario. In view of this, this research dealt with the processes of mapping, conversion and migration from the relational model to the graph-oriented one, addressing, above all, the semantic overload of constructs between the two models. The aim of this study was to develop an application, called ThrusterDB, that performs this conversion from the relational model to the graph-oriented one automatically. The research contributes by integrating the phases of mapping, conversion and automatic migration from a relational database to a graph-oriented one. This dissertation presents results showing that the generated database provides better average query times, in addition to preserving the semantics of the source relational database, without any loss or redundancy of data.
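
    A minimal sketch of the general mapping idea (illustrative assumptions only, not ThrusterDB's algorithm): each row becomes a node labeled with its table name, and each foreign key becomes an edge between the corresponding nodes.

        # Toy relational data: two tables and one foreign key (customer_id).
        customers = [{"id": 1, "name": "Ana"}, {"id": 2, "name": "Bruno"}]
        orders = [{"id": 10, "customer_id": 1, "total": 99.9},
                  {"id": 11, "customer_id": 2, "total": 15.0}]
        tables = {"customers": customers, "orders": orders}
        foreign_keys = {("orders", "customer_id"): "customers"}

        nodes, edges = {}, []

        # Rows -> nodes (keyed by table name and primary key).
        for table, rows in tables.items():
            for row in rows:
                nodes[(table, row["id"])] = {"label": table, **row}

        # Foreign keys -> edges.
        for (table, column), target in foreign_keys.items():
            for row in tables[table]:
                edges.append(((table, row["id"]), (target, row[column]), column))

        print(len(nodes), "nodes,", len(edges), "edges")
        for src, dst, rel in edges:
            print(src, "-[{}]->".format(rel), dst)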

7
  • KAYQUE WILLY REIS DE OLIVEIRA
  • PARTICIPATING WITH GAMES: AN ANALYSIS OF GAMIFICATION AS INTERVENTION IN SURVEYS

  • Líder : MELISE MARIA VEIGA DE PAULA
  • MIEMBROS DE LA BANCA :
  • VANINHA VIEIRA DOS SANTOS
  • ADLER DINIZ DE SOUZA
  • MELISE MARIA VEIGA DE PAULA
  • Data: 06-nov-2020


  • Resumen Espectáculo
  • Surveys have, over the years, become a practical and common way of collecting data using software for the most diverse purposes. Participation in surveys can be considered a phenomenon to which collaborative systems can be applied, since these systems are designed to allow people to communicate and work together in a shared space. In this approach, the survey can be considered an instrument that operationalizes cooperation between a research requester and the respondent. However, participation in surveys, regardless of the modality, is greatly influenced by the lack of engagement of the participants. Gamification, defined as the use of game elements and characteristics in non-game contexts, has been considered an approach to increase engagement and encourage participation in surveys. Thus, the objective of this dissertation was to investigate how gamification can be used as a motivational factor to stimulate engagement in surveys. The methodology of this work was defined based on the epistemological paradigm of Design Science Research (DSR), which guided two research cycles in which two artifacts were developed: the Collaborative Engagement Model and the Opina Aí application. The artifacts were evaluated in two experiments in which two versions of the application, gamified and non-gamified, were compared. Although the results regarding the impact of gamification on engagement were not conclusive, this work can expand knowledge and support future studies in contexts where participation can be exercised collaboratively, and provides guidance on the use of gamification in surveys.

8
  • DIMITRIUS GUILHERME FERREIRA BORGES
  • Management of a signalized intersection using Reinforcement Learning and Options Framework

  • Líder : EDMILSON MARMO MOREIRA
  • MIEMBROS DE LA BANCA :
  • CELIA LEIKO OGAWA KAWABATA
  • EDMILSON MARMO MOREIRA
  • EDVARD MARTINS DE OLIVEIRA
  • JOAO PAULO REUS RODRIGUES LEITE
  • Data: 18-dic-2020


  • Resumen Espectáculo
  • The number of vehicles on the streets across the world has grown quickly in the last decade, directly impacting how urban traffic is managed. Signalized junction control is a widely known and studied problem; although an increasing number of technologies is explored and used to solve it, there are still challenges and opportunities in dealing with it, especially considering the inefficiency of the widely used fixed-time traffic controllers, which are incapable of dealing with dynamic events. This study aims to apply Hierarchical Reinforcement Learning (HRL) to the control of a signalized vehicular junction and to compare its performance with a fixed-time traffic controller configured using the Webster Method. HRL is a Reinforcement Learning (RL) variation in which secondary objectives, represented by sub-policies, are organized in a hierarchical model managed by a macro-policy responsible for selecting the sub-policy expected to achieve the best result, with Q-Learning governing both the sub-policies and the macro-policy. Hierarchical Reinforcement Learning was chosen because it combines the ability to learn and make decisions from real-time observations of the environment, typical of Reinforcement Learning, with a divide-and-conquer approach in which the problem is split into sub-problems. These capabilities bring greater adaptability to a highly dynamic problem, something that cannot be captured by deterministic models such as the Webster Method. The test scenarios, composed of several vehicle flows applied to a crossing of two lanes, were built using the SUMO simulation tool, and HRL, its sub-policies and the Webster Method were applied and assessed in these scenarios. According to the results obtained, HRL outperforms both the Webster Method and its isolated sub-policies, indicating a simple and efficient alternative.
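
    A minimal tabular Q-learning sketch for traffic-signal phase selection (the states, actions, rewards and toy environment are assumptions for illustration, not the dissertation's HRL agent): the state is a coarse bucket of queue lengths, the action is the approach to serve, and the reward is the negative total queue.

        import random
        from collections import defaultdict

        ACTIONS = [0, 1]        # 0: green for north-south, 1: green for east-west
        ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1
        Q = defaultdict(float)  # Q[(state, action)] -> value

        def choose_action(state):
            if random.random() < EPS:
                return random.choice(ACTIONS)
            return max(ACTIONS, key=lambda a: Q[(state, a)])

        def update(state, action, reward, next_state):
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

        def simulate_step(queues, action):
            """Toy junction: the served approach discharges cars, both receive arrivals."""
            ns, ew = queues
            ns = max(0, ns - (4 if action == 0 else 0)) + random.randint(0, 3)
            ew = max(0, ew - (4 if action == 1 else 0)) + random.randint(0, 3)
            return (ns, ew), -(ns + ew)       # reward: negative total queue

        queues, state = (0, 0), (0, 0)
        for _ in range(5000):
            action = choose_action(state)
            queues, reward = simulate_step(queues, action)
            next_state = (min(queues[0] // 3, 5), min(queues[1] // 3, 5))  # bucketed
            update(state, action, reward, next_state)
            state = next_state

        print("Q-values in state (2, 2):", Q[((2, 2), 0)], Q[((2, 2), 1)])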

2019
Disertaciones
1
  • LUIS HENRIQUE MEAZZINI SEPULVENE
  • Application of Machine Learning Techniques for Fault Diagnosis in Vehicle Fleet Tracking Modules

  • Líder : BRUNO GUAZZELLI BATISTA
  • MIEMBROS DE LA BANCA :
  • BRUNO GUAZZELLI BATISTA
  • ISABELA NEVES DRUMMOND
  • JOAO PAULO REUS RODRIGUES LEITE
  • WESLEY NUNES GONCALVES
  • Data: 23-abr-2019


  • Resumen Espectáculo
  • With Industry 4.0, data-based approaches are in vogue. However, extracting the essential features is not an easy task and greatly influences the final result. There is also a need for specialized system knowledge to monitor the environment and diagnose faults. In this context, fault diagnosis is significant, for example, in a vehicle fleet monitoring system, since it allows faults to be diagnosed even before the customer is aware of them, in addition to minimizing the maintenance costs of the modules. In this work, several models using Machine Learning (ML) techniques were applied and analyzed for the fault diagnosis process in vehicle fleet tracking modules. This research proposes two approaches, with and without domain knowledge, to explore the dataset using ML techniques and generate classifiers that can assist in the fault diagnosis process in vehicle fleet tracking modules. The approach with knowledge performs feature extraction manually and uses the ML techniques Random Forest, Naive Bayes, SVM and MLP; the approach without knowledge performs automatic feature extraction through a Convolutional Neural Network (CNN). The results showed that the proposed approaches are promising. On the provided dataset, the best models with manual feature extraction obtained a precision of 99.76% for fault detection and 99.68% for fault detection and identification. The best models with automatic feature extraction obtained 88.43% and 54.98%, respectively, for fault detection and for fault detection and identification. These models can serve as prototypes to diagnose faults remotely and confirm that traditional ML techniques with manual feature extraction are still effective resources for fault diagnosis.
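
    As a minimal sketch of the approach with manual feature extraction (the synthetic data and feature semantics are assumptions, not the dissertation's dataset), the code below trains a scikit-learn Random Forest classifier on hand-crafted features and reports the precision for fault detection.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import precision_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)

        # Synthetic stand-in for manually extracted features of tracker modules
        # (e.g., battery voltage statistics, GPS fix rate, message interval).
        X = rng.normal(size=(2000, 6))
        # Synthetic labels: 1 = faulty module, 0 = healthy module.
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=2000) > 1.0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, stratify=y, random_state=7)

        clf = RandomForestClassifier(n_estimators=300, random_state=7)
        clf.fit(X_train, y_train)
        print("precision:", precision_score(y_test, clf.predict(X_test)))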

2
  • PAULO VICENTE GOMES DOS SANTOS
  • RSE: a Framework for Performance Evaluation of Recommender Systems

  • Líder : BRUNO TARDIOLE KUEHNE
  • MIEMBROS DE LA BANCA :
  • BRUNO TARDIOLE KUEHNE
  • EDVARD MARTINS DE OLIVEIRA
  • LUIS HENRIQUE NUNES
  • Data: 02-may-2019


  • Resumen Espectáculo
  • Recommender systems are filters that suggest products of interest to customers, which can have a positive impact on sales. Nowadays there is a multitude of algorithms, and choosing the most suitable option for a given situation is important, but not a trivial task. In this context, we propose the Recommender Systems Evaluator (RSE): a framework aimed at performing offline performance evaluation of recommender systems. We argue that the use of a proper methodology is crucial when evaluating, yet it is frequently overlooked, leading to inconsistent results. RSE hides the complexity involved in the evaluation and is based on statistical concepts to provide reliable conclusions. The studies conducted proved its effectiveness, demonstrating that it can be adapted for use in contexts other than recommender systems.
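
    A minimal sketch of the offline evaluation protocol described above (illustrative assumptions, not the RSE framework itself): hold out part of a user-item rating set, predict the held-out ratings with a simple user-mean baseline, and report the RMSE.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic ratings: columns are user_id, item_id, rating in 1..5.
        ratings = np.column_stack([rng.integers(0, 50, 2000),
                                   rng.integers(0, 200, 2000),
                                   rng.integers(1, 6, 2000)]).astype(float)

        # Offline protocol: random 80/20 split into train and test sets.
        rng.shuffle(ratings)
        split = int(0.8 * len(ratings))
        train, test = ratings[:split], ratings[split:]

        # Baseline recommender: predict each user's mean training rating.
        global_mean = train[:, 2].mean()
        per_user = {}
        for user, _, r in train:
            per_user.setdefault(user, []).append(r)
        user_means = {u: np.mean(rs) for u, rs in per_user.items()}

        pred = np.array([user_means.get(u, global_mean) for u, _, _ in test])
        rmse = np.sqrt(np.mean((pred - test[:, 2]) ** 2))
        print("RMSE of the user-mean baseline:", round(rmse, 3))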

3
  • CHRISTOPHER DE SOUZA LIMA FRANCISCO
  • Extension of the Earned Value Management technique using quality data in software projects

  • Líder : ADLER DINIZ DE SOUZA
  • MIEMBROS DE LA BANCA :
  • ADLER DINIZ DE SOUZA
  • CARLOS EDUARDO SANCHES DA SILVA
  • LEONARDO AUGUSTO DOS SANTOS OLIVEIRA
  • RAFAEL DE MAGALHAES DIAS FRINHANI
  • Data: 03-may-2019


  • Resumen Espectáculo
  • This work presents an extension of the Earned Value Management (EVM) technique. The proposed technique integrates quality data, based on quality requirements, to update the traditional EVM indexes. The main objective of this proposal is to improve the predictability of the Cost Performance Index (CPI) and the Schedule Performance Index (SPI) and to introduce quality measures into the EVM technique. The proposed technique was evaluated in accordance with an evidence-based methodology. The studies conducted showed that the proposed extension is more accurate than the traditional EVM technique.
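
    To make the indexes concrete, here is a minimal sketch of how CPI and SPI are computed from earned value, actual cost and planned value, together with a purely hypothetical quality-weighted earned value that illustrates the kind of extension the abstract describes (the dissertation's actual formula is not reproduced here).

        def evm_indexes(earned_value, actual_cost, planned_value):
            """Classic EVM indexes: CPI = EV / AC, SPI = EV / PV."""
            return earned_value / actual_cost, earned_value / planned_value

        def quality_adjusted_ev(earned_value, quality_conformance):
            """Hypothetical adjustment: scale EV by the fraction of quality
            requirements met (0..1); not the dissertation's exact formula."""
            return earned_value * quality_conformance

        EV, AC, PV = 120_000.0, 140_000.0, 130_000.0
        cpi, spi = evm_indexes(EV, AC, PV)
        print("CPI = %.2f, SPI = %.2f" % (cpi, spi))   # < 1: over budget / behind schedule

        EV_q = quality_adjusted_ev(EV, quality_conformance=0.85)
        cpi_q, spi_q = evm_indexes(EV_q, AC, PV)
        print("Quality-adjusted: CPI = %.2f, SPI = %.2f" % (cpi_q, spi_q))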

4
  • RAFAEL DE MOURA MOREIRA
  • Heartbeat Classification Using a Low-Cost Algorithm

  • Líder : ROBSON LUIZ MORENO
  • MIEMBROS DE LA BANCA :
  • EVALDO RENÓ FARIA CINTRA
  • ROBSON LUIZ MORENO
  • TALES CLEBER PIMENTA
  • Data: 26-jul-2019


  • Resumen Espectáculo
  • The electrocardiogram is a powerful tool for heart disease diagnosis. Several conditions manifest themselves as artifacts in the waveform of the heart's electrical signal. The development of user-friendly portable devices able to analyze an electrocardiogram signal automatically could allow patients to monitor their own condition at home and allow large-scale examinations in low-income communities without doctors or well-equipped hospitals. A full system combining different algorithms is proposed to perform online heartbeat classification using dedicated hardware with limited resources. Some of the techniques used, such as the Pan-Tompkins QRS detection algorithm, have been extensively tested and used in different heartbeat classification systems, while others, such as dynamic segmentation and Hjorth parameters, have previously been shown to work for offline classification while using few computational resources. The proposed model tests how the different techniques integrate and work with no previous information about the signal, verifies their accuracy using the MIT-BIH Arrhythmia dataset, and checks the execution time. Although the system had good accuracy and was able to perform online classification on a conventional laptop, on a microcontroller it exceeded the execution time required for online classification.
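
    For reference, the Hjorth parameters mentioned above can be computed from a signal's variance and the variances of its first and second differences; the sketch below follows the standard definitions (activity, mobility, complexity) on a synthetic segment, not the dissertation's implementation.

        import numpy as np

        def hjorth_parameters(x):
            """Standard Hjorth descriptors of a 1-D signal segment."""
            dx = np.diff(x)
            ddx = np.diff(dx)
            activity = np.var(x)
            mobility = np.sqrt(np.var(dx) / np.var(x))
            complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
            return activity, mobility, complexity

        # Synthetic segment standing in for one windowed portion of an ECG.
        t = np.linspace(0, 1, 360)   # about 1 s at 360 Hz (the MIT-BIH rate)
        segment = np.sin(2 * np.pi * 7 * t) \
                  + 0.05 * np.random.default_rng(1).normal(size=t.size)

        print(hjorth_parameters(segment))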

5
  • MAURICIO XAVIER ZAPAROLI
  • SmartLock: Access Control through Smart Contracts and Smart Property

  • Líder : ADLER DINIZ DE SOUZA
  • MIEMBROS DE LA BANCA :
  • ADLER DINIZ DE SOUZA
  • CARLOS EDUARDO DE ANDRADE
  • RAFAEL DE MAGALHAES DIAS FRINHANI
  • Data: 06-ago-2019


  • Resumen Espectáculo
  • Blockchain is the technology that allows money to be transacted between parties in a peer-to-peer manner, without the need for a trusted intermediary such as a bank. This technology is known for attractive characteristics such as data integrity and security. To take advantage of such characteristics, some blockchain networks are not focused on their cryptocurrency. One of those networks is Ethereum, a platform for smart contracts. Smart contracts are applications that represent an agreement between parties in a decentralized manner; these programs are deployed on the system, can be accessed globally, and are executed without the possibility of censorship, fraud, or third-party intervention. One of their possible applications is the smart property concept, i.e., transacting property on the blockchain, which can be applied to the property rental context. Given the growing use of services such as AirBnB, due to their financial attractiveness to users, there is a need to develop a system capable of solving the related security issues while providing comfort. This work presents the development of a project that studies the viability of applying the smart property concept. A reservation platform was developed in which the user makes a reservation and, once it is approved, a smart contract is deployed to the Ethereum platform. To access the property, the user uses an Android application that transfers the credentials using a data-over-sound protocol. The lock verifies these credentials by accessing the smart contract and activates the lock's circuit, allowing access to the property. To examine the viability of the project, tests were performed with prospective users of the system, who also answered a survey. After the analysis of this survey, the project proved to be viable, more attractive than the traditional standard, functional, and comfortable to use.
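
    As a minimal sketch of how a lock controller could query an Ethereum smart contract with web3.py (the node URL, contract address, ABI fragment and function name are all hypothetical; this is not the dissertation's code), the lock would only actuate when the contract confirms the presented credential.

        from web3 import Web3

        # Hypothetical Ethereum node and reservation contract.
        w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))
        CONTRACT_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder
        ABI = [{  # hypothetical ABI fragment for a credential check
            "name": "isAuthorized",
            "type": "function",
            "stateMutability": "view",
            "inputs": [{"name": "guest", "type": "address"}],
            "outputs": [{"name": "", "type": "bool"}],
        }]
        contract = w3.eth.contract(address=CONTRACT_ADDRESS, abi=ABI)

        def open_lock_if_authorized(guest_address):
            # Read-only call: no transaction is sent, so no gas is spent.
            if contract.functions.isAuthorized(guest_address).call():
                print("credential accepted: activating the lock circuit")
            else:
                print("credential rejected")

        open_lock_if_authorized("0x0000000000000000000000000000000000000001")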

6
  • CARLOS MINORU TAMAKI
  • Study of Multifunctional Sensors for Analysis of Fragility and its Components in the Elderly

  • Líder : ALEXANDRE CARLOS BRANDAO RAMOS
  • MIEMBROS DE LA BANCA :
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • HILDEBRANDO FERREIRA DE CASTRO FILHO
  • ROBERTO AFFONSO DA COSTA JUNIOR
  • ROBERTO CLAUDINO DA SILVA
  • Data: 06-dic-2019


  • Resumen Espectáculo
  • Currently, the Brazilian age pyramid has shown a large increase in the elderly population. Consequently, the development of methods, techniques and tools to assess frailty in this population is increasingly important in the national scenario. Frailty is a syndrome characterized by reduced physical and cognitive reserves, making elderly people more vulnerable to adverse events, hospitalizations, falls, loss of independence and death. For this syndrome to be evaluated, inertial sensors, accelerometers and algorithms are needed: sensors are used to quantify time in the Timed Up and Go (TUG) test, accelerometers are used during balance evaluation, and algorithms detect frail, non-frail and robust elderly people. In this context, in partnership with researchers from UNIFEI and UNIVAS, an electronic device composed of high-quality, multifunctional and low-cost sensors was developed and tested. Through the combination of a 3-axis gyroscope, a 3-axis accelerometer, a thermometer, an oximeter and a heart rate meter, it analyzes the quality of movement, energy expenditure, gait speed, changes in balance, heart rate variability during movement and the quality of contraction of the pentriceps muscle. Data are analyzed in software developed specifically for the prototype. The validation of the sensors causes no damage or harm to the participants' health. The developed system has proven to be an effective tool for frailty syndrome analysis and continues to be successfully tested with elderly volunteers.

7
  • LUCIANO DO VALE RIBEIRO
  • Study of Computer Vision Algorithms for Identification and Tracking of Power Transmission Lines with Multirotors

  • Líder : ALEXANDRE CARLOS BRANDAO RAMOS
  • MIEMBROS DE LA BANCA :
  • ALEXANDRE CARLOS BRANDAO RAMOS
  • HILDEBRANDO FERREIRA DE CASTRO FILHO
  • ROBERTO AFFONSO DA COSTA JUNIOR
  • ROBERTO CLAUDINO DA SILVA
  • Data: 06-dic-2019


  • Resumen Espectáculo
  • The Brazilian electricity transmission network requires regular preventive inspection and maintenance to ensure the supply of energy to consumers. Transmission companies carry out regular inspections to identify anomalies in the transmission network, such as cable defects, insulator cracks and structures close to the network, such as trees, among others. These inspections can be performed by expert personnel using binoculars or by manned helicopters carrying a set of sensors, such as infrared cameras to detect hot spots. Interest in using unmanned aerial vehicles (UAVs) to conduct transmission line inspections has grown in recent years due to the lower cost and increased safety compared with manned helicopter inspections. With this technology, the UAV can be controlled via radio control by a pilot or semi-automatically, with a pre-programmed mission loaded into the aircraft's onboard autopilot. One of the major difficulties in performing autonomous flights is the dependence on GPS (Global Positioning System) to obtain the UAV's position, since errors in GPS position measurements can affect the progress of the mission. This project aims to develop a real-time image processing technique to identify and track transmission lines and thus determine their position relative to the UAV. Even without knowing the exact positions of the UAV and the transmission lines, knowing the relative position between them makes it possible to correct the UAV's trajectory through commands sent to the aircraft's autopilot. The project was developed under the ROS (Robot Operating System) framework and the image processing was performed using the OpenCV library. To evaluate the performance of the solution, a virtual scenario was created in the Gazebo robot simulator, which made it possible to process the images generated by a multirotor during an inspection mission.
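
    A minimal sketch of the ROS side of such a correction loop (node name, topics and gain are assumptions, and the line-offset computation is only stubbed; this is not the dissertation's code): the node subscribes to the camera image, estimates the lateral offset of the detected line from the image centre, and publishes a velocity correction.

        #!/usr/bin/env python
        import rospy
        from sensor_msgs.msg import Image
        from geometry_msgs.msg import Twist
        from cv_bridge import CvBridge

        bridge = CvBridge()
        K_LATERAL = 0.002   # assumed proportional gain (m/s per pixel of offset)

        def estimate_line_offset(frame):
            """Stub: horizontal offset (pixels) of the detected transmission line
            from the image centre, e.g., from a Canny + Hough pipeline."""
            return 0.0

        def image_callback(msg):
            frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
            offset = estimate_line_offset(frame)
            cmd = Twist()
            cmd.linear.x = 0.5                    # constant speed along the line
            cmd.linear.y = -K_LATERAL * offset    # lateral correction toward the line
            cmd_pub.publish(cmd)

        if __name__ == "__main__":
            rospy.init_node("line_tracking_correction")
            cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)       # assumed topic
            rospy.Subscriber("/camera/image_raw", Image, image_callback)     # assumed topic
            rospy.spin()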
