First stab at AppDaemon

I believe this should do it?

http://www.dishanywhere.com/guide

I’m not sure if I need to be logged on first or not for it to appear.

I’ll keep searching for a local guide listing

Without logging on I can't see anything there, so you need something else :wink:

https://www.mydish.com/guide

I'm not sure there is a way without giving you my info, because different regions receive different programming. I'm looking for Cleveland Browns games.

[screenshot of the mydish.com guide page]

this is what I see if I open the page

That's info I can see too, so that's something I can work with.
To make sure that I know what I'm seeing:

  1. on the left are the channels?
  2. on top is the time and
  3. in the center are the programs.

But all I see when I use the URL are channels 71 to 100.
Those could be turned into sensor.channel71 through sensor.channel100, where the state would be the current program, with an attribute for the next program.

But that's about all I can do with that URL.
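
To give you an idea, here is a minimal sketch of what such an app could look like. Keep in mind that the CSS class names below ("channel-row", "channel-number", "program") are pure guesses; you would have to look up the real page structure in the browser console first:

import appdaemon.plugins.hass.hassapi as hass
import requests
from bs4 import BeautifulSoup

class guide(hass.Hass):

  def initialize(self):
    self.get_values({})

  def get_values(self, kwargs):
    try:
      page = requests.get("https://www.mydish.com/guide", timeout=10).content
    except requests.RequestException:
      self.log("couldn't read the guide page")
      return
    soup = BeautifulSoup(page, "html.parser")
    # hypothetical selectors: replace these class names with the
    # real ones you find in the page source
    for row in soup.find_all("div", class_="channel-row"):
      number = row.find("span", class_="channel-number").get_text(strip=True)
      program = row.find("div", class_="program").get_text(strip=True)
      self.set_state("sensor.channel" + number, state=program,
                     attributes={"friendly_name": "Channel " + number})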

It could be that if you copy a cookie you could stay logged in, or that if you pass a password in a certain way with the request you could see more, but the problem of stepping through the program list with requests remains.
You can only scrape what you see on the page when you use the URL.
Mimicking JS commands to view other parts is another story (which is outside my knowledge for the moment).
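
If you want to experiment with the cookie idea, here is a rough sketch. The cookie name is made up; copy the real name and value from your browser's developer tools after logging in:

import requests

# hypothetical cookie: copy the real name and value from the browser dev tools
cookies = {"session_id": "paste-your-cookie-value-here"}
response = requests.get("https://www.mydish.com/guide", cookies=cookies, timeout=10)
print(response.status_code)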

What I can offer is to help you in chat on Discord, to learn how to scrape yourself.
In that case you could maybe find the programs from your favourite channels on their own websites and turn those into sensors.

I created an app that can be used, @Corey_Maxim.
I even commented it to the extreme, so you can see what it all does.

###########################################################################################
#                                                                                         #
#  Rene Tode ( [email protected] )                                                            #
#                                                                                         #
#  2018/10/07 Germany                                                                     #
#                                                                                         #
#                                                                                         #
#  an app that creates a sensor out of data collected from                                #
#  https://www.clevelandbrowns.com/schedule/                                              #
#                                                                                         #
###########################################################################################

import appdaemon.plugins.hass.hassapi as hass
import datetime

import requests
from bs4 import BeautifulSoup

class browns(hass.Hass):

  def initialize(self):
    #################################################################
    # when initialising, the sensor needs to be created, but we     #
    # need to run the same code again later to get the next values. #
    # that's why we only start the callback from here               #
    #################################################################
    self.get_values({})

  def get_values(self,kwargs):
    #################################################################
    # first we set some values, this could be done in the yaml      #
    # but this app is specialized and will only work for this       #
    # webpage, so why bother                                        #
    #################################################################
    self.url = "https://www.clevelandbrowns.com/schedule/"
    self.sensorname = "sensor.browns"
    self.friendly_name = "Next game from Cleveland Browns"
    next_game_time = None
    #################################################################
    # now we read the webpage                                       #
    #################################################################
    try:
      response = requests.get(self.url, timeout=10)
    except requests.RequestException:
      self.log("couldn't read the Browns schedule page")
      return
    page = response.content
    #################################################################
    # now that we got the webpage we make the data readable         #
    #################################################################
    soup = BeautifulSoup(page, "html.parser")
    #################################################################
    # in the google chrome console we go down the tree from body.   #
    # every time an indentation is visible we add the next element, #
    # until we see main, which contains a lot of section elements.  #
    # nextSibling moves us to the next element on the same level,   #
    # until we reach the table containing the schedule cards.       #
    # invisible empty text siblings mean we need more nextSibling   #
    # steps than the number of sections                             #
    #################################################################
    cards_table = soup.body.div.main.section.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling
    #################################################################
    # to see if we got the right data we log it. uncomment when     #
    # you suspect that the webpage has changed                      #
    #self.log(cards_table)                                          #
    #################################################################
    #################################################################
    # now we find the first card inside the table                   #
    #################################################################
    first_card = cards_table.div.div
    #################################################################
    # the first card is the title card containing "regular season"  #
    # now we are going to loop over the cards following that one    #
    #################################################################
    for schedule_card in first_card.find_next_siblings():
        #############################################################
        # lets find the date we want out of the card                #
        #############################################################
        try:
            game_start = schedule_card.div["data-gametime"]
        except (KeyError, TypeError):
            #########################################################
            # there is no date found in this card (probably an ad)  #
            #########################################################
            game_start = ""
        #############################################################
        # if we find a date, then we need to translate the date to  #
        # a time we can compare. in this case we find a date like   #
        # 2018-09-09T17:00:00Z which is %Y-%m-%dT%H:%M:%SZ          #
        # (the python datetime lib docs tell us that)               #
        #############################################################
        if game_start != "":
            game_time = datetime.datetime.strptime(game_start,"%Y-%m-%dT%H:%M:%SZ")
            #########################################################
            # find out if this date is in the future. the Z suffix  #
            # means the time is UTC, so we compare against UTC now  #
            #########################################################
            if game_time > datetime.datetime.utcnow():
                #####################################################
                # check if we didn't find one before; if not set it#
                #####################################################
                if next_game_time is None:
                    next_game_time = game_time
                    #################################################
                    # now that we know that this is the next game   #
                    # let's also look up the opponent in the card   #
                    # it will make a nice attribute for the sensor. #
                    # to remove surrounding whitespace we use strip #
                    # again we can find that by looking at the      #
                    # google chrome console                         #
                    #################################################
                    opponent = schedule_card.div.div.nextSibling.nextSibling.p.nextSibling.nextSibling.string.strip()
    #################################################################
    # if the season is over there is no future game on the page.    #
    # in that case we log it and try again tomorrow, instead of     #
    # crashing on the None value                                    #
    #################################################################
    if next_game_time is None:
        self.log("no upcoming game found, trying again tomorrow")
        self.run_in(self.get_values, 24 * 3600)
        return
    #################################################################
    # now we got all data we need, but the date isn't what we need. #
    # we translate it again to the time format we want to see       #
    # for the HA sensor                                             #
    #################################################################
    next_game_str = next_game_time.strftime("%Y/%m/%d %H:%M:%S")
    #################################################################
    # now we got all info we need and we can create a sensor.       #
    # the first time that the code is run it will create a warning  #
    # that the sensor doesn't exist. if we see that in the log we   #
    # know that the sensor is created.                              #
    #################################################################
    self.set_state(self.sensorname, state=next_game_str, attributes={"friendly_name": self.friendly_name, "Opponent": opponent})
    #################################################################
    # now all we need to do is make sure that the sensor stays up   #
    # to date. we could check the webpage every minute, but that    #
    # would be unnecessary traffic. we don't know exactly when the  #
    # webpage is updated, so we use a short time after the game,    #
    # but we don't want it to be too long. note that the game time  #
    # is UTC, so the update moment can be off by your UTC offset;   #
    # the extra hours absorb that. if the sensor isn't up to date,  #
    # just check the page, restart the app and/or change the extra  #
    # time we add here                                              #
    #################################################################
    update_time = next_game_time + datetime.timedelta(hours=4)
    #################################################################
    # so we got a time that we want to update the sensor. so we run #
    # this code again at that time.                                 #
    #################################################################
    self.run_at(self.get_values, update_time)

All you need to do is save this app as browns.py and
save this yaml as browns.yaml:

browns:
  module: browns
  class: browns

Now the app creates a sensor with the next game.
It is possible that HA needs to get next_game_time instead of next_game_str to be able to use the sensor for time-based automations.
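
If HA doesn't accept the state as a time, one thing you could try (just a guess, untested) is publishing it in ISO 8601 format instead:

    # untested alternative: ISO 8601 instead of "%Y/%m/%d %H:%M:%S"
    next_game_str = next_game_time.isoformat()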

It's possible to create an app that listens to the sensor and uses a run_at function to start a script.
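
For example, a rough sketch of what that could look like; the script name script.game_time is made up, replace it with one from your own setup:

import appdaemon.plugins.hass.hassapi as hass
import datetime

class browns_watcher(hass.Hass):

  def initialize(self):
    # re-schedule every time the browns sensor gets a new game time
    self.listen_state(self.schedule_script, "sensor.browns")

  def schedule_script(self, entity, attribute, old, new, kwargs):
    # the state uses the "%Y/%m/%d %H:%M:%S" format set by the app above;
    # depending on how you store the time this may need timezone handling
    game_time = datetime.datetime.strptime(new, "%Y/%m/%d %H:%M:%S")
    if game_time > datetime.datetime.now():
      self.run_at(self.start_script, game_time)

  def start_script(self, kwargs):
    # hypothetical script name, replace with your own
    self.turn_on("script.game_time")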

the sensor it creates looks like this:
[screenshot of the browns sensor state card]


Thank You so much for doing this! You are awesome!! I wish there was a way to pay you back! Let me know if there is anything I can ever do to help you out!!!

Seriously, thank you!

I’ll get it installed and let you know how it went!!

the way you react is payment enough for me.


I know this is a noob question, but where exactly do I place the browns.py and the browns.yaml file?

I set up the AppDaemon add-on in HassOS.

Also, will the sensors show up in my main UI? Or do I need to set up my automation in AppDaemon?
Also, will there be a sensor to tell me what network the game will be on, i.e. NBC, CBS, ABC, FOX, etc.?
I'm sorry to ask, I'm just trying to learn.

Thanks!

The files go in your AppDaemon config dir, in the subdir apps,
and yeah, the sensor will show up in your Home Assistant frontend.

It won't tell you which network, because that's not on the site.

You can create an automation in HA or AppDaemon, just as you like.
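
With the Hass.io add-on that would look something like this (exact paths can differ per install):

/config/appdaemon/
    appdaemon.yaml
    apps/
        browns.py
        browns.yaml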

Oops, I overlooked several times that the channel is shown.
I'll add an attribute for the channel later on.

OK, I added the channel as an attribute:
[screenshot of the browns sensor state card, now with the Channel attribute]

###########################################################################################
#                                                                                         #
#  Rene Tode ( [email protected] )                                                            #
#                                                                                         #
#  2018/10/07 Germany                                                                     #
#                                                                                         #
#                                                                                         #
#  an app that creates a sensor out of data collected from                                #
#  https://www.clevelandbrowns.com/schedule/                                              #
#                                                                                         #
###########################################################################################

import appdaemon.plugins.hass.hassapi as hass
import datetime

import requests
from bs4 import BeautifulSoup

class browns(hass.Hass):

  def initialize(self):
    #################################################################
    # when initialising, the sensor needs to be created, but we     #
    # need to run the same code again later to get the next values. #
    # that's why we only start the callback from here               #
    #################################################################
    self.get_values({})

  def get_values(self,kwargs):
    #################################################################
    # first we set some values, this could be done in the yaml      #
    # but this app is specialized and will only work for this       #
    # webpage, so why bother                                        #
    #################################################################
    self.url = "https://www.clevelandbrowns.com/schedule/"
    self.sensorname = "sensor.browns"
    self.friendly_name = "Next game from Cleveland Browns"
    next_game_time = None
    #################################################################
    # now we read the webpage                                       #
    #################################################################
    try:
      response = requests.get(self.url, timeout=10)
    except requests.RequestException:
      self.log("couldn't read the Browns schedule page")
      return
    page = response.content
    #################################################################
    # now that we got the webpage we make the data readable         #
    #################################################################
    soup = BeautifulSoup(page, "html.parser")
    #################################################################
    # in the google chrome console we go down the tree from body.   #
    # every time an indentation is visible we add the next element, #
    # until we see main, which contains a lot of section elements.  #
    # nextSibling moves us to the next element on the same level,   #
    # until we reach the table containing the schedule cards.       #
    # invisible empty text siblings mean we need more nextSibling   #
    # steps than the number of sections                             #
    #################################################################
    cards_table = soup.body.div.main.section.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling.nextSibling
    #################################################################
    # to see if we got the right data we log it. uncomment when     #
    # you suspect that the webpage has changed                      #
    #self.log(cards_table)                                          #
    #################################################################
    #################################################################
    # now we find the first card inside the table                   #
    #################################################################
    first_card = cards_table.div.div
    #################################################################
    # the first card is the title card containing "regular season"  #
    # now we are going to loop over the cards following that one    #
    #################################################################
    for schedule_card in first_card.find_next_siblings():
        #############################################################
        # lets find the date we want out of the card                #
        #############################################################
        try:
            game_start = schedule_card.div["data-gametime"]
        except (KeyError, TypeError):
            #########################################################
            # there is no date found in this card (probably an ad)  #
            #########################################################
            game_start = ""
        #############################################################
        # if we find a date, then we need to translate the date to  #
        # a time we can compare. in this case we find a date like   #
        # 2018-09-09T17:00:00Z which is %Y-%m-%dT%H:%M:%SZ          #
        # (the python datetime lib docs tell us that)               #
        #############################################################
        if game_start != "":
            game_time = datetime.datetime.strptime(game_start,"%Y-%m-%dT%H:%M:%SZ")
            #########################################################
            # find out if this date is in the future. the Z suffix  #
            # means the time is UTC, so we compare against UTC now  #
            #########################################################
            if game_time > datetime.datetime.utcnow():
                #####################################################
                # check if we didn't find one before; if not set it#
                #####################################################
                if next_game_time is None:
                    next_game_time = game_time
                    #################################################
                    # now that we know that this is the next game   #
                    # let's also look up the opponent in the card   #
                    # it will make a nice attribute for the sensor. #
                    # to remove surrounding whitespace we use strip #
                    # again we can find that by looking at the      #
                    # google chrome console                         #
                    #################################################
                    opponent = schedule_card.div.div.nextSibling.nextSibling.p.nextSibling.nextSibling.string.strip()
                    #################################################
                    # and we want to find the channel that it will  #
                    # be on.                                        #
                    #################################################
                    channel = schedule_card.div.div.nextSibling.nextSibling.div.nextSibling.nextSibling.div.div.span.nextSibling.nextSibling.string.strip()
    #################################################################
    # if the season is over there is no future game on the page.    #
    # in that case we log it and try again tomorrow, instead of     #
    # crashing on the None value                                    #
    #################################################################
    if next_game_time is None:
        self.log("no upcoming game found, trying again tomorrow")
        self.run_in(self.get_values, 24 * 3600)
        return
    #################################################################
    # now we got all data we need, but the date isn't what we need. #
    # we translate it again to the time format we want to see       #
    # for the HA sensor                                             #
    #################################################################
    next_game_str = next_game_time.strftime("%Y/%m/%d %H:%M:%S")
    #################################################################
    # now we got all info we need and we can create a sensor.       #
    # the first time that the code is run it will create a warning  #
    # that the sensor doesn't exist. if we see that in the log we   #
    # know that the sensor is created.                              #
    #################################################################
    self.set_state(self.sensorname, state=next_game_str, attributes={"friendly_name": self.friendly_name, "Opponent": opponent, "Channel": channel})
    #################################################################
    # now all we need to do is make sure that the sensor stays up   #
    # to date. we could check the webpage every minute, but that    #
    # would be unnecessary traffic. we don't know exactly when the  #
    # webpage is updated, so we use a short time after the game,    #
    # but we don't want it to be too long. note that the game time  #
    # is UTC, so the update moment can be off by your UTC offset;   #
    # the extra hours absorb that. if the sensor isn't up to date,  #
    # just check the page, restart the app and/or change the extra  #
    # time we add here                                              #
    #################################################################
    update_time = next_game_time + datetime.timedelta(hours=4)
    #################################################################
    # so we got a time that we want to update the sensor. so we run #
    # this code again at that time.                                 #
    #################################################################
    self.run_at(self.get_values, update_time)

:open_mouth: Wow, you are so AWESOME :sunglasses:, thank you thank you thank you. I can’t wait to get home from work tonight and start playing


2018-10-08 07:10:29.963074 INFO AppDaemon: App 'browns' added
2018-10-08 07:10:29.964320 INFO AppDaemon: Adding /config/appdaemon/apps to module import path
2018-10-08 07:10:29.966597 INFO AppDaemon: Loading App Module: /config/appdaemon/apps/hello.py
2018-10-08 07:10:30.002264 INFO AppDaemon: Loading App Module: /config/appdaemon/apps/browns.py
2018-10-08 07:10:30.012440 WARNING AppDaemon: ------------------------------------------------------------
2018-10-08 07:10:30.012990 WARNING AppDaemon: Unexpected error loading module: /config/appdaemon/apps/browns.py:
2018-10-08 07:10:30.014653 WARNING AppDaemon: ------------------------------------------------------------
2018-10-08 07:10:30.023025 WARNING AppDaemon: Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/appdaemon/appdaemon.py", line 2015, in check_app_updates
    self.read_app(mod["name"], mod["reload"])
  File "/usr/lib/python3.6/site-packages/appdaemon/appdaemon.py", line 1802, in read_app
    self.modules[module_name] = importlib.import_module(module_name)
  File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/config/appdaemon/apps/browns.py", line 19, in <module>
    from bs4 import BeautifulSoup
ModuleNotFoundError: No module named 'bs4'
2018-10-08 07:10:30.027918 WARNING AppDaemon: Removing associated apps:
2018-10-08 07:10:30.023737 WARNING AppDaemon: ------------------------------------------------------------
2018-10-08 07:10:30.039472 WARNING AppDaemon: browns
2018-10-08 07:10:30.042948 INFO AppDaemon: Initializing app hello_world using class HelloWorld from module hello
2018-10-08 07:10:30.353098 INFO hello_world: Hello from AppDaemon
2018-10-08 07:10:30.357956 INFO hello_world: You are now ready to run Apps!
2018-10-08 07:10:30.360481 INFO AppDaemon: App initialization complete

In your addon config.json file, look for

"options": {
    "log_level": "info",
    "system_packages": [],
    "python_packages": []
  }

and change it to

"options": {
    "log_level": "info",
    "system_packages": [],
    "python_packages": ['beautifulsoup4']
  }

And restart the addon and see if it works.

Regards


Thanks @Odianosen25, I forgot about that and I wouldn't have known how to configure that in Hass.io :wink:
But I believe the package that needs to be installed is bs4.

At least that's what I pip :wink:

so then it would be

"options": {
    "log_level": "info",
    "system_packages": [],
    "python_packages": ['bs4']
  }

Thanks guys, I definitely do not have that in my add-on config.

That definitely explains a lot of my issues with terminal!

Thanks again!

I love my HA family! :heart:

What you need to change in your add-on config has nothing to do with the terminal.
You need to change the add-on config for AppDaemon, so that BeautifulSoup is installed for the AppDaemon add-on.
It's a Python package that is normally installed with pip, but you can't use pip inside an add-on.

Thanks guys, this is what I had to put.

{
  "log_level": "info",
  "system_packages": [],
  "python_packages": [
    "beautifulsoup4"
  ]
}

The sensor shows up, I'm so EXCITED! Thank you so much!!!


Remember, I wasn't sure which time format to use for HA, so it's possible HA doesn't recognize the state as a time.
You need to try out automations to find out.

And don't forget that this webpage is only for this year. It's very well possible that they change something in the page next year; with the help of my comments you should be able to figure out how to change your code accordingly.
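
One change that could make the app survive small page changes: instead of the long nextSibling chains, you could search directly for the attribute we need. This is untested against the live page, but find_all with attrs is standard BeautifulSoup:

from bs4 import BeautifulSoup

def find_game_times(page):
    # grab every element that carries a data-gametime attribute,
    # no matter where it sits in the tree
    soup = BeautifulSoup(page, "html.parser")
    return [tag["data-gametime"] for tag in soup.find_all(attrs={"data-gametime": True})]

The opponent and channel lookups would still need the same treatment, but the date part would no longer break when a section is added or removed.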


You are awesome M8, I’m going to study this and try to do more like it! Thanks so much
