
Drag and Drop page builder to let us create static pages on Paper.li websites

Comments

1 comment

  • The Editor

    My suggestion is to create a new Twitter account and add it as a feed in your Paper.li configuration. Whatever new content you post between scrape times will then automatically appear in your next edition.

    Of course, you can retweet any items you want to remain "sticky" and they will appear again, and you can retweet other people's articles about Marina del Rey.

    There is also the possibility of using a third-party automation program to place recurring articles on a schedule. You would record yourself navigating to a URL and clipping the article to your Paper.li, then play that recording back each day automatically, repeating as necessary for additional URLs.

    It is also possible to paste permanent URLs into the Editor's Comments section of a Paper.li. With the Pro version you can of course customise the layout and create permanent sections by adjusting the CSS. A third-party WYSIWYG editor can output CSS, which presumably could be inserted there.

    WordPress is really a different concept, though, and this is really an aggregator. You could of course run a WordPress site and have the Paper.li scraper scan that blog each day for new stories, but you (or some other person) would have to write them.
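    To illustrate the scraping side of that idea: a WordPress site publishes an RSS feed (typically at /feed), and an aggregator like Paper.li simply reads the newest items from it each day. A minimal sketch of that parsing step, using Python's standard library and a made-up sample feed (the titles, links, and dates here are hypothetical, not real content):

    ```python
    import xml.etree.ElementTree as ET

    # A minimal sample of the RSS 2.0 feed a WordPress site exposes
    # (usually at https://example.com/feed). All entries are invented.
    SAMPLE_FEED = """<?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>Marina del Rey News</title>
        <item>
          <title>Harbor cleanup this weekend</title>
          <link>https://example.com/harbor-cleanup</link>
          <pubDate>Mon, 06 May 2013 09:00:00 +0000</pubDate>
        </item>
        <item>
          <title>New farmers market hours</title>
          <link>https://example.com/farmers-market</link>
          <pubDate>Sun, 05 May 2013 09:00:00 +0000</pubDate>
        </item>
      </channel>
    </rss>"""

    def latest_items(feed_xml):
        """Return (title, link) pairs for each feed item, in feed order."""
        root = ET.fromstring(feed_xml)
        return [(item.findtext("title"), item.findtext("link"))
                for item in root.iter("item")]

    for title, link in latest_items(SAMPLE_FEED):
        print(title, "->", link)
    ```

    Anything published since the previous scrape would show up as new items here, which is all the aggregator needs in order to pull your stories into the next edition.
    
    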

    Some fine-tuning of the Paper.li scraper could also achieve what you want: let a user reset the start date to a specified date when scraping a particular website or Twitter feed. Maybe that is a suggestion which would be much easier to program into the site backend?

